2026-01-06 00:00:07.591574 | Job console starting
2026-01-06 00:00:07.610191 | Updating git repos
2026-01-06 00:00:07.733243 | Cloning repos into workspace
2026-01-06 00:00:08.091617 | Restoring repo states
2026-01-06 00:00:08.126804 | Merging changes
2026-01-06 00:00:08.126820 | Checking out repos
2026-01-06 00:00:08.532981 | Preparing playbooks
2026-01-06 00:00:09.527870 | Running Ansible setup
2026-01-06 00:00:18.120334 | PRE-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/pre.yaml@main]
2026-01-06 00:00:20.864921 |
2026-01-06 00:00:20.865098 | PLAY [Base pre]
2026-01-06 00:00:20.883638 |
2026-01-06 00:00:20.883819 | TASK [Setup log path fact]
2026-01-06 00:00:20.925766 | orchestrator | ok
2026-01-06 00:00:20.952873 |
2026-01-06 00:00:20.953051 | TASK [set-zuul-log-path-fact : Set log path for a build]
2026-01-06 00:00:20.996449 | orchestrator | ok
2026-01-06 00:00:21.018474 |
2026-01-06 00:00:21.018627 | TASK [emit-job-header : Print job information]
2026-01-06 00:00:21.136631 | # Job Information
2026-01-06 00:00:21.136842 | Ansible Version: 2.16.14
2026-01-06 00:00:21.136881 | Job: testbed-deploy-next-in-a-nutshell-with-tempest-ubuntu-24.04
2026-01-06 00:00:21.136916 | Pipeline: periodic-midnight
2026-01-06 00:00:21.136940 | Executor: 521e9411259a
2026-01-06 00:00:21.136961 | Triggered by: https://github.com/osism/testbed
2026-01-06 00:00:21.136983 | Event ID: f117ecaaf4f04ac6b8eeb4c2eef15247
2026-01-06 00:00:21.159427 |
2026-01-06 00:00:21.159585 | LOOP [emit-job-header : Print node information]
2026-01-06 00:00:21.540640 | orchestrator | ok:
2026-01-06 00:00:21.540907 | orchestrator | # Node Information
2026-01-06 00:00:21.540946 | orchestrator | Inventory Hostname: orchestrator
2026-01-06 00:00:21.540971 | orchestrator | Hostname: zuul-static-regiocloud-infra-1
2026-01-06 00:00:21.540993 | orchestrator | Username: zuul-testbed06
2026-01-06 00:00:21.541014 | orchestrator | Distro: Debian 12.12
2026-01-06 00:00:21.541037 | orchestrator | Provider: static-testbed
2026-01-06 00:00:21.541058 | orchestrator | Region:
2026-01-06 00:00:21.541079 | orchestrator | Label: testbed-orchestrator
2026-01-06 00:00:21.541098 | orchestrator | Product Name: OpenStack Nova
2026-01-06 00:00:21.541118 | orchestrator | Interface IP: 81.163.193.140
2026-01-06 00:00:21.563078 |
2026-01-06 00:00:21.563231 | TASK [log-inventory : Ensure Zuul Ansible directory exists]
2026-01-06 00:00:23.430757 | orchestrator -> localhost | changed
2026-01-06 00:00:23.479607 |
2026-01-06 00:00:23.479917 | TASK [log-inventory : Copy ansible inventory to logs dir]
2026-01-06 00:00:27.338532 | orchestrator -> localhost | changed
2026-01-06 00:00:27.373327 |
2026-01-06 00:00:27.373505 | TASK [add-build-sshkey : Check to see if ssh key was already created for this build]
2026-01-06 00:00:28.788683 | orchestrator -> localhost | ok
2026-01-06 00:00:28.796610 |
2026-01-06 00:00:28.796826 | TASK [add-build-sshkey : Create a new key in workspace based on build UUID]
2026-01-06 00:00:28.865431 | orchestrator | ok
2026-01-06 00:00:28.911538 | orchestrator | included: /var/lib/zuul/builds/41c85c8fe207460eae3f3f1c1a35cf27/trusted/project_1/github.com/osism/openinfra-zuul-jobs/roles/add-build-sshkey/tasks/create-key-and-replace.yaml
2026-01-06 00:00:28.955459 |
2026-01-06 00:00:28.955608 | TASK [add-build-sshkey : Create Temp SSH key]
2026-01-06 00:00:32.193573 | orchestrator -> localhost | Generating public/private rsa key pair.
2026-01-06 00:00:32.194515 | orchestrator -> localhost | Your identification has been saved in /var/lib/zuul/builds/41c85c8fe207460eae3f3f1c1a35cf27/work/41c85c8fe207460eae3f3f1c1a35cf27_id_rsa
2026-01-06 00:00:32.194611 | orchestrator -> localhost | Your public key has been saved in /var/lib/zuul/builds/41c85c8fe207460eae3f3f1c1a35cf27/work/41c85c8fe207460eae3f3f1c1a35cf27_id_rsa.pub
2026-01-06 00:00:32.194642 | orchestrator -> localhost | The key fingerprint is:
2026-01-06 00:00:32.194668 | orchestrator -> localhost | SHA256:LHNbNfxyA2cuUCeYW6HUQx7EI+9t6NYwzh0qpKRHWBo zuul-build-sshkey
2026-01-06 00:00:32.194692 | orchestrator -> localhost | The key's randomart image is:
2026-01-06 00:00:32.194728 | orchestrator -> localhost | +---[RSA 3072]----+
2026-01-06 00:00:32.194751 | orchestrator -> localhost | | .BB.. |
2026-01-06 00:00:32.194775 | orchestrator -> localhost | | .+=*+ |
2026-01-06 00:00:32.194796 | orchestrator -> localhost | | o=*oo |
2026-01-06 00:00:32.194816 | orchestrator -> localhost | | E.. .o.B |
2026-01-06 00:00:32.194866 | orchestrator -> localhost | | o=S ..oo= |
2026-01-06 00:00:32.194896 | orchestrator -> localhost | | o+oo. =++. |
2026-01-06 00:00:32.194918 | orchestrator -> localhost | | +.o + B . |
2026-01-06 00:00:32.194938 | orchestrator -> localhost | | . o . * o |
2026-01-06 00:00:32.194960 | orchestrator -> localhost | | . o |
2026-01-06 00:00:32.194981 | orchestrator -> localhost | +----[SHA256]-----+
2026-01-06 00:00:32.195044 | orchestrator -> localhost | ok: Runtime: 0:00:01.772028
2026-01-06 00:00:32.202688 |
2026-01-06 00:00:32.202796 | TASK [add-build-sshkey : Remote setup ssh keys (linux)]
2026-01-06 00:00:32.260756 | orchestrator | ok
2026-01-06 00:00:32.298360 | orchestrator | included: /var/lib/zuul/builds/41c85c8fe207460eae3f3f1c1a35cf27/trusted/project_1/github.com/osism/openinfra-zuul-jobs/roles/add-build-sshkey/tasks/remote-linux.yaml
2026-01-06 00:00:32.338369 |
2026-01-06 00:00:32.338529 | TASK [add-build-sshkey : Remove previously added zuul-build-sshkey]
2026-01-06 00:00:32.368268 | orchestrator | skipping: Conditional result was False
2026-01-06 00:00:32.384444 |
2026-01-06 00:00:32.384586 | TASK [add-build-sshkey : Enable access via build key on all nodes]
2026-01-06 00:00:33.826051 | orchestrator | changed
2026-01-06 00:00:33.849522 |
2026-01-06 00:00:33.849680 | TASK [add-build-sshkey : Make sure user has a .ssh]
2026-01-06 00:00:34.240798 | orchestrator | ok
2026-01-06 00:00:34.262378 |
2026-01-06 00:00:34.263248 | TASK [add-build-sshkey : Install build private key as SSH key on all nodes]
2026-01-06 00:00:34.961631 | orchestrator | ok
2026-01-06 00:00:34.976904 |
2026-01-06 00:00:34.977051 | TASK [add-build-sshkey : Install build public key as SSH key on all nodes]
2026-01-06 00:00:35.585199 | orchestrator | ok
2026-01-06 00:00:35.603738 |
2026-01-06 00:00:35.603889 | TASK [add-build-sshkey : Remote setup ssh keys (windows)]
2026-01-06 00:00:35.671960 | orchestrator | skipping: Conditional result was False
2026-01-06 00:00:35.686102 |
2026-01-06 00:00:35.686268 | TASK [remove-zuul-sshkey : Remove master key from local agent]
2026-01-06 00:00:37.337897 | orchestrator -> localhost | changed
2026-01-06 00:00:37.366158 |
2026-01-06 00:00:37.366497 | TASK [add-build-sshkey : Add back temp key]
2026-01-06 00:00:39.048085 | orchestrator -> localhost | Identity added: /var/lib/zuul/builds/41c85c8fe207460eae3f3f1c1a35cf27/work/41c85c8fe207460eae3f3f1c1a35cf27_id_rsa (zuul-build-sshkey)
2026-01-06 00:00:39.048304 | orchestrator -> localhost | ok: Runtime: 0:00:00.054947
2026-01-06 00:00:39.055437 |
2026-01-06 00:00:39.055549 | TASK [add-build-sshkey : Verify we can still SSH to all nodes]
2026-01-06 00:00:40.034714 | orchestrator | ok
2026-01-06 00:00:40.047167 |
2026-01-06 00:00:40.047277 | TASK [add-build-sshkey : Verify we can still SSH to all nodes (windows)]
2026-01-06 00:00:40.115477 | orchestrator | skipping: Conditional result was False
2026-01-06 00:00:40.255834 |
2026-01-06 00:00:40.255947 | TASK [start-zuul-console : Start zuul_console daemon.]
2026-01-06 00:00:40.936772 | orchestrator | ok
2026-01-06 00:00:40.968759 |
2026-01-06 00:00:40.968869 | TASK [validate-host : Define zuul_info_dir fact]
2026-01-06 00:00:41.049774 | orchestrator | ok
2026-01-06 00:00:41.086294 |
2026-01-06 00:00:41.086408 | TASK [validate-host : Ensure Zuul Ansible directory exists]
2026-01-06 00:00:41.779167 | orchestrator -> localhost | ok
2026-01-06 00:00:41.788875 |
2026-01-06 00:00:41.789005 | TASK [validate-host : Collect information about the host]
2026-01-06 00:00:43.295251 | orchestrator | ok
2026-01-06 00:00:43.362935 |
2026-01-06 00:00:43.363073 | TASK [validate-host : Sanitize hostname]
2026-01-06 00:00:43.448670 | orchestrator | ok
2026-01-06 00:00:43.457005 |
2026-01-06 00:00:43.457388 | TASK [validate-host : Write out all ansible variables/facts known for each host]
2026-01-06 00:00:44.827776 | orchestrator -> localhost | changed
2026-01-06 00:00:44.833732 |
2026-01-06 00:00:44.833822 | TASK [validate-host : Collect information about zuul worker]
2026-01-06 00:00:45.614037 | orchestrator | ok
2026-01-06 00:00:45.620214 |
2026-01-06 00:00:45.620311 | TASK [validate-host : Write out all zuul information for each host]
2026-01-06 00:00:47.341994 | orchestrator -> localhost | changed
2026-01-06 00:00:47.350562 |
2026-01-06 00:00:47.350652 | TASK [prepare-workspace-log : Start zuul_console daemon.]
2026-01-06 00:00:47.687882 | orchestrator | ok
2026-01-06 00:00:47.699132 |
2026-01-06 00:00:47.699279 | TASK [prepare-workspace-log : Synchronize src repos to workspace directory.]
2026-01-06 00:02:02.424874 | orchestrator | changed:
2026-01-06 00:02:02.426762 | orchestrator | .d..t...... src/
2026-01-06 00:02:02.426947 | orchestrator | .d..t...... src/github.com/
2026-01-06 00:02:02.426981 | orchestrator | .d..t...... src/github.com/osism/
2026-01-06 00:02:02.427004 | orchestrator | .d..t...... src/github.com/osism/ansible-collection-commons/
2026-01-06 00:02:02.427025 | orchestrator | RedHat.yml
2026-01-06 00:02:02.441740 | orchestrator | .L..t...... src/github.com/osism/ansible-collection-commons/roles/repository/tasks/CentOS.yml -> RedHat.yml
2026-01-06 00:02:02.441758 | orchestrator | RedHat.yml
2026-01-06 00:02:02.441811 | orchestrator | = 1.53.0"...
2026-01-06 00:02:17.128319 | orchestrator | - Finding hashicorp/local versions matching ">= 2.2.0"...
2026-01-06 00:02:17.149645 | orchestrator | - Finding latest version of hashicorp/null...
2026-01-06 00:02:17.669141 | orchestrator | - Installing terraform-provider-openstack/openstack v3.4.0...
2026-01-06 00:02:18.613747 | orchestrator | - Installed terraform-provider-openstack/openstack v3.4.0 (signed, key ID 4F80527A391BEFD2)
2026-01-06 00:02:19.007882 | orchestrator | - Installing hashicorp/local v2.6.1...
2026-01-06 00:02:19.676497 | orchestrator | - Installed hashicorp/local v2.6.1 (signed, key ID 0C0AF313E5FD9F80)
2026-01-06 00:02:20.094235 | orchestrator | - Installing hashicorp/null v3.2.4...
2026-01-06 00:02:20.612790 | orchestrator | - Installed hashicorp/null v3.2.4 (signed, key ID 0C0AF313E5FD9F80)
2026-01-06 00:02:20.612892 | orchestrator |
2026-01-06 00:02:20.612900 | orchestrator | Providers are signed by their developers.
2026-01-06 00:02:20.612905 | orchestrator | If you'd like to know more about provider signing, you can read about it here:
2026-01-06 00:02:20.612911 | orchestrator | https://opentofu.org/docs/cli/plugins/signing/
2026-01-06 00:02:20.612918 | orchestrator |
2026-01-06 00:02:20.612923 | orchestrator | OpenTofu has created a lock file .terraform.lock.hcl to record the provider
2026-01-06 00:02:20.612928 | orchestrator | selections it made above. Include this file in your version control repository
2026-01-06 00:02:20.612950 | orchestrator | so that OpenTofu can guarantee to make the same selections by default when
2026-01-06 00:02:20.612954 | orchestrator | you run "tofu init" in the future.
2026-01-06 00:02:20.613221 | orchestrator |
2026-01-06 00:02:20.613252 | orchestrator | OpenTofu has been successfully initialized!
2026-01-06 00:02:20.613260 | orchestrator |
2026-01-06 00:02:20.613264 | orchestrator | You may now begin working with OpenTofu. Try running "tofu plan" to see
2026-01-06 00:02:20.613272 | orchestrator | any changes that are required for your infrastructure. All OpenTofu commands
2026-01-06 00:02:20.613276 | orchestrator | should now work.
2026-01-06 00:02:20.613281 | orchestrator |
2026-01-06 00:02:20.613290 | orchestrator | If you ever set or change modules or backend configuration for OpenTofu,
2026-01-06 00:02:20.613294 | orchestrator | rerun this command to reinitialize your working directory. If you forget, other
2026-01-06 00:02:20.613299 | orchestrator | commands will detect it and remind you to do so if necessary.
2026-01-06 00:02:20.803349 | orchestrator | Created and switched to workspace "ci"!
2026-01-06 00:02:20.803414 | orchestrator |
2026-01-06 00:02:20.803421 | orchestrator | You're now on a new, empty workspace. Workspaces isolate their state,
2026-01-06 00:02:20.803427 | orchestrator | so if you run "tofu plan" OpenTofu will not see any existing state
2026-01-06 00:02:20.803450 | orchestrator | for this configuration.
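The provider resolution shown in the init output above would correspond to a `required_providers` block roughly like the following sketch. This is a hypothetical reconstruction, not the actual configuration from the osism/testbed repository: only the `>= 2.2.0` constraint for `hashicorp/local` is visible in the log, and the provider behind the truncated `= 1.53.0"...` fragment is unknown, so it is omitted here.

```hcl
# Hypothetical sketch consistent with the providers resolved during
# "tofu init" above; the real constraints in the testbed repo may differ.
terraform {
  required_providers {
    openstack = {
      source = "terraform-provider-openstack/openstack"
    }
    local = {
      source  = "hashicorp/local"
      version = ">= 2.2.0" # the only constraint visible in the log
    }
    null = {
      source = "hashicorp/null" # log shows "Finding latest version"
    }
  }
}
```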
2026-01-06 00:02:20.914857 | orchestrator | ci.auto.tfvars
2026-01-06 00:02:20.921862 | orchestrator | default_custom.tf
2026-01-06 00:02:22.197351 | orchestrator | data.openstack_networking_network_v2.public: Reading...
2026-01-06 00:02:22.718415 | orchestrator | data.openstack_networking_network_v2.public: Read complete after 1s [id=e6be7364-bfd8-4de7-8120-8f41c69a139a]
2026-01-06 00:02:23.003230 | orchestrator |
2026-01-06 00:02:23.003314 | orchestrator | OpenTofu used the selected providers to generate the following execution
2026-01-06 00:02:23.003323 | orchestrator | plan. Resource actions are indicated with the following symbols:
2026-01-06 00:02:23.003329 | orchestrator |   + create
2026-01-06 00:02:23.003335 | orchestrator |  <= read (data resources)
2026-01-06 00:02:23.003340 | orchestrator |
2026-01-06 00:02:23.003345 | orchestrator | OpenTofu will perform the following actions:
2026-01-06 00:02:23.003359 | orchestrator |
2026-01-06 00:02:23.003364 | orchestrator |   # data.openstack_images_image_v2.image will be read during apply
2026-01-06 00:02:23.003369 | orchestrator |   # (config refers to values not yet known)
2026-01-06 00:02:23.003373 | orchestrator |  <= data "openstack_images_image_v2" "image" {
2026-01-06 00:02:23.003378 | orchestrator |       + checksum = (known after apply)
2026-01-06 00:02:23.003382 | orchestrator |       + created_at = (known after apply)
2026-01-06 00:02:23.003386 | orchestrator |       + file = (known after apply)
2026-01-06 00:02:23.003390 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.003412 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.003416 | orchestrator |       + min_disk_gb = (known after apply)
2026-01-06 00:02:23.003420 | orchestrator |       + min_ram_mb = (known after apply)
2026-01-06 00:02:23.003424 | orchestrator |       + most_recent = true
2026-01-06 00:02:23.003429 | orchestrator |       + name = (known after apply)
2026-01-06 00:02:23.003433 | orchestrator |       + protected = (known after apply)
2026-01-06 00:02:23.003437 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.003444 | orchestrator |       + schema = (known after apply)
2026-01-06 00:02:23.003448 | orchestrator |       + size_bytes = (known after apply)
2026-01-06 00:02:23.003452 | orchestrator |       + tags = (known after apply)
2026-01-06 00:02:23.003456 | orchestrator |       + updated_at = (known after apply)
2026-01-06 00:02:23.003460 | orchestrator |     }
2026-01-06 00:02:23.003466 | orchestrator |
2026-01-06 00:02:23.003471 | orchestrator |   # data.openstack_images_image_v2.image_node will be read during apply
2026-01-06 00:02:23.003475 | orchestrator |   # (config refers to values not yet known)
2026-01-06 00:02:23.003479 | orchestrator |  <= data "openstack_images_image_v2" "image_node" {
2026-01-06 00:02:23.003483 | orchestrator |       + checksum = (known after apply)
2026-01-06 00:02:23.003487 | orchestrator |       + created_at = (known after apply)
2026-01-06 00:02:23.003491 | orchestrator |       + file = (known after apply)
2026-01-06 00:02:23.003495 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.003499 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.003503 | orchestrator |       + min_disk_gb = (known after apply)
2026-01-06 00:02:23.003506 | orchestrator |       + min_ram_mb = (known after apply)
2026-01-06 00:02:23.003510 | orchestrator |       + most_recent = true
2026-01-06 00:02:23.003514 | orchestrator |       + name = (known after apply)
2026-01-06 00:02:23.003518 | orchestrator |       + protected = (known after apply)
2026-01-06 00:02:23.003522 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.003526 | orchestrator |       + schema = (known after apply)
2026-01-06 00:02:23.003529 | orchestrator |       + size_bytes = (known after apply)
2026-01-06 00:02:23.003533 | orchestrator |       + tags = (known after apply)
2026-01-06 00:02:23.003537 | orchestrator |       + updated_at = (known after apply)
2026-01-06 00:02:23.003541 | orchestrator |     }
2026-01-06 00:02:23.003549 | orchestrator |
2026-01-06 00:02:23.003555 | orchestrator |   # local_file.MANAGER_ADDRESS will be created
2026-01-06 00:02:23.003559 | orchestrator |   + resource "local_file" "MANAGER_ADDRESS" {
2026-01-06 00:02:23.003564 | orchestrator |       + content = (known after apply)
2026-01-06 00:02:23.003568 | orchestrator |       + content_base64sha256 = (known after apply)
2026-01-06 00:02:23.003572 | orchestrator |       + content_base64sha512 = (known after apply)
2026-01-06 00:02:23.003576 | orchestrator |       + content_md5 = (known after apply)
2026-01-06 00:02:23.003580 | orchestrator |       + content_sha1 = (known after apply)
2026-01-06 00:02:23.003583 | orchestrator |       + content_sha256 = (known after apply)
2026-01-06 00:02:23.003587 | orchestrator |       + content_sha512 = (known after apply)
2026-01-06 00:02:23.003591 | orchestrator |       + directory_permission = "0777"
2026-01-06 00:02:23.003595 | orchestrator |       + file_permission = "0644"
2026-01-06 00:02:23.003599 | orchestrator |       + filename = ".MANAGER_ADDRESS.ci"
2026-01-06 00:02:23.003603 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.003606 | orchestrator |     }
2026-01-06 00:02:23.003658 | orchestrator |
2026-01-06 00:02:23.003663 | orchestrator |   # local_file.id_rsa_pub will be created
2026-01-06 00:02:23.003668 | orchestrator |   + resource "local_file" "id_rsa_pub" {
2026-01-06 00:02:23.003671 | orchestrator |       + content = (known after apply)
2026-01-06 00:02:23.003675 | orchestrator |       + content_base64sha256 = (known after apply)
2026-01-06 00:02:23.003679 | orchestrator |       + content_base64sha512 = (known after apply)
2026-01-06 00:02:23.003683 | orchestrator |       + content_md5 = (known after apply)
2026-01-06 00:02:23.003687 | orchestrator |       + content_sha1 = (known after apply)
2026-01-06 00:02:23.003690 | orchestrator |       + content_sha256 = (known after apply)
2026-01-06 00:02:23.003694 | orchestrator |       + content_sha512 = (known after apply)
2026-01-06 00:02:23.003698 | orchestrator |       + directory_permission = "0777"
2026-01-06 00:02:23.003702 | orchestrator |       + file_permission = "0644"
2026-01-06 00:02:23.003711 | orchestrator |       + filename = ".id_rsa.ci.pub"
2026-01-06 00:02:23.003714 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.003718 | orchestrator |     }
2026-01-06 00:02:23.003757 | orchestrator |
2026-01-06 00:02:23.003769 | orchestrator |   # local_file.inventory will be created
2026-01-06 00:02:23.003774 | orchestrator |   + resource "local_file" "inventory" {
2026-01-06 00:02:23.003778 | orchestrator |       + content = (known after apply)
2026-01-06 00:02:23.003781 | orchestrator |       + content_base64sha256 = (known after apply)
2026-01-06 00:02:23.003785 | orchestrator |       + content_base64sha512 = (known after apply)
2026-01-06 00:02:23.003789 | orchestrator |       + content_md5 = (known after apply)
2026-01-06 00:02:23.003793 | orchestrator |       + content_sha1 = (known after apply)
2026-01-06 00:02:23.003797 | orchestrator |       + content_sha256 = (known after apply)
2026-01-06 00:02:23.003801 | orchestrator |       + content_sha512 = (known after apply)
2026-01-06 00:02:23.003805 | orchestrator |       + directory_permission = "0777"
2026-01-06 00:02:23.003809 | orchestrator |       + file_permission = "0644"
2026-01-06 00:02:23.003813 | orchestrator |       + filename = "inventory.ci"
2026-01-06 00:02:23.003816 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.003820 | orchestrator |     }
2026-01-06 00:02:23.003855 | orchestrator |
2026-01-06 00:02:23.003861 | orchestrator |   # local_sensitive_file.id_rsa will be created
2026-01-06 00:02:23.003865 | orchestrator |   + resource "local_sensitive_file" "id_rsa" {
2026-01-06 00:02:23.003869 | orchestrator |       + content = (sensitive value)
2026-01-06 00:02:23.003872 | orchestrator |       + content_base64sha256 = (known after apply)
2026-01-06 00:02:23.003876 | orchestrator |       + content_base64sha512 = (known after apply)
2026-01-06 00:02:23.003880 | orchestrator |       + content_md5 = (known after apply)
2026-01-06 00:02:23.003884 | orchestrator |       + content_sha1 = (known after apply)
2026-01-06 00:02:23.003888 | orchestrator |       + content_sha256 = (known after apply)
2026-01-06 00:02:23.003892 | orchestrator |       + content_sha512 = (known after apply)
2026-01-06 00:02:23.003896 | orchestrator |       + directory_permission = "0700"
2026-01-06 00:02:23.003899 | orchestrator |       + file_permission = "0600"
2026-01-06 00:02:23.003903 | orchestrator |       + filename = ".id_rsa.ci"
2026-01-06 00:02:23.003907 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.003911 | orchestrator |     }
2026-01-06 00:02:23.003917 | orchestrator |
2026-01-06 00:02:23.003921 | orchestrator |   # null_resource.node_semaphore will be created
2026-01-06 00:02:23.003925 | orchestrator |   + resource "null_resource" "node_semaphore" {
2026-01-06 00:02:23.003929 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.003933 | orchestrator |     }
2026-01-06 00:02:23.003959 | orchestrator |
2026-01-06 00:02:23.003965 | orchestrator |   # openstack_blockstorage_volume_v3.manager_base_volume[0] will be created
2026-01-06 00:02:23.003969 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "manager_base_volume" {
2026-01-06 00:02:23.003973 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.003976 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.003980 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.003984 | orchestrator |       + image_id = (known after apply)
2026-01-06 00:02:23.003988 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.003992 | orchestrator |       + name = "testbed-volume-manager-base"
2026-01-06 00:02:23.003996 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.004000 | orchestrator |       + size = 80
2026-01-06 00:02:23.004004 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.004007 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.004022 | orchestrator |     }
2026-01-06 00:02:23.004067 | orchestrator |
2026-01-06 00:02:23.004072 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[0] will be created
2026-01-06 00:02:23.004076 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-01-06 00:02:23.004080 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.004084 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.004088 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.004096 | orchestrator |       + image_id = (known after apply)
2026-01-06 00:02:23.004100 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.004104 | orchestrator |       + name = "testbed-volume-0-node-base"
2026-01-06 00:02:23.004108 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.004112 | orchestrator |       + size = 80
2026-01-06 00:02:23.004116 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.004119 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.004123 | orchestrator |     }
2026-01-06 00:02:23.004163 | orchestrator |
2026-01-06 00:02:23.004168 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[1] will be created
2026-01-06 00:02:23.004172 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-01-06 00:02:23.004176 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.004180 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.004184 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.004188 | orchestrator |       + image_id = (known after apply)
2026-01-06 00:02:23.004192 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.004196 | orchestrator |       + name = "testbed-volume-1-node-base"
2026-01-06 00:02:23.004205 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.004209 | orchestrator |       + size = 80
2026-01-06 00:02:23.004213 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.004217 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.004221 | orchestrator |     }
2026-01-06 00:02:23.004254 | orchestrator |
2026-01-06 00:02:23.004260 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[2] will be created
2026-01-06 00:02:23.004263 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-01-06 00:02:23.004267 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.004271 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.004275 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.004279 | orchestrator |       + image_id = (known after apply)
2026-01-06 00:02:23.004283 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.004287 | orchestrator |       + name = "testbed-volume-2-node-base"
2026-01-06 00:02:23.004290 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.004294 | orchestrator |       + size = 80
2026-01-06 00:02:23.004298 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.004302 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.004306 | orchestrator |     }
2026-01-06 00:02:23.004357 | orchestrator |
2026-01-06 00:02:23.004362 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[3] will be created
2026-01-06 00:02:23.004366 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-01-06 00:02:23.004370 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.004374 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.004378 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.004382 | orchestrator |       + image_id = (known after apply)
2026-01-06 00:02:23.004386 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.004392 | orchestrator |       + name = "testbed-volume-3-node-base"
2026-01-06 00:02:23.004396 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.004400 | orchestrator |       + size = 80
2026-01-06 00:02:23.004404 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.004408 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.004412 | orchestrator |     }
2026-01-06 00:02:23.004440 | orchestrator |
2026-01-06 00:02:23.004446 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[4] will be created
2026-01-06 00:02:23.004450 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-01-06 00:02:23.004454 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.004458 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.004461 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.004471 | orchestrator |       + image_id = (known after apply)
2026-01-06 00:02:23.004475 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.004479 | orchestrator |       + name = "testbed-volume-4-node-base"
2026-01-06 00:02:23.004483 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.004487 | orchestrator |       + size = 80
2026-01-06 00:02:23.004491 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.004495 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.004499 | orchestrator |     }
2026-01-06 00:02:23.004532 | orchestrator |
2026-01-06 00:02:23.004538 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[5] will be created
2026-01-06 00:02:23.004542 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-01-06 00:02:23.004546 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.004550 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.004554 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.004557 | orchestrator |       + image_id = (known after apply)
2026-01-06 00:02:23.004561 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.004565 | orchestrator |       + name = "testbed-volume-5-node-base"
2026-01-06 00:02:23.004569 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.004573 | orchestrator |       + size = 80
2026-01-06 00:02:23.004577 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.004581 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.004584 | orchestrator |     }
2026-01-06 00:02:23.004611 | orchestrator |
2026-01-06 00:02:23.004617 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[0] will be created
2026-01-06 00:02:23.004621 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-01-06 00:02:23.004625 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.004628 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.004632 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.004636 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.004640 | orchestrator |       + name = "testbed-volume-0-node-3"
2026-01-06 00:02:23.004644 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.004648 | orchestrator |       + size = 20
2026-01-06 00:02:23.004652 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.004655 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.004659 | orchestrator |     }
2026-01-06 00:02:23.004691 | orchestrator |
2026-01-06 00:02:23.004696 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[1] will be created
2026-01-06 00:02:23.004700 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-01-06 00:02:23.004704 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.004708 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.004712 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.004716 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.004720 | orchestrator |       + name = "testbed-volume-1-node-4"
2026-01-06 00:02:23.004723 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.004727 | orchestrator |       + size = 20
2026-01-06 00:02:23.004731 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.004735 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.004739 | orchestrator |     }
2026-01-06 00:02:23.004782 | orchestrator |
2026-01-06 00:02:23.004788 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[2] will be created
2026-01-06 00:02:23.004792 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-01-06 00:02:23.004795 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.004799 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.004803 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.004807 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.004811 | orchestrator |       + name = "testbed-volume-2-node-5"
2026-01-06 00:02:23.004814 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.004822 | orchestrator |       + size = 20
2026-01-06 00:02:23.004826 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.004830 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.004834 | orchestrator |     }
2026-01-06 00:02:23.004864 | orchestrator |
2026-01-06 00:02:23.004870 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[3] will be created
2026-01-06 00:02:23.004873 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-01-06 00:02:23.004877 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.004881 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.004885 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.004889 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.004893 | orchestrator |       + name = "testbed-volume-3-node-3"
2026-01-06 00:02:23.004896 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.004900 | orchestrator |       + size = 20
2026-01-06 00:02:23.004904 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.004908 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.004912 | orchestrator |     }
2026-01-06 00:02:23.004959 | orchestrator |
2026-01-06 00:02:23.004965 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[4] will be created
2026-01-06 00:02:23.004968 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-01-06 00:02:23.004972 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.004976 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.004980 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.004984 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.004988 | orchestrator |       + name = "testbed-volume-4-node-4"
2026-01-06 00:02:23.004991 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.004998 | orchestrator |       + size = 20
2026-01-06 00:02:23.005002 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.005006 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.005010 | orchestrator |     }
2026-01-06 00:02:23.005087 | orchestrator |
2026-01-06 00:02:23.005093 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[5] will be created
2026-01-06 00:02:23.005097 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-01-06 00:02:23.005101 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.005104 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.005108 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.005112 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.005116 | orchestrator |       + name = "testbed-volume-5-node-5"
2026-01-06 00:02:23.005120 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.005123 | orchestrator |       + size = 20
2026-01-06 00:02:23.005127 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.005131 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.005135 | orchestrator |     }
2026-01-06 00:02:23.005174 | orchestrator |
2026-01-06 00:02:23.005180 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[6] will be created
2026-01-06 00:02:23.005184 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-01-06 00:02:23.005188 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.005191 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.005195 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.005199 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.005203 | orchestrator |       + name = "testbed-volume-6-node-3"
2026-01-06 00:02:23.005207 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.005211 | orchestrator |       + size = 20
2026-01-06 00:02:23.005215 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.005219 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.005222 | orchestrator |     }
2026-01-06 00:02:23.005256 | orchestrator |
2026-01-06 00:02:23.005262 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[7] will be created
2026-01-06 00:02:23.005266 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-01-06 00:02:23.005274 | orchestrator |       + attachment = (known after apply)
2026-01-06 00:02:23.005278 | orchestrator |       + availability_zone = "nova"
2026-01-06 00:02:23.005282 | orchestrator |       + id = (known after apply)
2026-01-06 00:02:23.005286 | orchestrator |       + metadata = (known after apply)
2026-01-06 00:02:23.005290 | orchestrator |       + name = "testbed-volume-7-node-4"
2026-01-06 00:02:23.005294 | orchestrator |       + region = (known after apply)
2026-01-06 00:02:23.005298 | orchestrator |       + size = 20
2026-01-06 00:02:23.005302 | orchestrator |       + volume_retype_policy = "never"
2026-01-06 00:02:23.005306 | orchestrator |       + volume_type = "ssd"
2026-01-06 00:02:23.005310 | orchestrator |     }
2026-01-06 00:02:23.005342 | orchestrator |
2026-01-06 00:02:23.005348 | orchestrator |   #
openstack_blockstorage_volume_v3.node_volume[8] will be created 2026-01-06 00:02:23.005352 | orchestrator | + resource "openstack_blockstorage_volume_v3" "node_volume" { 2026-01-06 00:02:23.005357 | orchestrator | + attachment = (known after apply) 2026-01-06 00:02:23.005361 | orchestrator | + availability_zone = "nova" 2026-01-06 00:02:23.005364 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.005368 | orchestrator | + metadata = (known after apply) 2026-01-06 00:02:23.005372 | orchestrator | + name = "testbed-volume-8-node-5" 2026-01-06 00:02:23.005376 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.005380 | orchestrator | + size = 20 2026-01-06 00:02:23.005384 | orchestrator | + volume_retype_policy = "never" 2026-01-06 00:02:23.005388 | orchestrator | + volume_type = "ssd" 2026-01-06 00:02:23.005392 | orchestrator | } 2026-01-06 00:02:23.005651 | orchestrator | 2026-01-06 00:02:23.005656 | orchestrator | # openstack_compute_instance_v2.manager_server will be created 2026-01-06 00:02:23.005660 | orchestrator | + resource "openstack_compute_instance_v2" "manager_server" { 2026-01-06 00:02:23.005664 | orchestrator | + access_ip_v4 = (known after apply) 2026-01-06 00:02:23.005668 | orchestrator | + access_ip_v6 = (known after apply) 2026-01-06 00:02:23.005672 | orchestrator | + all_metadata = (known after apply) 2026-01-06 00:02:23.005676 | orchestrator | + all_tags = (known after apply) 2026-01-06 00:02:23.005680 | orchestrator | + availability_zone = "nova" 2026-01-06 00:02:23.005684 | orchestrator | + config_drive = true 2026-01-06 00:02:23.005687 | orchestrator | + created = (known after apply) 2026-01-06 00:02:23.005691 | orchestrator | + flavor_id = (known after apply) 2026-01-06 00:02:23.005695 | orchestrator | + flavor_name = "OSISM-4V-16" 2026-01-06 00:02:23.005699 | orchestrator | + force_delete = false 2026-01-06 00:02:23.005702 | orchestrator | + hypervisor_hostname = (known after apply) 2026-01-06 00:02:23.005706 | 
orchestrator | + id = (known after apply) 2026-01-06 00:02:23.005710 | orchestrator | + image_id = (known after apply) 2026-01-06 00:02:23.005714 | orchestrator | + image_name = (known after apply) 2026-01-06 00:02:23.005718 | orchestrator | + key_pair = "testbed" 2026-01-06 00:02:23.005722 | orchestrator | + name = "testbed-manager" 2026-01-06 00:02:23.005725 | orchestrator | + power_state = "active" 2026-01-06 00:02:23.005729 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.005733 | orchestrator | + security_groups = (known after apply) 2026-01-06 00:02:23.005737 | orchestrator | + stop_before_destroy = false 2026-01-06 00:02:23.005741 | orchestrator | + updated = (known after apply) 2026-01-06 00:02:23.005744 | orchestrator | + user_data = (sensitive value) 2026-01-06 00:02:23.005748 | orchestrator | 2026-01-06 00:02:23.005752 | orchestrator | + block_device { 2026-01-06 00:02:23.005756 | orchestrator | + boot_index = 0 2026-01-06 00:02:23.005760 | orchestrator | + delete_on_termination = false 2026-01-06 00:02:23.005767 | orchestrator | + destination_type = "volume" 2026-01-06 00:02:23.005771 | orchestrator | + multiattach = false 2026-01-06 00:02:23.005775 | orchestrator | + source_type = "volume" 2026-01-06 00:02:23.005779 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.005786 | orchestrator | } 2026-01-06 00:02:23.005790 | orchestrator | 2026-01-06 00:02:23.005794 | orchestrator | + network { 2026-01-06 00:02:23.005798 | orchestrator | + access_network = false 2026-01-06 00:02:23.005802 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-01-06 00:02:23.005805 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-01-06 00:02:23.005809 | orchestrator | + mac = (known after apply) 2026-01-06 00:02:23.005813 | orchestrator | + name = (known after apply) 2026-01-06 00:02:23.005817 | orchestrator | + port = (known after apply) 2026-01-06 00:02:23.005821 | orchestrator | + uuid = (known after apply) 2026-01-06 
00:02:23.005825 | orchestrator | } 2026-01-06 00:02:23.005828 | orchestrator | } 2026-01-06 00:02:23.005936 | orchestrator | 2026-01-06 00:02:23.005948 | orchestrator | # openstack_compute_instance_v2.node_server[0] will be created 2026-01-06 00:02:23.005952 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-01-06 00:02:23.005956 | orchestrator | + access_ip_v4 = (known after apply) 2026-01-06 00:02:23.005960 | orchestrator | + access_ip_v6 = (known after apply) 2026-01-06 00:02:23.005963 | orchestrator | + all_metadata = (known after apply) 2026-01-06 00:02:23.005967 | orchestrator | + all_tags = (known after apply) 2026-01-06 00:02:23.005971 | orchestrator | + availability_zone = "nova" 2026-01-06 00:02:23.005975 | orchestrator | + config_drive = true 2026-01-06 00:02:23.005979 | orchestrator | + created = (known after apply) 2026-01-06 00:02:23.005982 | orchestrator | + flavor_id = (known after apply) 2026-01-06 00:02:23.005986 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-01-06 00:02:23.005990 | orchestrator | + force_delete = false 2026-01-06 00:02:23.005994 | orchestrator | + hypervisor_hostname = (known after apply) 2026-01-06 00:02:23.005998 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.006001 | orchestrator | + image_id = (known after apply) 2026-01-06 00:02:23.006005 | orchestrator | + image_name = (known after apply) 2026-01-06 00:02:23.006009 | orchestrator | + key_pair = "testbed" 2026-01-06 00:02:23.006039 | orchestrator | + name = "testbed-node-0" 2026-01-06 00:02:23.006043 | orchestrator | + power_state = "active" 2026-01-06 00:02:23.006047 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.006050 | orchestrator | + security_groups = (known after apply) 2026-01-06 00:02:23.006055 | orchestrator | + stop_before_destroy = false 2026-01-06 00:02:23.006060 | orchestrator | + updated = (known after apply) 2026-01-06 00:02:23.006064 | orchestrator | + user_data = 
"ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-01-06 00:02:23.006068 | orchestrator | 2026-01-06 00:02:23.006071 | orchestrator | + block_device { 2026-01-06 00:02:23.006075 | orchestrator | + boot_index = 0 2026-01-06 00:02:23.006079 | orchestrator | + delete_on_termination = false 2026-01-06 00:02:23.006083 | orchestrator | + destination_type = "volume" 2026-01-06 00:02:23.006087 | orchestrator | + multiattach = false 2026-01-06 00:02:23.006090 | orchestrator | + source_type = "volume" 2026-01-06 00:02:23.006094 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.006098 | orchestrator | } 2026-01-06 00:02:23.006102 | orchestrator | 2026-01-06 00:02:23.006106 | orchestrator | + network { 2026-01-06 00:02:23.006110 | orchestrator | + access_network = false 2026-01-06 00:02:23.006113 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-01-06 00:02:23.006117 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-01-06 00:02:23.006121 | orchestrator | + mac = (known after apply) 2026-01-06 00:02:23.006125 | orchestrator | + name = (known after apply) 2026-01-06 00:02:23.006129 | orchestrator | + port = (known after apply) 2026-01-06 00:02:23.006133 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.006136 | orchestrator | } 2026-01-06 00:02:23.006140 | orchestrator | } 2026-01-06 00:02:23.006229 | orchestrator | 2026-01-06 00:02:23.006236 | orchestrator | # openstack_compute_instance_v2.node_server[1] will be created 2026-01-06 00:02:23.006240 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-01-06 00:02:23.006244 | orchestrator | + access_ip_v4 = (known after apply) 2026-01-06 00:02:23.006252 | orchestrator | + access_ip_v6 = (known after apply) 2026-01-06 00:02:23.006256 | orchestrator | + all_metadata = (known after apply) 2026-01-06 00:02:23.006259 | orchestrator | + all_tags = (known after apply) 2026-01-06 00:02:23.006263 | orchestrator | + availability_zone = "nova" 2026-01-06 00:02:23.006267 
| orchestrator | + config_drive = true 2026-01-06 00:02:23.006271 | orchestrator | + created = (known after apply) 2026-01-06 00:02:23.006274 | orchestrator | + flavor_id = (known after apply) 2026-01-06 00:02:23.006278 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-01-06 00:02:23.006282 | orchestrator | + force_delete = false 2026-01-06 00:02:23.006286 | orchestrator | + hypervisor_hostname = (known after apply) 2026-01-06 00:02:23.006290 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.006293 | orchestrator | + image_id = (known after apply) 2026-01-06 00:02:23.006297 | orchestrator | + image_name = (known after apply) 2026-01-06 00:02:23.006301 | orchestrator | + key_pair = "testbed" 2026-01-06 00:02:23.006305 | orchestrator | + name = "testbed-node-1" 2026-01-06 00:02:23.006308 | orchestrator | + power_state = "active" 2026-01-06 00:02:23.006312 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.006316 | orchestrator | + security_groups = (known after apply) 2026-01-06 00:02:23.006320 | orchestrator | + stop_before_destroy = false 2026-01-06 00:02:23.006324 | orchestrator | + updated = (known after apply) 2026-01-06 00:02:23.006328 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-01-06 00:02:23.006331 | orchestrator | 2026-01-06 00:02:23.006335 | orchestrator | + block_device { 2026-01-06 00:02:23.006339 | orchestrator | + boot_index = 0 2026-01-06 00:02:23.006343 | orchestrator | + delete_on_termination = false 2026-01-06 00:02:23.006347 | orchestrator | + destination_type = "volume" 2026-01-06 00:02:23.006350 | orchestrator | + multiattach = false 2026-01-06 00:02:23.006354 | orchestrator | + source_type = "volume" 2026-01-06 00:02:23.006358 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.006362 | orchestrator | } 2026-01-06 00:02:23.006365 | orchestrator | 2026-01-06 00:02:23.006369 | orchestrator | + network { 2026-01-06 00:02:23.006373 | orchestrator | + access_network = 
false 2026-01-06 00:02:23.006377 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-01-06 00:02:23.006381 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-01-06 00:02:23.006384 | orchestrator | + mac = (known after apply) 2026-01-06 00:02:23.006388 | orchestrator | + name = (known after apply) 2026-01-06 00:02:23.006392 | orchestrator | + port = (known after apply) 2026-01-06 00:02:23.006396 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.006400 | orchestrator | } 2026-01-06 00:02:23.006403 | orchestrator | } 2026-01-06 00:02:23.006554 | orchestrator | 2026-01-06 00:02:23.006563 | orchestrator | # openstack_compute_instance_v2.node_server[2] will be created 2026-01-06 00:02:23.006567 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-01-06 00:02:23.006571 | orchestrator | + access_ip_v4 = (known after apply) 2026-01-06 00:02:23.006575 | orchestrator | + access_ip_v6 = (known after apply) 2026-01-06 00:02:23.006579 | orchestrator | + all_metadata = (known after apply) 2026-01-06 00:02:23.006583 | orchestrator | + all_tags = (known after apply) 2026-01-06 00:02:23.006595 | orchestrator | + availability_zone = "nova" 2026-01-06 00:02:23.006599 | orchestrator | + config_drive = true 2026-01-06 00:02:23.006603 | orchestrator | + created = (known after apply) 2026-01-06 00:02:23.006606 | orchestrator | + flavor_id = (known after apply) 2026-01-06 00:02:23.006610 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-01-06 00:02:23.006614 | orchestrator | + force_delete = false 2026-01-06 00:02:23.006618 | orchestrator | + hypervisor_hostname = (known after apply) 2026-01-06 00:02:23.006622 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.006625 | orchestrator | + image_id = (known after apply) 2026-01-06 00:02:23.006634 | orchestrator | + image_name = (known after apply) 2026-01-06 00:02:23.006639 | orchestrator | + key_pair = "testbed" 2026-01-06 00:02:23.006642 | orchestrator | + name = 
"testbed-node-2" 2026-01-06 00:02:23.006646 | orchestrator | + power_state = "active" 2026-01-06 00:02:23.006650 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.006654 | orchestrator | + security_groups = (known after apply) 2026-01-06 00:02:23.006658 | orchestrator | + stop_before_destroy = false 2026-01-06 00:02:23.006662 | orchestrator | + updated = (known after apply) 2026-01-06 00:02:23.006665 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-01-06 00:02:23.006669 | orchestrator | 2026-01-06 00:02:23.006673 | orchestrator | + block_device { 2026-01-06 00:02:23.006677 | orchestrator | + boot_index = 0 2026-01-06 00:02:23.006681 | orchestrator | + delete_on_termination = false 2026-01-06 00:02:23.006685 | orchestrator | + destination_type = "volume" 2026-01-06 00:02:23.006688 | orchestrator | + multiattach = false 2026-01-06 00:02:23.006692 | orchestrator | + source_type = "volume" 2026-01-06 00:02:23.006696 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.006700 | orchestrator | } 2026-01-06 00:02:23.006704 | orchestrator | 2026-01-06 00:02:23.006708 | orchestrator | + network { 2026-01-06 00:02:23.006712 | orchestrator | + access_network = false 2026-01-06 00:02:23.006715 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-01-06 00:02:23.006719 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-01-06 00:02:23.006723 | orchestrator | + mac = (known after apply) 2026-01-06 00:02:23.006727 | orchestrator | + name = (known after apply) 2026-01-06 00:02:23.006731 | orchestrator | + port = (known after apply) 2026-01-06 00:02:23.006734 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.006738 | orchestrator | } 2026-01-06 00:02:23.006742 | orchestrator | } 2026-01-06 00:02:23.006769 | orchestrator | 2026-01-06 00:02:23.006775 | orchestrator | # openstack_compute_instance_v2.node_server[3] will be created 2026-01-06 00:02:23.006778 | orchestrator | + resource 
"openstack_compute_instance_v2" "node_server" { 2026-01-06 00:02:23.006782 | orchestrator | + access_ip_v4 = (known after apply) 2026-01-06 00:02:23.006786 | orchestrator | + access_ip_v6 = (known after apply) 2026-01-06 00:02:23.006790 | orchestrator | + all_metadata = (known after apply) 2026-01-06 00:02:23.006794 | orchestrator | + all_tags = (known after apply) 2026-01-06 00:02:23.006798 | orchestrator | + availability_zone = "nova" 2026-01-06 00:02:23.006801 | orchestrator | + config_drive = true 2026-01-06 00:02:23.006805 | orchestrator | + created = (known after apply) 2026-01-06 00:02:23.006809 | orchestrator | + flavor_id = (known after apply) 2026-01-06 00:02:23.006813 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-01-06 00:02:23.006817 | orchestrator | + force_delete = false 2026-01-06 00:02:23.006820 | orchestrator | + hypervisor_hostname = (known after apply) 2026-01-06 00:02:23.006824 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.006828 | orchestrator | + image_id = (known after apply) 2026-01-06 00:02:23.006832 | orchestrator | + image_name = (known after apply) 2026-01-06 00:02:23.006836 | orchestrator | + key_pair = "testbed" 2026-01-06 00:02:23.006840 | orchestrator | + name = "testbed-node-3" 2026-01-06 00:02:23.006843 | orchestrator | + power_state = "active" 2026-01-06 00:02:23.006847 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.006851 | orchestrator | + security_groups = (known after apply) 2026-01-06 00:02:23.006855 | orchestrator | + stop_before_destroy = false 2026-01-06 00:02:23.006859 | orchestrator | + updated = (known after apply) 2026-01-06 00:02:23.006863 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-01-06 00:02:23.006866 | orchestrator | 2026-01-06 00:02:23.006870 | orchestrator | + block_device { 2026-01-06 00:02:23.006878 | orchestrator | + boot_index = 0 2026-01-06 00:02:23.006882 | orchestrator | + delete_on_termination = false 2026-01-06 
00:02:23.006886 | orchestrator | + destination_type = "volume" 2026-01-06 00:02:23.006894 | orchestrator | + multiattach = false 2026-01-06 00:02:23.006898 | orchestrator | + source_type = "volume" 2026-01-06 00:02:23.006902 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.006906 | orchestrator | } 2026-01-06 00:02:23.006910 | orchestrator | 2026-01-06 00:02:23.006913 | orchestrator | + network { 2026-01-06 00:02:23.006917 | orchestrator | + access_network = false 2026-01-06 00:02:23.006921 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-01-06 00:02:23.006925 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-01-06 00:02:23.006929 | orchestrator | + mac = (known after apply) 2026-01-06 00:02:23.006933 | orchestrator | + name = (known after apply) 2026-01-06 00:02:23.006936 | orchestrator | + port = (known after apply) 2026-01-06 00:02:23.006940 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.006944 | orchestrator | } 2026-01-06 00:02:23.006948 | orchestrator | } 2026-01-06 00:02:23.007135 | orchestrator | 2026-01-06 00:02:23.007142 | orchestrator | # openstack_compute_instance_v2.node_server[4] will be created 2026-01-06 00:02:23.007146 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-01-06 00:02:23.007150 | orchestrator | + access_ip_v4 = (known after apply) 2026-01-06 00:02:23.007154 | orchestrator | + access_ip_v6 = (known after apply) 2026-01-06 00:02:23.007157 | orchestrator | + all_metadata = (known after apply) 2026-01-06 00:02:23.007161 | orchestrator | + all_tags = (known after apply) 2026-01-06 00:02:23.007165 | orchestrator | + availability_zone = "nova" 2026-01-06 00:02:23.007169 | orchestrator | + config_drive = true 2026-01-06 00:02:23.007172 | orchestrator | + created = (known after apply) 2026-01-06 00:02:23.007176 | orchestrator | + flavor_id = (known after apply) 2026-01-06 00:02:23.007180 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-01-06 00:02:23.007184 | 
orchestrator | + force_delete = false 2026-01-06 00:02:23.007187 | orchestrator | + hypervisor_hostname = (known after apply) 2026-01-06 00:02:23.007191 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.007195 | orchestrator | + image_id = (known after apply) 2026-01-06 00:02:23.007199 | orchestrator | + image_name = (known after apply) 2026-01-06 00:02:23.007202 | orchestrator | + key_pair = "testbed" 2026-01-06 00:02:23.007206 | orchestrator | + name = "testbed-node-4" 2026-01-06 00:02:23.007210 | orchestrator | + power_state = "active" 2026-01-06 00:02:23.007224 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.007229 | orchestrator | + security_groups = (known after apply) 2026-01-06 00:02:23.007232 | orchestrator | + stop_before_destroy = false 2026-01-06 00:02:23.007236 | orchestrator | + updated = (known after apply) 2026-01-06 00:02:23.007240 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-01-06 00:02:23.007244 | orchestrator | 2026-01-06 00:02:23.007248 | orchestrator | + block_device { 2026-01-06 00:02:23.007251 | orchestrator | + boot_index = 0 2026-01-06 00:02:23.007255 | orchestrator | + delete_on_termination = false 2026-01-06 00:02:23.007259 | orchestrator | + destination_type = "volume" 2026-01-06 00:02:23.007263 | orchestrator | + multiattach = false 2026-01-06 00:02:23.007266 | orchestrator | + source_type = "volume" 2026-01-06 00:02:23.007270 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.007274 | orchestrator | } 2026-01-06 00:02:23.007278 | orchestrator | 2026-01-06 00:02:23.007282 | orchestrator | + network { 2026-01-06 00:02:23.007285 | orchestrator | + access_network = false 2026-01-06 00:02:23.007289 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-01-06 00:02:23.007293 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-01-06 00:02:23.007297 | orchestrator | + mac = (known after apply) 2026-01-06 00:02:23.007301 | orchestrator | + name = (known 
after apply) 2026-01-06 00:02:23.007304 | orchestrator | + port = (known after apply) 2026-01-06 00:02:23.007308 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.007312 | orchestrator | } 2026-01-06 00:02:23.007316 | orchestrator | } 2026-01-06 00:02:23.007414 | orchestrator | 2026-01-06 00:02:23.007420 | orchestrator | # openstack_compute_instance_v2.node_server[5] will be created 2026-01-06 00:02:23.007424 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-01-06 00:02:23.007428 | orchestrator | + access_ip_v4 = (known after apply) 2026-01-06 00:02:23.007432 | orchestrator | + access_ip_v6 = (known after apply) 2026-01-06 00:02:23.007435 | orchestrator | + all_metadata = (known after apply) 2026-01-06 00:02:23.007439 | orchestrator | + all_tags = (known after apply) 2026-01-06 00:02:23.007443 | orchestrator | + availability_zone = "nova" 2026-01-06 00:02:23.007447 | orchestrator | + config_drive = true 2026-01-06 00:02:23.007451 | orchestrator | + created = (known after apply) 2026-01-06 00:02:23.007454 | orchestrator | + flavor_id = (known after apply) 2026-01-06 00:02:23.007458 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-01-06 00:02:23.007462 | orchestrator | + force_delete = false 2026-01-06 00:02:23.007469 | orchestrator | + hypervisor_hostname = (known after apply) 2026-01-06 00:02:23.007473 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.007477 | orchestrator | + image_id = (known after apply) 2026-01-06 00:02:23.007481 | orchestrator | + image_name = (known after apply) 2026-01-06 00:02:23.007485 | orchestrator | + key_pair = "testbed" 2026-01-06 00:02:23.007489 | orchestrator | + name = "testbed-node-5" 2026-01-06 00:02:23.007493 | orchestrator | + power_state = "active" 2026-01-06 00:02:23.007496 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.007500 | orchestrator | + security_groups = (known after apply) 2026-01-06 00:02:23.007504 | orchestrator | + 
stop_before_destroy = false 2026-01-06 00:02:23.007508 | orchestrator | + updated = (known after apply) 2026-01-06 00:02:23.007511 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-01-06 00:02:23.007515 | orchestrator | 2026-01-06 00:02:23.007519 | orchestrator | + block_device { 2026-01-06 00:02:23.007523 | orchestrator | + boot_index = 0 2026-01-06 00:02:23.007527 | orchestrator | + delete_on_termination = false 2026-01-06 00:02:23.007530 | orchestrator | + destination_type = "volume" 2026-01-06 00:02:23.007534 | orchestrator | + multiattach = false 2026-01-06 00:02:23.007538 | orchestrator | + source_type = "volume" 2026-01-06 00:02:23.007542 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.007545 | orchestrator | } 2026-01-06 00:02:23.007549 | orchestrator | 2026-01-06 00:02:23.007553 | orchestrator | + network { 2026-01-06 00:02:23.007557 | orchestrator | + access_network = false 2026-01-06 00:02:23.007561 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-01-06 00:02:23.007565 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-01-06 00:02:23.007568 | orchestrator | + mac = (known after apply) 2026-01-06 00:02:23.007572 | orchestrator | + name = (known after apply) 2026-01-06 00:02:23.007576 | orchestrator | + port = (known after apply) 2026-01-06 00:02:23.007580 | orchestrator | + uuid = (known after apply) 2026-01-06 00:02:23.007584 | orchestrator | } 2026-01-06 00:02:23.007587 | orchestrator | } 2026-01-06 00:02:23.007594 | orchestrator | 2026-01-06 00:02:23.007597 | orchestrator | # openstack_compute_keypair_v2.key will be created 2026-01-06 00:02:23.007601 | orchestrator | + resource "openstack_compute_keypair_v2" "key" { 2026-01-06 00:02:23.007605 | orchestrator | + fingerprint = (known after apply) 2026-01-06 00:02:23.007609 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.007613 | orchestrator | + name = "testbed" 2026-01-06 00:02:23.007616 | orchestrator | + private_key = 
(sensitive value) 2026-01-06 00:02:23.007620 | orchestrator | + public_key = (known after apply) 2026-01-06 00:02:23.007624 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.007628 | orchestrator | + user_id = (known after apply) 2026-01-06 00:02:23.007632 | orchestrator | } 2026-01-06 00:02:23.007636 | orchestrator | 2026-01-06 00:02:23.007639 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[0] will be created 2026-01-06 00:02:23.007643 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2026-01-06 00:02:23.007651 | orchestrator | + device = (known after apply) 2026-01-06 00:02:23.007655 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.007659 | orchestrator | + instance_id = (known after apply) 2026-01-06 00:02:23.007663 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.007666 | orchestrator | + volume_id = (known after apply) 2026-01-06 00:02:23.007670 | orchestrator | } 2026-01-06 00:02:23.007674 | orchestrator | 2026-01-06 00:02:23.007678 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[1] will be created 2026-01-06 00:02:23.007682 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2026-01-06 00:02:23.007686 | orchestrator | + device = (known after apply) 2026-01-06 00:02:23.007690 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.007693 | orchestrator | + instance_id = (known after apply) 2026-01-06 00:02:23.007697 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.007701 | orchestrator | + volume_id = (known after apply) 2026-01-06 00:02:23.007705 | orchestrator | } 2026-01-06 00:02:23.007708 | orchestrator | 2026-01-06 00:02:23.007712 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[2] will be created 2026-01-06 00:02:23.007716 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" 
{ 2026-01-06 00:02:23.007720 | orchestrator | + device = (known after apply) 2026-01-06 00:02:23.007724 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.007728 | orchestrator | + instance_id = (known after apply) 2026-01-06 00:02:23.007731 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.007735 | orchestrator | + volume_id = (known after apply) 2026-01-06 00:02:23.007739 | orchestrator | } 2026-01-06 00:02:23.007745 | orchestrator | 2026-01-06 00:02:23.007749 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[3] will be created 2026-01-06 00:02:23.007753 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2026-01-06 00:02:23.007757 | orchestrator | + device = (known after apply) 2026-01-06 00:02:23.007760 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.007764 | orchestrator | + instance_id = (known after apply) 2026-01-06 00:02:23.007768 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.007772 | orchestrator | + volume_id = (known after apply) 2026-01-06 00:02:23.007776 | orchestrator | } 2026-01-06 00:02:23.007779 | orchestrator | 2026-01-06 00:02:23.007783 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[4] will be created 2026-01-06 00:02:23.007787 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2026-01-06 00:02:23.007791 | orchestrator | + device = (known after apply) 2026-01-06 00:02:23.007795 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.007798 | orchestrator | + instance_id = (known after apply) 2026-01-06 00:02:23.007805 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.007809 | orchestrator | + volume_id = (known after apply) 2026-01-06 00:02:23.007813 | orchestrator | } 2026-01-06 00:02:23.007817 | orchestrator | 2026-01-06 00:02:23.007821 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[5] 
will be created 2026-01-06 00:02:23.007825 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2026-01-06 00:02:23.007828 | orchestrator | + device = (known after apply) 2026-01-06 00:02:23.007832 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.007836 | orchestrator | + instance_id = (known after apply) 2026-01-06 00:02:23.007840 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.007844 | orchestrator | + volume_id = (known after apply) 2026-01-06 00:02:23.007848 | orchestrator | } 2026-01-06 00:02:23.007851 | orchestrator | 2026-01-06 00:02:23.007855 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[6] will be created 2026-01-06 00:02:23.007859 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2026-01-06 00:02:23.007863 | orchestrator | + device = (known after apply) 2026-01-06 00:02:23.007867 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.007870 | orchestrator | + instance_id = (known after apply) 2026-01-06 00:02:23.007874 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.007881 | orchestrator | + volume_id = (known after apply) 2026-01-06 00:02:23.007885 | orchestrator | } 2026-01-06 00:02:23.007891 | orchestrator | 2026-01-06 00:02:23.007895 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[7] will be created 2026-01-06 00:02:23.007899 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2026-01-06 00:02:23.007903 | orchestrator | + device = (known after apply) 2026-01-06 00:02:23.007907 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.007911 | orchestrator | + instance_id = (known after apply) 2026-01-06 00:02:23.007914 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.007918 | orchestrator | + volume_id = (known after apply) 2026-01-06 00:02:23.007922 | orchestrator | } 2026-01-06 
  # openstack_compute_volume_attach_v2.node_volume_attachment[8] will be created
  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      + device      = (known after apply)
      + id          = (known after apply)
      + instance_id = (known after apply)
      + region      = (known after apply)
      + volume_id   = (known after apply)
    }

  # openstack_networking_floatingip_associate_v2.manager_floating_ip_association will be created
  + resource "openstack_networking_floatingip_associate_v2" "manager_floating_ip_association" {
      + fixed_ip    = (known after apply)
      + floating_ip = (known after apply)
      + id          = (known after apply)
      + port_id     = (known after apply)
      + region      = (known after apply)
    }

  # openstack_networking_floatingip_v2.manager_floating_ip will be created
  + resource "openstack_networking_floatingip_v2" "manager_floating_ip" {
      + address    = (known after apply)
      + all_tags   = (known after apply)
      + dns_domain = (known after apply)
      + dns_name   = (known after apply)
      + fixed_ip   = (known after apply)
      + id         = (known after apply)
      + pool       = "public"
      + port_id    = (known after apply)
      + region     = (known after apply)
      + subnet_id  = (known after apply)
      + tenant_id  = (known after apply)
    }

  # openstack_networking_network_v2.net_management will be created
  + resource "openstack_networking_network_v2" "net_management" {
      + admin_state_up          = (known after apply)
      + all_tags                = (known after apply)
      + availability_zone_hints = [
          + "nova",
        ]
      + dns_domain              = (known after apply)
      + external                = (known after apply)
      + id                      = (known after apply)
      + mtu                     = (known after apply)
      + name                    = "net-testbed-management"
      + port_security_enabled   = (known after apply)
      + qos_policy_id           = (known after apply)
      + region                  = (known after apply)
      + shared                  = (known after apply)
      + tenant_id               = (known after apply)
      + transparent_vlan        = (known after apply)

      + segments (known after apply)
    }

  # openstack_networking_port_v2.manager_port_management will be created
  + resource "openstack_networking_port_v2" "manager_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)

      + allowed_address_pairs {
          + ip_address = "192.168.16.8/32"
        }

      + binding (known after apply)

      + fixed_ip {
          + ip_address = "192.168.16.5"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_port_v2.node_port_management[0] will be created
  + resource "openstack_networking_port_v2" "node_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)

      + allowed_address_pairs {
          + ip_address = "192.168.16.254/32"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/32"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/32"
        }

      + binding (known after apply)

      + fixed_ip {
          + ip_address = "192.168.16.10"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_port_v2.node_port_management[1] will be created
  + resource "openstack_networking_port_v2" "node_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)

      + allowed_address_pairs {
          + ip_address = "192.168.16.254/32"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/32"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/32"
        }

      + binding (known after apply)

      + fixed_ip {
          + ip_address = "192.168.16.11"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_port_v2.node_port_management[2] will be created
  + resource "openstack_networking_port_v2" "node_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)

      + allowed_address_pairs {
          + ip_address = "192.168.16.254/32"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/32"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/32"
        }

      + binding (known after apply)

      + fixed_ip {
          + ip_address = "192.168.16.12"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_port_v2.node_port_management[3] will be created
  + resource "openstack_networking_port_v2" "node_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)

      + allowed_address_pairs {
          + ip_address = "192.168.16.254/32"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/32"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/32"
        }

      + binding (known after apply)

      + fixed_ip {
          + ip_address = "192.168.16.13"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_port_v2.node_port_management[4] will be created
  + resource "openstack_networking_port_v2" "node_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)

      + allowed_address_pairs {
          + ip_address = "192.168.16.254/32"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/32"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/32"
        }

      + binding (known after apply)

      + fixed_ip {
          + ip_address = "192.168.16.14"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_port_v2.node_port_management[5] will be created
  + resource "openstack_networking_port_v2" "node_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)

      + allowed_address_pairs {
          + ip_address = "192.168.16.254/32"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/32"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/32"
        }

      + binding (known after apply)

      + fixed_ip {
          + ip_address = "192.168.16.15"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_router_interface_v2.router_interface will be created
  + resource "openstack_networking_router_interface_v2" "router_interface" {
      + force_destroy = false
      + id            = (known after apply)
      + port_id       = (known after apply)
      + region        = (known after apply)
      + router_id     = (known after apply)
      + subnet_id     = (known after apply)
    }

  # openstack_networking_router_v2.router will be created
  + resource "openstack_networking_router_v2" "router" {
      + admin_state_up          = (known after apply)
      + all_tags                = (known after apply)
      + availability_zone_hints = [
          + "nova",
        ]
      + distributed             = (known after apply)
      + enable_snat             = (known after apply)
      + external_network_id     = "e6be7364-bfd8-4de7-8120-8f41c69a139a"
      + external_qos_policy_id  = (known after apply)
      + id                      = (known after apply)
      + name                    = "testbed"
      + region                  = (known after apply)
      + tenant_id               = (known after apply)

      + external_fixed_ip (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_management_rule1 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule1" {
      + description             = "ssh"
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + port_range_max          = 22
      + port_range_min          = 22
      + protocol                = "tcp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "0.0.0.0/0"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }
  # openstack_networking_secgroup_rule_v2.security_group_management_rule2 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule2" {
      + description             = "wireguard"
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + port_range_max          = 51820
      + port_range_min          = 51820
      + protocol                = "udp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "0.0.0.0/0"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_management_rule3 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule3" {
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + protocol                = "tcp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "192.168.16.0/20"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_management_rule4 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule4" {
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + protocol                = "udp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "192.168.16.0/20"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_management_rule5 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule5" {
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + protocol                = "icmp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "0.0.0.0/0"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_node_rule1 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule1" {
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + protocol                = "tcp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "0.0.0.0/0"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_node_rule2 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule2" {
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + protocol                = "udp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "0.0.0.0/0"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_node_rule3 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule3" {
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + protocol                = "icmp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "0.0.0.0/0"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }
00:02:23.013080 | orchestrator | # openstack_networking_secgroup_rule_v2.security_group_rule_vrrp will be created 2026-01-06 00:02:23.013084 | orchestrator | + resource "openstack_networking_secgroup_rule_v2" "security_group_rule_vrrp" { 2026-01-06 00:02:23.013088 | orchestrator | + description = "vrrp" 2026-01-06 00:02:23.013092 | orchestrator | + direction = "ingress" 2026-01-06 00:02:23.013096 | orchestrator | + ethertype = "IPv4" 2026-01-06 00:02:23.013100 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.013104 | orchestrator | + protocol = "112" 2026-01-06 00:02:23.013107 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.013111 | orchestrator | + remote_address_group_id = (known after apply) 2026-01-06 00:02:23.013115 | orchestrator | + remote_group_id = (known after apply) 2026-01-06 00:02:23.013119 | orchestrator | + remote_ip_prefix = "0.0.0.0/0" 2026-01-06 00:02:23.013123 | orchestrator | + security_group_id = (known after apply) 2026-01-06 00:02:23.013127 | orchestrator | + tenant_id = (known after apply) 2026-01-06 00:02:23.013130 | orchestrator | } 2026-01-06 00:02:23.013134 | orchestrator | 2026-01-06 00:02:23.013138 | orchestrator | # openstack_networking_secgroup_v2.security_group_management will be created 2026-01-06 00:02:23.013142 | orchestrator | + resource "openstack_networking_secgroup_v2" "security_group_management" { 2026-01-06 00:02:23.013146 | orchestrator | + all_tags = (known after apply) 2026-01-06 00:02:23.013150 | orchestrator | + description = "management security group" 2026-01-06 00:02:23.013154 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.013157 | orchestrator | + name = "testbed-management" 2026-01-06 00:02:23.013161 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.013165 | orchestrator | + stateful = (known after apply) 2026-01-06 00:02:23.013169 | orchestrator | + tenant_id = (known after apply) 2026-01-06 00:02:23.013172 | orchestrator | } 2026-01-06 
00:02:23.013178 | orchestrator | 2026-01-06 00:02:23.013182 | orchestrator | # openstack_networking_secgroup_v2.security_group_node will be created 2026-01-06 00:02:23.013186 | orchestrator | + resource "openstack_networking_secgroup_v2" "security_group_node" { 2026-01-06 00:02:23.013190 | orchestrator | + all_tags = (known after apply) 2026-01-06 00:02:23.013193 | orchestrator | + description = "node security group" 2026-01-06 00:02:23.013197 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.013201 | orchestrator | + name = "testbed-node" 2026-01-06 00:02:23.013205 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.013209 | orchestrator | + stateful = (known after apply) 2026-01-06 00:02:23.013212 | orchestrator | + tenant_id = (known after apply) 2026-01-06 00:02:23.013216 | orchestrator | } 2026-01-06 00:02:23.013222 | orchestrator | 2026-01-06 00:02:23.013226 | orchestrator | # openstack_networking_subnet_v2.subnet_management will be created 2026-01-06 00:02:23.013230 | orchestrator | + resource "openstack_networking_subnet_v2" "subnet_management" { 2026-01-06 00:02:23.013234 | orchestrator | + all_tags = (known after apply) 2026-01-06 00:02:23.013237 | orchestrator | + cidr = "192.168.16.0/20" 2026-01-06 00:02:23.013241 | orchestrator | + dns_nameservers = [ 2026-01-06 00:02:23.013245 | orchestrator | + "8.8.8.8", 2026-01-06 00:02:23.013249 | orchestrator | + "9.9.9.9", 2026-01-06 00:02:23.013253 | orchestrator | ] 2026-01-06 00:02:23.013257 | orchestrator | + enable_dhcp = true 2026-01-06 00:02:23.013261 | orchestrator | + gateway_ip = (known after apply) 2026-01-06 00:02:23.013265 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.013269 | orchestrator | + ip_version = 4 2026-01-06 00:02:23.013273 | orchestrator | + ipv6_address_mode = (known after apply) 2026-01-06 00:02:23.013276 | orchestrator | + ipv6_ra_mode = (known after apply) 2026-01-06 00:02:23.013280 | orchestrator | + name = "subnet-testbed-management" 
2026-01-06 00:02:23.013284 | orchestrator | + network_id = (known after apply) 2026-01-06 00:02:23.013288 | orchestrator | + no_gateway = false 2026-01-06 00:02:23.013292 | orchestrator | + region = (known after apply) 2026-01-06 00:02:23.013296 | orchestrator | + service_types = (known after apply) 2026-01-06 00:02:23.013302 | orchestrator | + tenant_id = (known after apply) 2026-01-06 00:02:23.013306 | orchestrator | 2026-01-06 00:02:23.013310 | orchestrator | + allocation_pool { 2026-01-06 00:02:23.013314 | orchestrator | + end = "192.168.31.250" 2026-01-06 00:02:23.013324 | orchestrator | + start = "192.168.31.200" 2026-01-06 00:02:23.013328 | orchestrator | } 2026-01-06 00:02:23.013332 | orchestrator | } 2026-01-06 00:02:23.013338 | orchestrator | 2026-01-06 00:02:23.013341 | orchestrator | # terraform_data.image will be created 2026-01-06 00:02:23.013345 | orchestrator | + resource "terraform_data" "image" { 2026-01-06 00:02:23.013349 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.013353 | orchestrator | + input = "Ubuntu 24.04" 2026-01-06 00:02:23.013357 | orchestrator | + output = (known after apply) 2026-01-06 00:02:23.013360 | orchestrator | } 2026-01-06 00:02:23.013364 | orchestrator | 2026-01-06 00:02:23.013368 | orchestrator | # terraform_data.image_node will be created 2026-01-06 00:02:23.013372 | orchestrator | + resource "terraform_data" "image_node" { 2026-01-06 00:02:23.013376 | orchestrator | + id = (known after apply) 2026-01-06 00:02:23.013379 | orchestrator | + input = "Ubuntu 24.04" 2026-01-06 00:02:23.013383 | orchestrator | + output = (known after apply) 2026-01-06 00:02:23.013387 | orchestrator | } 2026-01-06 00:02:23.013391 | orchestrator | 2026-01-06 00:02:23.013395 | orchestrator | Plan: 64 to add, 0 to change, 0 to destroy. 
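The plan above places an allocation pool of 192.168.31.200–192.168.31.250 inside the 192.168.16.0/20 management subnet. A minimal sketch of the CIDR math, using only values visible in the plan output (the variable names are illustrative, not taken from the testbed's Terraform source):

```python
import ipaddress

# Values copied from the plan output above; names are illustrative only.
network = ipaddress.ip_network("192.168.16.0/20")      # subnet-testbed-management CIDR
pool_start = ipaddress.ip_address("192.168.31.200")    # allocation_pool.start
pool_end = ipaddress.ip_address("192.168.31.250")      # allocation_pool.end

# Neutron requires the allocation pool to sit inside the subnet CIDR,
# with start <= end; this mirrors that sanity check.
assert pool_start in network and pool_end in network
assert pool_start < pool_end

pool_size = int(pool_end) - int(pool_start) + 1
print(f"{network.num_addresses} addresses in /20, {pool_size} in the DHCP pool")
# → 4096 addresses in /20, 51 in the DHCP pool
```

This confirms the /20 spans 192.168.16.0–192.168.31.255, so the pool at the top of that range is valid.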
2026-01-06 00:02:23.013398 | orchestrator | 2026-01-06 00:02:23.013402 | orchestrator | Changes to Outputs: 2026-01-06 00:02:23.013406 | orchestrator | + manager_address = (sensitive value) 2026-01-06 00:02:23.013410 | orchestrator | + private_key = (sensitive value) 2026-01-06 00:02:23.132764 | orchestrator | terraform_data.image: Creating... 2026-01-06 00:02:23.132858 | orchestrator | terraform_data.image_node: Creating... 2026-01-06 00:02:23.267901 | orchestrator | terraform_data.image_node: Creation complete after 0s [id=1c12166e-95c3-ceec-547f-be4c5bbd0b7d] 2026-01-06 00:02:23.267990 | orchestrator | terraform_data.image: Creation complete after 0s [id=a1fda625-e1d8-81fb-ce8e-662e3f23112b] 2026-01-06 00:02:23.277717 | orchestrator | data.openstack_images_image_v2.image_node: Reading... 2026-01-06 00:02:23.277785 | orchestrator | data.openstack_images_image_v2.image: Reading... 2026-01-06 00:02:23.303880 | orchestrator | openstack_compute_keypair_v2.key: Creating... 2026-01-06 00:02:23.305130 | orchestrator | openstack_networking_network_v2.net_management: Creating... 2026-01-06 00:02:23.342109 | orchestrator | openstack_blockstorage_volume_v3.node_volume[2]: Creating... 2026-01-06 00:02:23.342171 | orchestrator | openstack_blockstorage_volume_v3.node_volume[0]: Creating... 2026-01-06 00:02:23.346503 | orchestrator | openstack_blockstorage_volume_v3.node_volume[7]: Creating... 2026-01-06 00:02:23.347368 | orchestrator | openstack_blockstorage_volume_v3.node_volume[4]: Creating... 2026-01-06 00:02:23.358934 | orchestrator | openstack_blockstorage_volume_v3.node_volume[8]: Creating... 2026-01-06 00:02:23.358974 | orchestrator | openstack_blockstorage_volume_v3.node_volume[6]: Creating... 2026-01-06 00:02:23.798657 | orchestrator | data.openstack_images_image_v2.image: Read complete after 1s [id=846820b2-039e-4b42-adad-daf72e0f8ea4] 2026-01-06 00:02:23.808117 | orchestrator | openstack_blockstorage_volume_v3.node_volume[5]: Creating... 
2026-01-06 00:02:23.808167 | orchestrator | data.openstack_images_image_v2.image_node: Read complete after 1s [id=846820b2-039e-4b42-adad-daf72e0f8ea4] 2026-01-06 00:02:23.814622 | orchestrator | openstack_blockstorage_volume_v3.node_volume[1]: Creating... 2026-01-06 00:02:23.832483 | orchestrator | openstack_compute_keypair_v2.key: Creation complete after 1s [id=testbed] 2026-01-06 00:02:23.835246 | orchestrator | openstack_blockstorage_volume_v3.node_volume[3]: Creating... 2026-01-06 00:02:24.464888 | orchestrator | openstack_networking_network_v2.net_management: Creation complete after 1s [id=76ac0a0b-10d4-4665-82db-54f3cca511e7] 2026-01-06 00:02:24.472406 | orchestrator | openstack_blockstorage_volume_v3.manager_base_volume[0]: Creating... 2026-01-06 00:02:26.953850 | orchestrator | openstack_blockstorage_volume_v3.node_volume[7]: Creation complete after 4s [id=8cc5ffc1-09fb-4fde-a97f-bcebb46dacb6] 2026-01-06 00:02:26.964195 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[2]: Creating... 2026-01-06 00:02:26.975549 | orchestrator | openstack_blockstorage_volume_v3.node_volume[6]: Creation complete after 4s [id=3d039a44-dced-4ba6-a79b-af7290a238ac] 2026-01-06 00:02:26.985752 | orchestrator | local_file.id_rsa_pub: Creating... 2026-01-06 00:02:26.990189 | orchestrator | local_file.id_rsa_pub: Creation complete after 0s [id=819f2945c05e00aa58cc58e462e724eb4787f7a4] 2026-01-06 00:02:26.995927 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[3]: Creating... 2026-01-06 00:02:27.006981 | orchestrator | openstack_blockstorage_volume_v3.node_volume[0]: Creation complete after 4s [id=dc9d4d24-a01d-4baf-85b5-da8c88609604] 2026-01-06 00:02:27.007551 | orchestrator | openstack_blockstorage_volume_v3.node_volume[2]: Creation complete after 4s [id=2e071fd2-3317-4a54-af1f-e9b7971267a3] 2026-01-06 00:02:27.022598 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[0]: Creating... 
2026-01-06 00:02:27.022674 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[5]: Creating... 2026-01-06 00:02:27.024309 | orchestrator | openstack_blockstorage_volume_v3.node_volume[4]: Creation complete after 4s [id=ea69e1b5-a504-41c3-bb3a-5961a07ea8a6] 2026-01-06 00:02:27.033164 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[1]: Creating... 2026-01-06 00:02:27.042264 | orchestrator | openstack_blockstorage_volume_v3.node_volume[8]: Creation complete after 4s [id=a9899c49-22e0-485a-be63-69bc9e218eb5] 2026-01-06 00:02:27.049141 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[4]: Creating... 2026-01-06 00:02:27.108924 | orchestrator | openstack_blockstorage_volume_v3.node_volume[5]: Creation complete after 3s [id=f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59] 2026-01-06 00:02:27.113852 | orchestrator | openstack_blockstorage_volume_v3.node_volume[1]: Creation complete after 3s [id=724a4878-ca4e-4a20-84cd-e8427809d585] 2026-01-06 00:02:27.118485 | orchestrator | local_sensitive_file.id_rsa: Creating... 2026-01-06 00:02:27.124335 | orchestrator | local_sensitive_file.id_rsa: Creation complete after 0s [id=3684904a331b5d74af0cebecbb3ebb5503b6d555] 2026-01-06 00:02:27.124373 | orchestrator | openstack_blockstorage_volume_v3.node_volume[3]: Creation complete after 3s [id=d326b17f-2106-48eb-aaa2-fe8346fab088] 2026-01-06 00:02:27.127346 | orchestrator | openstack_networking_subnet_v2.subnet_management: Creating... 2026-01-06 00:02:27.835123 | orchestrator | openstack_blockstorage_volume_v3.manager_base_volume[0]: Creation complete after 4s [id=0d6109b0-76d6-4029-bdde-f641f6e77ce6] 2026-01-06 00:02:28.385134 | orchestrator | openstack_networking_subnet_v2.subnet_management: Creation complete after 1s [id=859970b7-8327-44d9-bafc-acafd84d21f1] 2026-01-06 00:02:28.392105 | orchestrator | openstack_networking_router_v2.router: Creating... 
2026-01-06 00:02:30.348027 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[2]: Creation complete after 3s [id=9143344a-f7a0-4978-a962-686e689e6a1f] 2026-01-06 00:02:30.398156 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[3]: Creation complete after 3s [id=47504f77-6654-4579-a6ab-2ab6ea64e907] 2026-01-06 00:02:30.419984 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[5]: Creation complete after 3s [id=f5c4e88c-4c87-4f6b-a240-eabfb6d80c22] 2026-01-06 00:02:30.469094 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[0]: Creation complete after 3s [id=7779f290-0ef0-4a1b-8fc8-5ce02b31935f] 2026-01-06 00:02:30.490747 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[1]: Creation complete after 3s [id=3d7a3417-f72a-4d7a-b186-e6ea45f4b5be] 2026-01-06 00:02:30.512195 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[4]: Creation complete after 4s [id=a80b48fc-f175-43ec-b2c4-9074b67ccf1a] 2026-01-06 00:02:32.101318 | orchestrator | openstack_networking_router_v2.router: Creation complete after 4s [id=1f5cb48f-d70f-4444-95a9-90d0dde8f976] 2026-01-06 00:02:32.243214 | orchestrator | openstack_networking_secgroup_v2.security_group_node: Creating... 2026-01-06 00:02:32.243392 | orchestrator | openstack_networking_router_interface_v2.router_interface: Creating... 2026-01-06 00:02:32.243413 | orchestrator | openstack_networking_secgroup_v2.security_group_management: Creating... 2026-01-06 00:02:32.483638 | orchestrator | openstack_networking_secgroup_v2.security_group_node: Creation complete after 0s [id=94e66ddf-caa1-423a-9414-beaad762d0f8] 2026-01-06 00:02:32.500079 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creating... 2026-01-06 00:02:32.500375 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creating... 
2026-01-06 00:02:32.501387 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creating... 2026-01-06 00:02:32.504365 | orchestrator | openstack_networking_port_v2.node_port_management[5]: Creating... 2026-01-06 00:02:32.509959 | orchestrator | openstack_networking_port_v2.node_port_management[3]: Creating... 2026-01-06 00:02:32.513132 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creating... 2026-01-06 00:02:32.515629 | orchestrator | openstack_networking_port_v2.node_port_management[4]: Creating... 2026-01-06 00:02:32.517473 | orchestrator | openstack_networking_port_v2.node_port_management[2]: Creating... 2026-01-06 00:02:32.544452 | orchestrator | openstack_networking_secgroup_v2.security_group_management: Creation complete after 1s [id=2fdd7d4c-3800-46ca-9853-fb52e658f5a4] 2026-01-06 00:02:32.553717 | orchestrator | openstack_networking_port_v2.node_port_management[0]: Creating... 2026-01-06 00:02:32.715616 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creation complete after 0s [id=54d2d9f3-d814-45be-a27a-423a5e5714f8] 2026-01-06 00:02:32.726829 | orchestrator | openstack_networking_port_v2.node_port_management[1]: Creating... 2026-01-06 00:02:33.156496 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creation complete after 0s [id=b8531241-09eb-4332-80a5-48124349a717] 2026-01-06 00:02:33.163065 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creating... 
2026-01-06 00:02:33.255768 | orchestrator | openstack_networking_port_v2.node_port_management[5]: Creation complete after 0s [id=be9edca2-dc5a-4515-864d-9930325f2d2b] 2026-01-06 00:02:33.258767 | orchestrator | openstack_networking_port_v2.node_port_management[0]: Creation complete after 0s [id=01b61747-5c06-46f4-b949-ac0c8952d99d] 2026-01-06 00:02:33.262519 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creating... 2026-01-06 00:02:33.264721 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creating... 2026-01-06 00:02:33.348738 | orchestrator | openstack_networking_port_v2.node_port_management[3]: Creation complete after 0s [id=775de5b0-d0a8-4e00-a356-ce0b05a373fa] 2026-01-06 00:02:33.355102 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creating... 2026-01-06 00:02:33.443871 | orchestrator | openstack_networking_port_v2.node_port_management[4]: Creation complete after 0s [id=03f067b1-7721-4c19-b506-3a7648cef086] 2026-01-06 00:02:33.449647 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creating... 2026-01-06 00:02:33.473353 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creation complete after 0s [id=c1943424-2b85-4400-993c-b6b15ad216b1] 2026-01-06 00:02:33.481437 | orchestrator | openstack_networking_port_v2.manager_port_management: Creating... 
2026-01-06 00:02:33.677122 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creation complete after 1s [id=0256e293-f1f2-4b83-a15b-481904f28551] 2026-01-06 00:02:33.900869 | orchestrator | openstack_networking_port_v2.node_port_management[2]: Creation complete after 1s [id=b6541ad3-79a3-4403-a4cd-a0fbf6c90b39] 2026-01-06 00:02:33.988863 | orchestrator | openstack_networking_port_v2.manager_port_management: Creation complete after 1s [id=5b6da3b4-63c8-417b-a706-ce0373deb712] 2026-01-06 00:02:34.095209 | orchestrator | openstack_networking_port_v2.node_port_management[1]: Creation complete after 1s [id=c312a59d-46f5-4dd5-b5ba-4c4c85200cba] 2026-01-06 00:02:34.322697 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creation complete after 1s [id=82451e1e-63ee-41d2-bfa2-1b5f77bc0cf8] 2026-01-06 00:02:34.484353 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creation complete after 1s [id=ff69f5b4-eda0-476d-8b57-3de8d2c3d285] 2026-01-06 00:02:34.745427 | orchestrator | openstack_networking_router_interface_v2.router_interface: Creation complete after 3s [id=91964bd1-9ce5-4583-8942-4822b20c5502] 2026-01-06 00:02:34.768587 | orchestrator | openstack_networking_floatingip_v2.manager_floating_ip: Creating... 2026-01-06 00:02:34.779665 | orchestrator | openstack_compute_instance_v2.node_server[4]: Creating... 2026-01-06 00:02:34.780566 | orchestrator | openstack_compute_instance_v2.node_server[5]: Creating... 2026-01-06 00:02:34.783711 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creation complete after 2s [id=440dcc4c-cc4f-434e-bc66-f77a196770a0] 2026-01-06 00:02:34.783817 | orchestrator | openstack_compute_instance_v2.node_server[1]: Creating... 2026-01-06 00:02:34.784474 | orchestrator | openstack_compute_instance_v2.node_server[3]: Creating... 
2026-01-06 00:02:34.793244 | orchestrator | openstack_compute_instance_v2.node_server[0]: Creating... 2026-01-06 00:02:34.804814 | orchestrator | openstack_compute_instance_v2.node_server[2]: Creating... 2026-01-06 00:02:34.965942 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creation complete after 2s [id=2c3a87f9-9537-4c78-a911-8bcd6bf37ee1] 2026-01-06 00:02:35.157740 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creation complete after 2s [id=0c2b3226-abd0-46c6-af78-91283c8e3249] 2026-01-06 00:02:38.208582 | orchestrator | openstack_networking_floatingip_v2.manager_floating_ip: Creation complete after 3s [id=43b98060-f345-46a7-a02a-7ce80c2e7d8b] 2026-01-06 00:02:38.217989 | orchestrator | openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creating... 2026-01-06 00:02:38.227059 | orchestrator | local_file.MANAGER_ADDRESS: Creating... 2026-01-06 00:02:38.233047 | orchestrator | local_file.MANAGER_ADDRESS: Creation complete after 0s [id=17d1a1607290eee758ea92d560feb33ec96ead31] 2026-01-06 00:02:38.233764 | orchestrator | local_file.inventory: Creating... 2026-01-06 00:02:38.239284 | orchestrator | local_file.inventory: Creation complete after 0s [id=bac132a5e6e11eb2ec709cd02122b160d1ed139c] 2026-01-06 00:02:39.813966 | orchestrator | openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creation complete after 2s [id=43b98060-f345-46a7-a02a-7ce80c2e7d8b] 2026-01-06 00:02:44.785094 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [10s elapsed] 2026-01-06 00:02:44.785217 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [10s elapsed] 2026-01-06 00:02:44.785240 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... [10s elapsed] 2026-01-06 00:02:44.787529 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... 
[10s elapsed] 2026-01-06 00:02:44.800939 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [10s elapsed] 2026-01-06 00:02:44.805102 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... [10s elapsed] 2026-01-06 00:02:54.792999 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [20s elapsed] 2026-01-06 00:02:54.793177 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [20s elapsed] 2026-01-06 00:02:54.793207 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [20s elapsed] 2026-01-06 00:02:54.793220 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... [20s elapsed] 2026-01-06 00:02:54.801385 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [20s elapsed] 2026-01-06 00:02:54.805830 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... [20s elapsed] 2026-01-06 00:03:04.800910 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [30s elapsed] 2026-01-06 00:03:04.801101 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [30s elapsed] 2026-01-06 00:03:04.801124 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... [30s elapsed] 2026-01-06 00:03:04.801144 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [30s elapsed] 2026-01-06 00:03:04.802289 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [30s elapsed] 2026-01-06 00:03:04.806621 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... 
[30s elapsed] 2026-01-06 00:03:06.328591 | orchestrator | openstack_compute_instance_v2.node_server[1]: Creation complete after 31s [id=75069817-7c8f-4b65-bf20-19bd1efc572f] 2026-01-06 00:03:06.368934 | orchestrator | openstack_compute_instance_v2.node_server[4]: Creation complete after 31s [id=eaacb4dc-c49d-4e57-86c9-d6e20c77c1fe] 2026-01-06 00:03:06.549789 | orchestrator | openstack_compute_instance_v2.node_server[0]: Creation complete after 32s [id=1b7a4b4b-02f0-49fd-812e-ef32a9b99d15] 2026-01-06 00:03:06.698727 | orchestrator | openstack_compute_instance_v2.node_server[2]: Creation complete after 32s [id=6f0f18cb-74d1-40a8-bcde-2504dd889323] 2026-01-06 00:03:07.267948 | orchestrator | openstack_compute_instance_v2.node_server[3]: Creation complete after 32s [id=4f3376d9-d133-4419-88e1-35fdc4139926] 2026-01-06 00:03:14.801570 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [40s elapsed] 2026-01-06 00:03:17.385619 | orchestrator | openstack_compute_instance_v2.node_server[5]: Creation complete after 42s [id=b51305d2-db5e-43bb-a2b1-4164598628b5] 2026-01-06 00:03:17.416200 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creating... 2026-01-06 00:03:17.416615 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creating... 2026-01-06 00:03:17.425283 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creating... 2026-01-06 00:03:17.426316 | orchestrator | null_resource.node_semaphore: Creating... 2026-01-06 00:03:17.428459 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creating... 2026-01-06 00:03:17.435662 | orchestrator | null_resource.node_semaphore: Creation complete after 0s [id=7414346611480315218] 2026-01-06 00:03:17.436639 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creating... 
2026-01-06 00:03:17.449887 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creating... 2026-01-06 00:03:17.450940 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creating... 2026-01-06 00:03:17.458856 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creating... 2026-01-06 00:03:17.466827 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creating... 2026-01-06 00:03:17.470670 | orchestrator | openstack_compute_instance_v2.manager_server: Creating... 2026-01-06 00:03:20.841558 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creation complete after 4s [id=4f3376d9-d133-4419-88e1-35fdc4139926/d326b17f-2106-48eb-aaa2-fe8346fab088] 2026-01-06 00:03:20.846679 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creation complete after 4s [id=eaacb4dc-c49d-4e57-86c9-d6e20c77c1fe/ea69e1b5-a504-41c3-bb3a-5961a07ea8a6] 2026-01-06 00:03:21.016829 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creation complete after 4s [id=b51305d2-db5e-43bb-a2b1-4164598628b5/f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59] 2026-01-06 00:03:21.132434 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creation complete after 4s [id=b51305d2-db5e-43bb-a2b1-4164598628b5/2e071fd2-3317-4a54-af1f-e9b7971267a3] 2026-01-06 00:03:22.290825 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creation complete after 5s [id=eaacb4dc-c49d-4e57-86c9-d6e20c77c1fe/8cc5ffc1-09fb-4fde-a97f-bcebb46dacb6] 2026-01-06 00:03:22.325450 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creation complete after 5s [id=4f3376d9-d133-4419-88e1-35fdc4139926/3d039a44-dced-4ba6-a79b-af7290a238ac] 2026-01-06 00:03:22.364818 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creation complete after 5s 
[id=eaacb4dc-c49d-4e57-86c9-d6e20c77c1fe/724a4878-ca4e-4a20-84cd-e8427809d585] 2026-01-06 00:03:22.389910 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creation complete after 5s [id=4f3376d9-d133-4419-88e1-35fdc4139926/dc9d4d24-a01d-4baf-85b5-da8c88609604] 2026-01-06 00:03:27.333197 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creation complete after 10s [id=b51305d2-db5e-43bb-a2b1-4164598628b5/a9899c49-22e0-485a-be63-69bc9e218eb5] 2026-01-06 00:03:27.474719 | orchestrator | openstack_compute_instance_v2.manager_server: Still creating... [10s elapsed] 2026-01-06 00:03:37.484007 | orchestrator | openstack_compute_instance_v2.manager_server: Still creating... [20s elapsed] 2026-01-06 00:03:38.063260 | orchestrator | openstack_compute_instance_v2.manager_server: Creation complete after 21s [id=84f7ef25-a673-499a-8e6c-d5da670bc1a4] 2026-01-06 00:03:38.075099 | orchestrator | 2026-01-06 00:03:38.075178 | orchestrator | Apply complete! Resources: 64 added, 0 changed, 0 destroyed. 
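The run ends with Terraform's human-readable summaries ("Plan: 64 to add, …" and "Apply complete! Resources: 64 added, …"). When post-processing console logs like this one, those counts can be extracted with a small parser; this is a sketch against the summary format seen above, not an official Terraform API (machine-readable output would normally come from `terraform show -json`):

```python
import re

# Regexes match the summary lines exactly as they appear in the log above.
PLAN_RE = re.compile(r"Plan: (\d+) to add, (\d+) to change, (\d+) to destroy\.")
APPLY_RE = re.compile(r"Apply complete! Resources: (\d+) added, (\d+) changed, (\d+) destroyed\.")

def summary_counts(line):
    """Return (add, change, destroy) for a plan/apply summary line, else None."""
    m = PLAN_RE.search(line) or APPLY_RE.search(line)
    return tuple(int(g) for g in m.groups()) if m else None

print(summary_counts("Plan: 64 to add, 0 to change, 0 to destroy."))
# → (64, 0, 0)
print(summary_counts("Apply complete! Resources: 64 added, 0 changed, 0 destroyed."))
# → (64, 0, 0)
```

A mismatch between the plan and apply counts would flag a partially failed run, which is why both lines are worth parsing.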
2026-01-06 00:03:38.075189 | orchestrator | 2026-01-06 00:03:38.075194 | orchestrator | Outputs: 2026-01-06 00:03:38.075198 | orchestrator | 2026-01-06 00:03:38.075202 | orchestrator | manager_address = 2026-01-06 00:03:38.075206 | orchestrator | private_key = 2026-01-06 00:03:38.427241 | orchestrator | ok: Runtime: 0:01:21.306638 2026-01-06 00:03:38.458922 | 2026-01-06 00:03:38.459072 | TASK [Create infrastructure (stable)] 2026-01-06 00:03:38.992091 | orchestrator | skipping: Conditional result was False 2026-01-06 00:03:39.021280 | 2026-01-06 00:03:39.021546 | TASK [Fetch manager address] 2026-01-06 00:03:39.543263 | orchestrator | ok 2026-01-06 00:03:39.551842 | 2026-01-06 00:03:39.551973 | TASK [Set manager_host address] 2026-01-06 00:03:39.635075 | orchestrator | ok 2026-01-06 00:03:39.647562 | 2026-01-06 00:03:39.647817 | LOOP [Update ansible collections] 2026-01-06 00:03:40.883647 | orchestrator | [WARNING]: Collection osism.services does not support Ansible version 2.15.2 2026-01-06 00:03:40.884124 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2026-01-06 00:03:40.884182 | orchestrator | Starting galaxy collection install process 2026-01-06 00:03:40.884218 | orchestrator | Process install dependency map 2026-01-06 00:03:40.884249 | orchestrator | Starting collection install process 2026-01-06 00:03:40.884278 | orchestrator | Installing 'osism.commons:999.0.0' to '/home/zuul-testbed06/.ansible/collections/ansible_collections/osism/commons' 2026-01-06 00:03:40.884314 | orchestrator | Created collection for osism.commons:999.0.0 at /home/zuul-testbed06/.ansible/collections/ansible_collections/osism/commons 2026-01-06 00:03:40.884358 | orchestrator | osism.commons:999.0.0 was installed successfully 2026-01-06 00:03:40.884427 | orchestrator | ok: Item: commons Runtime: 0:00:00.855325 2026-01-06 00:03:41.990935 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2026-01-06 
00:03:41.991366 | orchestrator | [WARNING]: Collection osism.services does not support Ansible version 2.15.2 2026-01-06 00:03:41.991615 | orchestrator | Starting galaxy collection install process 2026-01-06 00:03:41.991739 | orchestrator | Process install dependency map 2026-01-06 00:03:41.991815 | orchestrator | Starting collection install process 2026-01-06 00:03:41.991951 | orchestrator | Installing 'osism.services:999.0.0' to '/home/zuul-testbed06/.ansible/collections/ansible_collections/osism/services' 2026-01-06 00:03:41.992020 | orchestrator | Created collection for osism.services:999.0.0 at /home/zuul-testbed06/.ansible/collections/ansible_collections/osism/services 2026-01-06 00:03:41.992130 | orchestrator | osism.services:999.0.0 was installed successfully 2026-01-06 00:03:41.992251 | orchestrator | ok: Item: services Runtime: 0:00:00.817404 2026-01-06 00:03:42.020569 | 2026-01-06 00:03:42.020878 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"] 2026-01-06 00:03:52.670974 | orchestrator | ok 2026-01-06 00:03:52.680153 | 2026-01-06 00:03:52.680286 | TASK [Wait a little longer for the manager so that everything is ready] 2026-01-06 00:04:52.730739 | orchestrator | ok 2026-01-06 00:04:52.741101 | 2026-01-06 00:04:52.741221 | TASK [Fetch manager ssh hostkey] 2026-01-06 00:04:54.345144 | orchestrator | Output suppressed because no_log was given 2026-01-06 00:04:54.361170 | 2026-01-06 00:04:54.361379 | TASK [Get ssh keypair from terraform environment] 2026-01-06 00:04:54.907258 | orchestrator | ok: Runtime: 0:00:00.008154 2026-01-06 00:04:54.923355 | 2026-01-06 00:04:54.923513 | TASK [Point out that the following task takes some time and does not give any output] 2026-01-06 00:04:54.967627 | orchestrator | ok: The task 'Run manager part 0' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minutes for this task to complete. 
2026-01-06 00:04:54.975937 | 2026-01-06 00:04:54.976118 | TASK [Run manager part 0] 2026-01-06 00:04:55.995340 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2026-01-06 00:04:56.065953 | orchestrator | 2026-01-06 00:04:56.066011 | orchestrator | PLAY [Wait for cloud-init to finish] ******************************************* 2026-01-06 00:04:56.066057 | orchestrator | 2026-01-06 00:04:56.066075 | orchestrator | TASK [Check /var/lib/cloud/instance/boot-finished] ***************************** 2026-01-06 00:04:57.983927 | orchestrator | ok: [testbed-manager] 2026-01-06 00:04:57.984017 | orchestrator | 2026-01-06 00:04:57.984049 | orchestrator | PLAY [Run manager part 0] ****************************************************** 2026-01-06 00:04:57.984062 | orchestrator | 2026-01-06 00:04:57.984073 | orchestrator | TASK [Gathering Facts] ********************************************************* 2026-01-06 00:04:59.968678 | orchestrator | ok: [testbed-manager] 2026-01-06 00:04:59.968728 | orchestrator | 2026-01-06 00:04:59.968739 | orchestrator | TASK [Get home directory of ansible user] ************************************** 2026-01-06 00:05:00.599623 | orchestrator | ok: [testbed-manager] 2026-01-06 00:05:00.599668 | orchestrator | 2026-01-06 00:05:00.599675 | orchestrator | TASK [Set repo_path fact] ****************************************************** 2026-01-06 00:05:00.640240 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:05:00.640283 | orchestrator | 2026-01-06 00:05:00.640292 | orchestrator | TASK [Update package cache] **************************************************** 2026-01-06 00:05:00.667265 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:05:00.667314 | orchestrator | 2026-01-06 00:05:00.667322 | orchestrator | TASK [Install required packages] *********************************************** 2026-01-06 00:05:00.695176 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:05:00.695228 | 
orchestrator | 2026-01-06 00:05:00.695236 | orchestrator | TASK [Remove some python packages] ********************************************* 2026-01-06 00:05:00.725230 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:05:00.725281 | orchestrator | 2026-01-06 00:05:00.725289 | orchestrator | TASK [Set venv_command fact (RedHat)] ****************************************** 2026-01-06 00:05:00.753005 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:05:00.753059 | orchestrator | 2026-01-06 00:05:00.753072 | orchestrator | TASK [Fail if Ubuntu version is lower than 24.04] ****************************** 2026-01-06 00:05:00.780291 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:05:00.780340 | orchestrator | 2026-01-06 00:05:00.780351 | orchestrator | TASK [Fail if Debian version is lower than 12] ********************************* 2026-01-06 00:05:00.807131 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:05:00.807174 | orchestrator | 2026-01-06 00:05:00.807181 | orchestrator | TASK [Set APT options on manager] ********************************************** 2026-01-06 00:05:01.568623 | orchestrator | changed: [testbed-manager] 2026-01-06 00:05:01.568695 | orchestrator | 2026-01-06 00:05:01.568707 | orchestrator | TASK [Update APT cache and run dist-upgrade] *********************************** 2026-01-06 00:07:51.152109 | orchestrator | changed: [testbed-manager] 2026-01-06 00:07:51.152221 | orchestrator | 2026-01-06 00:07:51.152251 | orchestrator | TASK [Install HWE kernel package on Ubuntu] ************************************ 2026-01-06 00:09:15.962504 | orchestrator | changed: [testbed-manager] 2026-01-06 00:09:15.962577 | orchestrator | 2026-01-06 00:09:15.962594 | orchestrator | TASK [Install required packages] *********************************************** 2026-01-06 00:09:42.037152 | orchestrator | changed: [testbed-manager] 2026-01-06 00:09:42.037234 | orchestrator | 2026-01-06 00:09:42.037254 | orchestrator | TASK [Remove 
some python packages] ********************************************* 2026-01-06 00:09:53.189514 | orchestrator | changed: [testbed-manager] 2026-01-06 00:09:53.189619 | orchestrator | 2026-01-06 00:09:53.189693 | orchestrator | TASK [Set venv_command fact (Debian)] ****************************************** 2026-01-06 00:09:53.240391 | orchestrator | ok: [testbed-manager] 2026-01-06 00:09:53.240484 | orchestrator | 2026-01-06 00:09:53.240507 | orchestrator | TASK [Get current user] ******************************************************** 2026-01-06 00:09:54.135198 | orchestrator | ok: [testbed-manager] 2026-01-06 00:09:54.135446 | orchestrator | 2026-01-06 00:09:54.135616 | orchestrator | TASK [Create venv directory] *************************************************** 2026-01-06 00:09:54.969834 | orchestrator | changed: [testbed-manager] 2026-01-06 00:09:54.970774 | orchestrator | 2026-01-06 00:09:54.970819 | orchestrator | TASK [Install netaddr in venv] ************************************************* 2026-01-06 00:10:01.441189 | orchestrator | changed: [testbed-manager] 2026-01-06 00:10:01.441254 | orchestrator | 2026-01-06 00:10:01.441296 | orchestrator | TASK [Install ansible-core in venv] ******************************************** 2026-01-06 00:10:08.253643 | orchestrator | changed: [testbed-manager] 2026-01-06 00:10:08.253757 | orchestrator | 2026-01-06 00:10:08.253776 | orchestrator | TASK [Install requests >= 2.32.2] ********************************************** 2026-01-06 00:10:10.956713 | orchestrator | changed: [testbed-manager] 2026-01-06 00:10:10.956783 | orchestrator | 2026-01-06 00:10:10.956800 | orchestrator | TASK [Install docker >= 7.1.0] ************************************************* 2026-01-06 00:10:12.793174 | orchestrator | changed: [testbed-manager] 2026-01-06 00:10:12.793219 | orchestrator | 2026-01-06 00:10:12.793230 | orchestrator | TASK [Create directories in /opt/src] ****************************************** 2026-01-06 
00:10:13.945317 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons) 2026-01-06 00:10:13.945349 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services) 2026-01-06 00:10:13.945355 | orchestrator | 2026-01-06 00:10:13.945360 | orchestrator | TASK [Sync sources in /opt/src] ************************************************ 2026-01-06 00:10:13.988083 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call 2026-01-06 00:10:13.988143 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version 2026-01-06 00:10:13.988154 | orchestrator | 2.19. Deprecation warnings can be disabled by setting 2026-01-06 00:10:13.988163 | orchestrator | deprecation_warnings=False in ansible.cfg. 2026-01-06 00:10:20.992527 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons) 2026-01-06 00:10:20.992625 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services) 2026-01-06 00:10:20.992673 | orchestrator | 2026-01-06 00:10:20.992690 | orchestrator | TASK [Create /usr/share/ansible directory] ************************************* 2026-01-06 00:10:21.577781 | orchestrator | changed: [testbed-manager] 2026-01-06 00:10:21.577894 | orchestrator | 2026-01-06 00:10:21.577920 | orchestrator | TASK [Install collections from Ansible galaxy] ********************************* 2026-01-06 00:10:44.652993 | orchestrator | changed: [testbed-manager] => (item=ansible.netcommon) 2026-01-06 00:10:44.653049 | orchestrator | changed: [testbed-manager] => (item=ansible.posix) 2026-01-06 00:10:44.653057 | orchestrator | changed: [testbed-manager] => (item=community.docker>=3.10.2) 2026-01-06 00:10:44.653062 | orchestrator | 2026-01-06 00:10:44.653067 | orchestrator | TASK [Install local collections] *********************************************** 2026-01-06 00:10:47.075854 | orchestrator | changed: [testbed-manager] => 
(item=ansible-collection-commons) 2026-01-06 00:10:47.075891 | orchestrator | changed: [testbed-manager] => (item=ansible-collection-services) 2026-01-06 00:10:47.075896 | orchestrator | 2026-01-06 00:10:47.075901 | orchestrator | PLAY [Create operator user] **************************************************** 2026-01-06 00:10:47.075907 | orchestrator | 2026-01-06 00:10:47.075911 | orchestrator | TASK [Gathering Facts] ********************************************************* 2026-01-06 00:10:48.437705 | orchestrator | ok: [testbed-manager] 2026-01-06 00:10:48.437803 | orchestrator | 2026-01-06 00:10:48.437822 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] ***** 2026-01-06 00:10:48.494523 | orchestrator | ok: [testbed-manager] 2026-01-06 00:10:48.494666 | orchestrator | 2026-01-06 00:10:48.494685 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] *** 2026-01-06 00:10:48.574201 | orchestrator | ok: [testbed-manager] 2026-01-06 00:10:48.574310 | orchestrator | 2026-01-06 00:10:48.574327 | orchestrator | TASK [osism.commons.operator : Create operator group] ************************** 2026-01-06 00:10:49.403099 | orchestrator | changed: [testbed-manager] 2026-01-06 00:10:49.403173 | orchestrator | 2026-01-06 00:10:49.403184 | orchestrator | TASK [osism.commons.operator : Create user] ************************************ 2026-01-06 00:10:50.164764 | orchestrator | changed: [testbed-manager] 2026-01-06 00:10:50.164843 | orchestrator | 2026-01-06 00:10:50.164852 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ****************** 2026-01-06 00:10:51.572213 | orchestrator | changed: [testbed-manager] => (item=adm) 2026-01-06 00:10:51.572313 | orchestrator | changed: [testbed-manager] => (item=sudo) 2026-01-06 00:10:51.572328 | orchestrator | 2026-01-06 00:10:51.572361 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] 
************************* 2026-01-06 00:10:53.042203 | orchestrator | changed: [testbed-manager] 2026-01-06 00:10:53.042284 | orchestrator | 2026-01-06 00:10:53.042294 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc configuration file] *** 2026-01-06 00:10:54.839981 | orchestrator | changed: [testbed-manager] => (item=export LANGUAGE=C.UTF-8) 2026-01-06 00:10:54.840073 | orchestrator | changed: [testbed-manager] => (item=export LANG=C.UTF-8) 2026-01-06 00:10:54.840084 | orchestrator | changed: [testbed-manager] => (item=export LC_ALL=C.UTF-8) 2026-01-06 00:10:54.840093 | orchestrator | 2026-01-06 00:10:54.840104 | orchestrator | TASK [osism.commons.operator : Set custom environment variables in .bashrc configuration file] *** 2026-01-06 00:10:54.900390 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:10:54.900521 | orchestrator | 2026-01-06 00:10:54.900548 | orchestrator | TASK [osism.commons.operator : Set custom PS1 prompt in .bashrc configuration file] *** 2026-01-06 00:10:54.998514 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:10:54.998590 | orchestrator | 2026-01-06 00:10:54.998603 | orchestrator | TASK [osism.commons.operator : Create .ssh directory] ************************** 2026-01-06 00:10:55.556273 | orchestrator | changed: [testbed-manager] 2026-01-06 00:10:55.556316 | orchestrator | 2026-01-06 00:10:55.556324 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************ 2026-01-06 00:10:55.632931 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:10:55.632977 | orchestrator | 2026-01-06 00:10:55.632985 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************ 2026-01-06 00:10:56.531077 | orchestrator | changed: [testbed-manager] => (item=None) 2026-01-06 00:10:56.531177 | orchestrator | changed: [testbed-manager] 2026-01-06 00:10:56.531194 | orchestrator | 2026-01-06 00:10:56.531206 | orchestrator | TASK 
[osism.commons.operator : Delete ssh authorized keys] ********************* 2026-01-06 00:10:56.575134 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:10:56.575204 | orchestrator | 2026-01-06 00:10:56.575212 | orchestrator | TASK [osism.commons.operator : Set authorized GitHub accounts] ***************** 2026-01-06 00:10:56.619474 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:10:56.619666 | orchestrator | 2026-01-06 00:10:56.619676 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] ************** 2026-01-06 00:10:56.660005 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:10:56.660108 | orchestrator | 2026-01-06 00:10:56.660128 | orchestrator | TASK [osism.commons.operator : Set password] *********************************** 2026-01-06 00:10:56.736926 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:10:56.737042 | orchestrator | 2026-01-06 00:10:56.737061 | orchestrator | TASK [osism.commons.operator : Unset & lock password] ************************** 2026-01-06 00:10:57.472761 | orchestrator | ok: [testbed-manager] 2026-01-06 00:10:57.472862 | orchestrator | 2026-01-06 00:10:57.472878 | orchestrator | PLAY [Run manager part 0] ****************************************************** 2026-01-06 00:10:57.472891 | orchestrator | 2026-01-06 00:10:57.472903 | orchestrator | TASK [Gathering Facts] ********************************************************* 2026-01-06 00:10:58.889520 | orchestrator | ok: [testbed-manager] 2026-01-06 00:10:58.889688 | orchestrator | 2026-01-06 00:10:58.889721 | orchestrator | TASK [Recursively change ownership of /opt/venv] ******************************* 2026-01-06 00:10:59.870819 | orchestrator | changed: [testbed-manager] 2026-01-06 00:10:59.871031 | orchestrator | 2026-01-06 00:10:59.871049 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:10:59.871062 | orchestrator | testbed-manager : ok=33 changed=23 
unreachable=0 failed=0 skipped=14 rescued=0 ignored=0 2026-01-06 00:10:59.871073 | orchestrator | 2026-01-06 00:11:00.264097 | orchestrator | ok: Runtime: 0:06:04.688138 2026-01-06 00:11:00.282527 | 2026-01-06 00:11:00.282692 | TASK [Point out that logging in to the manager is now possible] 2026-01-06 00:11:00.319736 | orchestrator | ok: It is now possible to log in to the manager with 'make login'. 2026-01-06 00:11:00.328999 | 2026-01-06 00:11:00.329162 | TASK [Point out that the following task takes some time and does not give any output] 2026-01-06 00:11:00.368707 | orchestrator | ok: The task 'Run manager part 1 + 2' runs an Ansible playbook on the manager. There is no further output from it here. It takes a few minutes for this task to complete. 2026-01-06 00:11:00.379056 | 2026-01-06 00:11:00.379211 | TASK [Run manager part 1] 2026-01-06 00:11:02.078882 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2026-01-06 00:11:02.140214 | orchestrator | 2026-01-06 00:11:02.140332 | orchestrator | PLAY [Run manager part 1] ****************************************************** 2026-01-06 00:11:02.140351 | orchestrator | 2026-01-06 00:11:02.140404 | orchestrator | TASK [Gathering Facts] ********************************************************* 2026-01-06 00:11:04.727412 | orchestrator | ok: [testbed-manager] 2026-01-06 00:11:04.727614 | orchestrator | 2026-01-06 00:11:04.727699 | orchestrator | TASK [Set venv_command fact (RedHat)] ****************************************** 2026-01-06 00:11:04.774076 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:11:04.774144 | orchestrator | 2026-01-06 00:11:04.774158 | orchestrator | TASK [Set venv_command fact (Debian)] ****************************************** 2026-01-06 00:11:04.811187 | orchestrator | ok: [testbed-manager] 2026-01-06 00:11:04.811300 | orchestrator | 2026-01-06 00:11:04.811327 | orchestrator | TASK [osism.commons.repository : Gather variables for
each operating system] *** 2026-01-06 00:11:04.862230 | orchestrator | ok: [testbed-manager] 2026-01-06 00:11:04.862326 | orchestrator | 2026-01-06 00:11:04.862347 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] *** 2026-01-06 00:11:04.948864 | orchestrator | ok: [testbed-manager] 2026-01-06 00:11:04.948952 | orchestrator | 2026-01-06 00:11:04.948971 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ****************** 2026-01-06 00:11:05.030995 | orchestrator | ok: [testbed-manager] 2026-01-06 00:11:05.031084 | orchestrator | 2026-01-06 00:11:05.031102 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2026-01-06 00:11:05.099363 | orchestrator | included: /home/zuul-testbed06/.ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager 2026-01-06 00:11:05.099452 | orchestrator | 2026-01-06 00:11:05.099467 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2026-01-06 00:11:05.872186 | orchestrator | ok: [testbed-manager] 2026-01-06 00:11:05.872261 | orchestrator | 2026-01-06 00:11:05.872276 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2026-01-06 00:11:05.933823 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:11:05.933891 | orchestrator | 2026-01-06 00:11:05.933900 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2026-01-06 00:11:07.372564 | orchestrator | changed: [testbed-manager] 2026-01-06 00:11:07.372673 | orchestrator | 2026-01-06 00:11:07.372693 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] ********************* 2026-01-06 00:11:07.958600 | orchestrator | ok: [testbed-manager] 2026-01-06 00:11:07.958745 | orchestrator | 2026-01-06 00:11:07.958762 | orchestrator | TASK [osism.commons.repository : Copy 
ubuntu.sources file] ********************* 2026-01-06 00:11:09.130356 | orchestrator | changed: [testbed-manager] 2026-01-06 00:11:09.130453 | orchestrator | 2026-01-06 00:11:09.130472 | orchestrator | TASK [osism.commons.repository : Update package cache] ************************* 2026-01-06 00:11:25.428092 | orchestrator | changed: [testbed-manager] 2026-01-06 00:11:25.428184 | orchestrator | 2026-01-06 00:11:25.428199 | orchestrator | TASK [Get home directory of ansible user] ************************************** 2026-01-06 00:11:26.116357 | orchestrator | ok: [testbed-manager] 2026-01-06 00:11:26.116435 | orchestrator | 2026-01-06 00:11:26.116448 | orchestrator | TASK [Set repo_path fact] ****************************************************** 2026-01-06 00:11:26.169946 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:11:26.170066 | orchestrator | 2026-01-06 00:11:26.170084 | orchestrator | TASK [Copy SSH public key] ***************************************************** 2026-01-06 00:11:27.135019 | orchestrator | changed: [testbed-manager] 2026-01-06 00:11:27.135115 | orchestrator | 2026-01-06 00:11:27.135131 | orchestrator | TASK [Copy SSH private key] **************************************************** 2026-01-06 00:11:28.105356 | orchestrator | changed: [testbed-manager] 2026-01-06 00:11:28.105422 | orchestrator | 2026-01-06 00:11:28.105429 | orchestrator | TASK [Create configuration directory] ****************************************** 2026-01-06 00:11:28.711512 | orchestrator | changed: [testbed-manager] 2026-01-06 00:11:28.711607 | orchestrator | 2026-01-06 00:11:28.711615 | orchestrator | TASK [Copy testbed repo] ******************************************************* 2026-01-06 00:11:28.762107 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call 2026-01-06 00:11:28.762193 | orchestrator | display.prompt_until(msg) instead. 
This feature will be removed in version 2026-01-06 00:11:28.762199 | orchestrator | 2.19. Deprecation warnings can be disabled by setting 2026-01-06 00:11:28.762205 | orchestrator | deprecation_warnings=False in ansible.cfg. 2026-01-06 00:11:31.722932 | orchestrator | changed: [testbed-manager] 2026-01-06 00:11:31.722978 | orchestrator | 2026-01-06 00:11:31.722985 | orchestrator | TASK [Install python requirements in venv] ************************************* 2026-01-06 00:11:40.739335 | orchestrator | ok: [testbed-manager] => (item=Jinja2) 2026-01-06 00:11:40.739437 | orchestrator | ok: [testbed-manager] => (item=PyYAML) 2026-01-06 00:11:40.739455 | orchestrator | ok: [testbed-manager] => (item=packaging) 2026-01-06 00:11:40.739468 | orchestrator | changed: [testbed-manager] => (item=python-gilt==1.2.3) 2026-01-06 00:11:40.739490 | orchestrator | ok: [testbed-manager] => (item=requests>=2.32.2) 2026-01-06 00:11:40.739502 | orchestrator | ok: [testbed-manager] => (item=docker>=7.1.0) 2026-01-06 00:11:40.739514 | orchestrator | 2026-01-06 00:11:40.739526 | orchestrator | TASK [Copy testbed custom CA certificate on Debian/Ubuntu] ********************* 2026-01-06 00:11:41.803231 | orchestrator | changed: [testbed-manager] 2026-01-06 00:11:41.803449 | orchestrator | 2026-01-06 00:11:41.803465 | orchestrator | TASK [Copy testbed custom CA certificate on CentOS] **************************** 2026-01-06 00:11:41.840827 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:11:41.840981 | orchestrator | 2026-01-06 00:11:41.840997 | orchestrator | TASK [Run update-ca-certificates on Debian/Ubuntu] ***************************** 2026-01-06 00:11:45.038447 | orchestrator | changed: [testbed-manager] 2026-01-06 00:11:45.038536 | orchestrator | 2026-01-06 00:11:45.038551 | orchestrator | TASK [Run update-ca-trust on RedHat] ******************************************* 2026-01-06 00:11:45.078140 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:11:45.078270 | 
orchestrator | 2026-01-06 00:11:45.078298 | orchestrator | TASK [Run manager part 2] ****************************************************** 2026-01-06 00:13:33.462592 | orchestrator | changed: [testbed-manager] 2026-01-06 00:13:33.462727 | orchestrator | 2026-01-06 00:13:33.462747 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] ***** 2026-01-06 00:13:34.653786 | orchestrator | ok: [testbed-manager] 2026-01-06 00:13:34.653893 | orchestrator | 2026-01-06 00:13:34.653911 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:13:34.653927 | orchestrator | testbed-manager : ok=21 changed=11 unreachable=0 failed=0 skipped=5 rescued=0 ignored=0 2026-01-06 00:13:34.653942 | orchestrator | 2026-01-06 00:13:35.023569 | orchestrator | ok: Runtime: 0:02:34.047550 2026-01-06 00:13:35.049582 | 2026-01-06 00:13:35.049847 | TASK [Reboot manager] 2026-01-06 00:13:36.593684 | orchestrator | ok: Runtime: 0:00:00.976932 2026-01-06 00:13:36.609002 | 2026-01-06 00:13:36.609179 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"] 2026-01-06 00:13:53.039146 | orchestrator | ok 2026-01-06 00:13:53.047011 | 2026-01-06 00:13:53.047161 | TASK [Wait a little longer for the manager so that everything is ready] 2026-01-06 00:14:53.094097 | orchestrator | ok 2026-01-06 00:14:53.106315 | 2026-01-06 00:14:53.106499 | TASK [Deploy manager + bootstrap nodes] 2026-01-06 00:14:55.570685 | orchestrator | 2026-01-06 00:14:55.570933 | orchestrator | # DEPLOY MANAGER 2026-01-06 00:14:55.570958 | orchestrator | 2026-01-06 00:14:55.570973 | orchestrator | + set -e 2026-01-06 00:14:55.570986 | orchestrator | + echo 2026-01-06 00:14:55.571001 | orchestrator | + echo '# DEPLOY MANAGER' 2026-01-06 00:14:55.571019 | orchestrator | + echo 2026-01-06 00:14:55.571069 | orchestrator | + cat /opt/manager-vars.sh 2026-01-06 00:14:55.575754 | orchestrator | export NUMBER_OF_NODES=6 2026-01-06 
00:14:55.575801 | orchestrator | 2026-01-06 00:14:55.575823 | orchestrator | export CEPH_VERSION=reef 2026-01-06 00:14:55.575838 | orchestrator | export CONFIGURATION_VERSION=main 2026-01-06 00:14:55.575851 | orchestrator | export MANAGER_VERSION=latest 2026-01-06 00:14:55.575876 | orchestrator | export OPENSTACK_VERSION=2025.1 2026-01-06 00:14:55.575887 | orchestrator | 2026-01-06 00:14:55.575906 | orchestrator | export ARA=false 2026-01-06 00:14:55.575917 | orchestrator | export DEPLOY_MODE=manager 2026-01-06 00:14:55.575935 | orchestrator | export TEMPEST=true 2026-01-06 00:14:55.575947 | orchestrator | export IS_ZUUL=true 2026-01-06 00:14:55.575958 | orchestrator | 2026-01-06 00:14:55.575976 | orchestrator | export MANAGER_PUBLIC_IP_ADDRESS=81.163.192.205 2026-01-06 00:14:55.575988 | orchestrator | export EXTERNAL_API=false 2026-01-06 00:14:55.575999 | orchestrator | 2026-01-06 00:14:55.576010 | orchestrator | export IMAGE_USER=ubuntu 2026-01-06 00:14:55.576024 | orchestrator | export IMAGE_NODE_USER=ubuntu 2026-01-06 00:14:55.576035 | orchestrator | 2026-01-06 00:14:55.576046 | orchestrator | export CEPH_STACK=ceph-ansible 2026-01-06 00:14:55.576120 | orchestrator | 2026-01-06 00:14:55.576134 | orchestrator | + echo 2026-01-06 00:14:55.576147 | orchestrator | + source /opt/configuration/scripts/include.sh 2026-01-06 00:14:55.577273 | orchestrator | ++ export INTERACTIVE=false 2026-01-06 00:14:55.577294 | orchestrator | ++ INTERACTIVE=false 2026-01-06 00:14:55.577307 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2026-01-06 00:14:55.577319 | orchestrator | ++ OSISM_APPLY_RETRY=1 2026-01-06 00:14:55.577843 | orchestrator | + source /opt/manager-vars.sh 2026-01-06 00:14:55.577868 | orchestrator | ++ export NUMBER_OF_NODES=6 2026-01-06 00:14:55.577880 | orchestrator | ++ NUMBER_OF_NODES=6 2026-01-06 00:14:55.577891 | orchestrator | ++ export CEPH_VERSION=reef 2026-01-06 00:14:55.577902 | orchestrator | ++ CEPH_VERSION=reef 2026-01-06 00:14:55.577913 | orchestrator 
| ++ export CONFIGURATION_VERSION=main 2026-01-06 00:14:55.577925 | orchestrator | ++ CONFIGURATION_VERSION=main 2026-01-06 00:14:55.577942 | orchestrator | ++ export MANAGER_VERSION=latest 2026-01-06 00:14:55.577954 | orchestrator | ++ MANAGER_VERSION=latest 2026-01-06 00:14:55.577965 | orchestrator | ++ export OPENSTACK_VERSION=2025.1 2026-01-06 00:14:55.577989 | orchestrator | ++ OPENSTACK_VERSION=2025.1 2026-01-06 00:14:55.578001 | orchestrator | ++ export ARA=false 2026-01-06 00:14:55.578012 | orchestrator | ++ ARA=false 2026-01-06 00:14:55.578100 | orchestrator | ++ export DEPLOY_MODE=manager 2026-01-06 00:14:55.578111 | orchestrator | ++ DEPLOY_MODE=manager 2026-01-06 00:14:55.578122 | orchestrator | ++ export TEMPEST=true 2026-01-06 00:14:55.578133 | orchestrator | ++ TEMPEST=true 2026-01-06 00:14:55.578144 | orchestrator | ++ export IS_ZUUL=true 2026-01-06 00:14:55.578155 | orchestrator | ++ IS_ZUUL=true 2026-01-06 00:14:55.578171 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.192.205 2026-01-06 00:14:55.578182 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.192.205 2026-01-06 00:14:55.578193 | orchestrator | ++ export EXTERNAL_API=false 2026-01-06 00:14:55.578205 | orchestrator | ++ EXTERNAL_API=false 2026-01-06 00:14:55.578215 | orchestrator | ++ export IMAGE_USER=ubuntu 2026-01-06 00:14:55.578226 | orchestrator | ++ IMAGE_USER=ubuntu 2026-01-06 00:14:55.578238 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2026-01-06 00:14:55.578249 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2026-01-06 00:14:55.578260 | orchestrator | ++ export CEPH_STACK=ceph-ansible 2026-01-06 00:14:55.578275 | orchestrator | ++ CEPH_STACK=ceph-ansible 2026-01-06 00:14:55.578286 | orchestrator | + sudo ln -sf /opt/configuration/contrib/semver2.sh /usr/local/bin/semver 2026-01-06 00:14:55.636044 | orchestrator | + docker version 2026-01-06 00:14:55.907288 | orchestrator | Client: Docker Engine - Community 2026-01-06 00:14:55.907407 | orchestrator | Version: 27.5.1 
2026-01-06 00:14:55.907423 | orchestrator | API version: 1.47 2026-01-06 00:14:55.907436 | orchestrator | Go version: go1.22.11 2026-01-06 00:14:55.907448 | orchestrator | Git commit: 9f9e405 2026-01-06 00:14:55.907459 | orchestrator | Built: Wed Jan 22 13:41:48 2025 2026-01-06 00:14:55.907472 | orchestrator | OS/Arch: linux/amd64 2026-01-06 00:14:55.907484 | orchestrator | Context: default 2026-01-06 00:14:55.907495 | orchestrator | 2026-01-06 00:14:55.907507 | orchestrator | Server: Docker Engine - Community 2026-01-06 00:14:55.907518 | orchestrator | Engine: 2026-01-06 00:14:55.907529 | orchestrator | Version: 27.5.1 2026-01-06 00:14:55.907541 | orchestrator | API version: 1.47 (minimum version 1.24) 2026-01-06 00:14:55.907587 | orchestrator | Go version: go1.22.11 2026-01-06 00:14:55.907600 | orchestrator | Git commit: 4c9b3b0 2026-01-06 00:14:55.907619 | orchestrator | Built: Wed Jan 22 13:41:48 2025 2026-01-06 00:14:55.907637 | orchestrator | OS/Arch: linux/amd64 2026-01-06 00:14:55.907656 | orchestrator | Experimental: false 2026-01-06 00:14:55.907677 | orchestrator | containerd: 2026-01-06 00:14:55.907756 | orchestrator | Version: v2.2.1 2026-01-06 00:14:55.907781 | orchestrator | GitCommit: dea7da592f5d1d2b7755e3a161be07f43fad8f75 2026-01-06 00:14:55.907802 | orchestrator | runc: 2026-01-06 00:14:55.907822 | orchestrator | Version: 1.3.4 2026-01-06 00:14:55.907842 | orchestrator | GitCommit: v1.3.4-0-gd6d73eb8 2026-01-06 00:14:55.907862 | orchestrator | docker-init: 2026-01-06 00:14:55.907882 | orchestrator | Version: 0.19.0 2026-01-06 00:14:55.907902 | orchestrator | GitCommit: de40ad0 2026-01-06 00:14:55.910887 | orchestrator | + sh -c /opt/configuration/scripts/deploy/000-manager.sh 2026-01-06 00:14:55.920590 | orchestrator | + set -e 2026-01-06 00:14:55.921352 | orchestrator | + source /opt/manager-vars.sh 2026-01-06 00:14:55.921384 | orchestrator | ++ export NUMBER_OF_NODES=6 2026-01-06 00:14:55.921405 | orchestrator | ++ NUMBER_OF_NODES=6 2026-01-06 
00:14:55.921423 | orchestrator | ++ export CEPH_VERSION=reef 2026-01-06 00:14:55.921441 | orchestrator | ++ CEPH_VERSION=reef 2026-01-06 00:14:55.921458 | orchestrator | ++ export CONFIGURATION_VERSION=main 2026-01-06 00:14:55.921477 | orchestrator | ++ CONFIGURATION_VERSION=main 2026-01-06 00:14:55.921496 | orchestrator | ++ export MANAGER_VERSION=latest 2026-01-06 00:14:55.921515 | orchestrator | ++ MANAGER_VERSION=latest 2026-01-06 00:14:55.921533 | orchestrator | ++ export OPENSTACK_VERSION=2025.1 2026-01-06 00:14:55.921550 | orchestrator | ++ OPENSTACK_VERSION=2025.1 2026-01-06 00:14:55.921568 | orchestrator | ++ export ARA=false 2026-01-06 00:14:55.921586 | orchestrator | ++ ARA=false 2026-01-06 00:14:55.921605 | orchestrator | ++ export DEPLOY_MODE=manager 2026-01-06 00:14:55.921623 | orchestrator | ++ DEPLOY_MODE=manager 2026-01-06 00:14:55.921641 | orchestrator | ++ export TEMPEST=true 2026-01-06 00:14:55.921657 | orchestrator | ++ TEMPEST=true 2026-01-06 00:14:55.921674 | orchestrator | ++ export IS_ZUUL=true 2026-01-06 00:14:55.921691 | orchestrator | ++ IS_ZUUL=true 2026-01-06 00:14:55.921743 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.192.205 2026-01-06 00:14:55.921760 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.192.205 2026-01-06 00:14:55.921776 | orchestrator | ++ export EXTERNAL_API=false 2026-01-06 00:14:55.921794 | orchestrator | ++ EXTERNAL_API=false 2026-01-06 00:14:55.921810 | orchestrator | ++ export IMAGE_USER=ubuntu 2026-01-06 00:14:55.921828 | orchestrator | ++ IMAGE_USER=ubuntu 2026-01-06 00:14:55.921845 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2026-01-06 00:14:55.921861 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2026-01-06 00:14:55.921878 | orchestrator | ++ export CEPH_STACK=ceph-ansible 2026-01-06 00:14:55.921894 | orchestrator | ++ CEPH_STACK=ceph-ansible 2026-01-06 00:14:55.921910 | orchestrator | + source /opt/configuration/scripts/include.sh 2026-01-06 00:14:55.921925 | orchestrator | ++ export 
INTERACTIVE=false 2026-01-06 00:14:55.921942 | orchestrator | ++ INTERACTIVE=false 2026-01-06 00:14:55.921959 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2026-01-06 00:14:55.921980 | orchestrator | ++ OSISM_APPLY_RETRY=1 2026-01-06 00:14:55.921996 | orchestrator | + [[ latest != \l\a\t\e\s\t ]] 2026-01-06 00:14:55.922013 | orchestrator | + [[ latest == \l\a\t\e\s\t ]] 2026-01-06 00:14:55.922103 | orchestrator | + /opt/configuration/scripts/set-ceph-version.sh reef 2026-01-06 00:14:55.926079 | orchestrator | + set -e 2026-01-06 00:14:55.926138 | orchestrator | + VERSION=reef 2026-01-06 00:14:55.926640 | orchestrator | ++ grep '^ceph_version:' /opt/configuration/environments/manager/configuration.yml 2026-01-06 00:14:55.933632 | orchestrator | + [[ -n ceph_version: reef ]] 2026-01-06 00:14:55.933684 | orchestrator | + sed -i 's/ceph_version: .*/ceph_version: reef/g' /opt/configuration/environments/manager/configuration.yml 2026-01-06 00:14:55.939529 | orchestrator | + /opt/configuration/scripts/set-openstack-version.sh 2025.1 2026-01-06 00:14:55.945625 | orchestrator | + set -e 2026-01-06 00:14:55.945687 | orchestrator | + VERSION=2025.1 2026-01-06 00:14:55.945994 | orchestrator | ++ grep '^openstack_version:' /opt/configuration/environments/manager/configuration.yml 2026-01-06 00:14:55.949439 | orchestrator | + [[ -n openstack_version: 2024.2 ]] 2026-01-06 00:14:55.949491 | orchestrator | + sed -i 's/openstack_version: .*/openstack_version: 2025.1/g' /opt/configuration/environments/manager/configuration.yml 2026-01-06 00:14:55.953534 | orchestrator | + [[ ceph-ansible == \r\o\o\k ]] 2026-01-06 00:14:55.954403 | orchestrator | ++ semver latest 7.0.0 2026-01-06 00:14:56.020249 | orchestrator | + [[ -1 -ge 0 ]] 2026-01-06 00:14:56.020321 | orchestrator | + [[ latest == \l\a\t\e\s\t ]] 2026-01-06 00:14:56.020331 | orchestrator | + echo 'enable_osism_kubernetes: true' 2026-01-06 00:14:56.021730 | orchestrator | ++ semver latest 10.0.0-0 2026-01-06 00:14:56.089430 | 
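The `+`/`++` prefixes in the trace above are `set -x` output (one `+` per level of command nesting), and the `set-ceph-version.sh` / `set-openstack-version.sh` steps both follow the same check-then-rewrite pattern: `grep` confirms the key is present in `configuration.yml`, then `sed -i` pins it in place. A minimal self-contained sketch of that pattern, using a temp file as a stand-in for `/opt/configuration/environments/manager/configuration.yml`:

```shell
# Check-then-rewrite version pinning, as traced in the log above.
# A temp file stands in for the real configuration.yml.
set -e
cfg=$(mktemp)
printf 'ceph_version: quincy\nopenstack_version: 2024.2\n' > "$cfg"

VERSION=reef
# Only rewrite when the key actually exists in the file.
if [ -n "$(grep '^ceph_version:' "$cfg")" ]; then
    sed -i "s/ceph_version: .*/ceph_version: ${VERSION}/g" "$cfg"
fi

pinned=$(grep '^ceph_version:' "$cfg")
echo "$pinned"    # ceph_version: reef
```

Note that the `grep` guard keeps `sed` from silently doing nothing meaningful when the key is absent; other keys (here `openstack_version`) are left untouched by the anchored `s/ceph_version: .*/.../` expression.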
orchestrator | + [[ -1 -ge 0 ]] 2026-01-06 00:14:56.090179 | orchestrator | ++ semver 2025.1 2025.1 2026-01-06 00:14:56.184568 | orchestrator | + [[ 0 -ge 0 ]] 2026-01-06 00:14:56.184632 | orchestrator | + sed -i '/^om_enable_rabbitmq_high_availability:/d' /opt/configuration/environments/kolla/configuration.yml 2026-01-06 00:14:56.192269 | orchestrator | + sed -i '/^om_enable_rabbitmq_quorum_queues:/d' /opt/configuration/environments/kolla/configuration.yml 2026-01-06 00:14:56.195646 | orchestrator | + /opt/configuration/scripts/enable-resource-nodes.sh 2026-01-06 00:14:56.283577 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2026-01-06 00:14:56.285422 | orchestrator | + source /opt/venv/bin/activate 2026-01-06 00:14:56.286430 | orchestrator | ++ deactivate nondestructive 2026-01-06 00:14:56.286465 | orchestrator | ++ '[' -n '' ']' 2026-01-06 00:14:56.286477 | orchestrator | ++ '[' -n '' ']' 2026-01-06 00:14:56.286494 | orchestrator | ++ hash -r 2026-01-06 00:14:56.286506 | orchestrator | ++ '[' -n '' ']' 2026-01-06 00:14:56.286517 | orchestrator | ++ unset VIRTUAL_ENV 2026-01-06 00:14:56.286528 | orchestrator | ++ unset VIRTUAL_ENV_PROMPT 2026-01-06 00:14:56.286540 | orchestrator | ++ '[' '!' 
nondestructive = nondestructive ']' 2026-01-06 00:14:56.286903 | orchestrator | ++ '[' linux-gnu = cygwin ']' 2026-01-06 00:14:56.286921 | orchestrator | ++ '[' linux-gnu = msys ']' 2026-01-06 00:14:56.286932 | orchestrator | ++ export VIRTUAL_ENV=/opt/venv 2026-01-06 00:14:56.286944 | orchestrator | ++ VIRTUAL_ENV=/opt/venv 2026-01-06 00:14:56.286956 | orchestrator | ++ _OLD_VIRTUAL_PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2026-01-06 00:14:56.286987 | orchestrator | ++ PATH=/opt/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2026-01-06 00:14:56.287004 | orchestrator | ++ export PATH 2026-01-06 00:14:56.287016 | orchestrator | ++ '[' -n '' ']' 2026-01-06 00:14:56.287133 | orchestrator | ++ '[' -z '' ']' 2026-01-06 00:14:56.287148 | orchestrator | ++ _OLD_VIRTUAL_PS1= 2026-01-06 00:14:56.287163 | orchestrator | ++ PS1='(venv) ' 2026-01-06 00:14:56.287175 | orchestrator | ++ export PS1 2026-01-06 00:14:56.287187 | orchestrator | ++ VIRTUAL_ENV_PROMPT='(venv) ' 2026-01-06 00:14:56.287198 | orchestrator | ++ export VIRTUAL_ENV_PROMPT 2026-01-06 00:14:56.287209 | orchestrator | ++ hash -r 2026-01-06 00:14:56.287450 | orchestrator | + ansible-playbook -i testbed-manager, --vault-password-file /opt/configuration/environments/.vault_pass /opt/configuration/ansible/manager-part-3.yml 2026-01-06 00:14:57.530601 | orchestrator | 2026-01-06 00:14:57.530725 | orchestrator | PLAY [Copy custom facts] ******************************************************* 2026-01-06 00:14:57.530743 | orchestrator | 2026-01-06 00:14:57.530755 | orchestrator | TASK [Create custom facts directory] ******************************************* 2026-01-06 00:14:58.102126 | orchestrator | ok: [testbed-manager] 2026-01-06 00:14:58.102241 | orchestrator | 2026-01-06 00:14:58.102259 | orchestrator | TASK [Copy fact files] ********************************************************* 
2026-01-06 00:14:59.101830 | orchestrator | changed: [testbed-manager]
2026-01-06 00:14:59.101945 | orchestrator |
2026-01-06 00:14:59.101966 | orchestrator | PLAY [Before the deployment of the manager] ************************************
2026-01-06 00:14:59.101982 | orchestrator |
2026-01-06 00:14:59.101996 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-01-06 00:15:01.507049 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:01.507159 | orchestrator |
2026-01-06 00:15:01.507177 | orchestrator | TASK [Get /opt/manager-vars.sh] ************************************************
2026-01-06 00:15:01.564607 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:01.564772 | orchestrator |
2026-01-06 00:15:01.564803 | orchestrator | TASK [Add ara_server_mariadb_volume_type parameter] ****************************
2026-01-06 00:15:02.053113 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:02.053218 | orchestrator |
2026-01-06 00:15:02.053235 | orchestrator | TASK [Add netbox_enable parameter] *********************************************
2026-01-06 00:15:02.092448 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:15:02.092558 | orchestrator |
2026-01-06 00:15:02.092575 | orchestrator | TASK [Install HWE kernel package on Ubuntu] ************************************
2026-01-06 00:15:02.458447 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:02.458614 | orchestrator |
2026-01-06 00:15:02.458645 | orchestrator | TASK [Use insecure glance configuration] ***************************************
2026-01-06 00:15:02.523369 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:15:02.523464 | orchestrator |
2026-01-06 00:15:02.523480 | orchestrator | TASK [Check if /etc/OTC_region exist] ******************************************
2026-01-06 00:15:02.869235 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:02.869347 | orchestrator |
2026-01-06 00:15:02.869393 | orchestrator | TASK [Add nova_compute_virt_type parameter] ************************************
2026-01-06 00:15:03.002530 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:15:03.002595 | orchestrator |
2026-01-06 00:15:03.002601 | orchestrator | PLAY [Apply role traefik] ******************************************************
2026-01-06 00:15:03.002607 | orchestrator |
2026-01-06 00:15:03.002611 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-01-06 00:15:05.774295 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:05.774408 | orchestrator |
2026-01-06 00:15:05.774426 | orchestrator | TASK [Apply traefik role] ******************************************************
2026-01-06 00:15:05.869538 | orchestrator | included: osism.services.traefik for testbed-manager
2026-01-06 00:15:05.869650 | orchestrator |
2026-01-06 00:15:05.869667 | orchestrator | TASK [osism.services.traefik : Include config tasks] ***************************
2026-01-06 00:15:05.924743 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/config.yml for testbed-manager
2026-01-06 00:15:05.924850 | orchestrator |
2026-01-06 00:15:05.924864 | orchestrator | TASK [osism.services.traefik : Create required directories] ********************
2026-01-06 00:15:07.080908 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik)
2026-01-06 00:15:07.081005 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/certificates)
2026-01-06 00:15:07.081020 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/configuration)
2026-01-06 00:15:07.081032 | orchestrator |
2026-01-06 00:15:07.081045 | orchestrator | TASK [osism.services.traefik : Copy configuration files] ***********************
2026-01-06 00:15:08.899478 | orchestrator | changed: [testbed-manager] => (item=traefik.yml)
2026-01-06 00:15:08.899580 | orchestrator | changed: [testbed-manager] => (item=traefik.env)
2026-01-06 00:15:08.899595 | orchestrator | changed: [testbed-manager] => (item=certificates.yml)
2026-01-06 00:15:08.899608 | orchestrator |
2026-01-06 00:15:08.899620 | orchestrator | TASK [osism.services.traefik : Copy certificate cert files] ********************
2026-01-06 00:15:09.549596 | orchestrator | changed: [testbed-manager] => (item=None)
2026-01-06 00:15:09.549679 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:09.549693 | orchestrator |
2026-01-06 00:15:09.549726 | orchestrator | TASK [osism.services.traefik : Copy certificate key files] *********************
2026-01-06 00:15:10.269328 | orchestrator | changed: [testbed-manager] => (item=None)
2026-01-06 00:15:10.269415 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:10.269431 | orchestrator |
2026-01-06 00:15:10.269443 | orchestrator | TASK [osism.services.traefik : Copy dynamic configuration] *********************
2026-01-06 00:15:10.331651 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:15:10.331806 | orchestrator |
2026-01-06 00:15:10.331826 | orchestrator | TASK [osism.services.traefik : Remove dynamic configuration] *******************
2026-01-06 00:15:10.701887 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:10.702004 | orchestrator |
2026-01-06 00:15:10.702087 | orchestrator | TASK [osism.services.traefik : Include service tasks] **************************
2026-01-06 00:15:10.776529 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/service.yml for testbed-manager
2026-01-06 00:15:10.776656 | orchestrator |
2026-01-06 00:15:10.776680 | orchestrator | TASK [osism.services.traefik : Create traefik external network] ****************
2026-01-06 00:15:11.913496 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:11.913617 | orchestrator |
2026-01-06 00:15:11.913636 | orchestrator | TASK [osism.services.traefik : Copy docker-compose.yml file] *******************
2026-01-06 00:15:12.751088 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:12.751201 | orchestrator |
2026-01-06 00:15:12.751218 | orchestrator | TASK [osism.services.traefik : Manage traefik service] *************************
2026-01-06 00:15:27.775190 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:27.775316 | orchestrator |
2026-01-06 00:15:27.775335 | orchestrator | RUNNING HANDLER [osism.services.traefik : Restart traefik service] *************
2026-01-06 00:15:27.836115 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:15:27.836204 | orchestrator |
2026-01-06 00:15:27.836211 | orchestrator | PLAY [Deploy manager service] **************************************************
2026-01-06 00:15:27.836216 | orchestrator |
2026-01-06 00:15:27.836221 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-01-06 00:15:29.667407 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:29.667532 | orchestrator |
2026-01-06 00:15:29.667550 | orchestrator | TASK [Apply manager role] ******************************************************
2026-01-06 00:15:29.781964 | orchestrator | included: osism.services.manager for testbed-manager
2026-01-06 00:15:29.782138 | orchestrator |
2026-01-06 00:15:29.782155 | orchestrator | TASK [osism.services.manager : Include install tasks] **************************
2026-01-06 00:15:29.844276 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/install-Debian-family.yml for testbed-manager
2026-01-06 00:15:29.844381 | orchestrator |
2026-01-06 00:15:29.844398 | orchestrator | TASK [osism.services.manager : Install required packages] **********************
2026-01-06 00:15:32.584868 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:32.584992 | orchestrator |
2026-01-06 00:15:32.585010 | orchestrator | TASK [osism.services.manager : Gather variables for each operating system] *****
2026-01-06 00:15:32.641157 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:32.641263 | orchestrator |
2026-01-06 00:15:32.641278 | orchestrator | TASK [osism.services.manager : Include config tasks] ***************************
2026-01-06 00:15:32.786440 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config.yml for testbed-manager
2026-01-06 00:15:32.786543 | orchestrator |
2026-01-06 00:15:32.786558 | orchestrator | TASK [osism.services.manager : Create required directories] ********************
2026-01-06 00:15:35.626431 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible)
2026-01-06 00:15:35.626561 | orchestrator | changed: [testbed-manager] => (item=/opt/archive)
2026-01-06 00:15:35.626587 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/configuration)
2026-01-06 00:15:35.626605 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/data)
2026-01-06 00:15:35.626621 | orchestrator | ok: [testbed-manager] => (item=/opt/manager)
2026-01-06 00:15:35.626638 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/secrets)
2026-01-06 00:15:35.626655 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible/secrets)
2026-01-06 00:15:35.626672 | orchestrator | changed: [testbed-manager] => (item=/opt/state)
2026-01-06 00:15:35.626688 | orchestrator |
2026-01-06 00:15:35.626706 | orchestrator | TASK [osism.services.manager : Copy all environment file] **********************
2026-01-06 00:15:36.261408 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:36.261509 | orchestrator |
2026-01-06 00:15:36.261525 | orchestrator | TASK [osism.services.manager : Copy client environment file] *******************
2026-01-06 00:15:36.967087 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:36.967182 | orchestrator |
2026-01-06 00:15:36.967191 | orchestrator | TASK [osism.services.manager : Include ara config tasks] ***********************
2026-01-06 00:15:37.048488 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ara.yml for testbed-manager
2026-01-06 00:15:37.048594 | orchestrator |
2026-01-06 00:15:37.048609 | orchestrator | TASK [osism.services.manager : Copy ARA environment files] *********************
2026-01-06 00:15:38.290387 | orchestrator | changed: [testbed-manager] => (item=ara)
2026-01-06 00:15:38.290523 | orchestrator | changed: [testbed-manager] => (item=ara-server)
2026-01-06 00:15:38.290592 | orchestrator |
2026-01-06 00:15:38.290633 | orchestrator | TASK [osism.services.manager : Copy MariaDB environment file] ******************
2026-01-06 00:15:38.922892 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:38.922998 | orchestrator |
2026-01-06 00:15:38.923015 | orchestrator | TASK [osism.services.manager : Include vault config tasks] *********************
2026-01-06 00:15:38.978855 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:15:38.978968 | orchestrator |
2026-01-06 00:15:38.978985 | orchestrator | TASK [osism.services.manager : Include frontend config tasks] ******************
2026-01-06 00:15:39.073883 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-frontend.yml for testbed-manager
2026-01-06 00:15:39.074067 | orchestrator |
2026-01-06 00:15:39.074088 | orchestrator | TASK [osism.services.manager : Copy frontend environment file] *****************
2026-01-06 00:15:39.739214 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:39.739322 | orchestrator |
2026-01-06 00:15:39.739338 | orchestrator | TASK [osism.services.manager : Include ansible config tasks] *******************
2026-01-06 00:15:39.804090 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ansible.yml for testbed-manager
2026-01-06 00:15:39.804182 | orchestrator |
2026-01-06 00:15:39.804196 | orchestrator | TASK [osism.services.manager : Copy private ssh keys] **************************
2026-01-06 00:15:41.212024 | orchestrator | changed: [testbed-manager] => (item=None)
2026-01-06 00:15:41.212149 | orchestrator | changed: [testbed-manager] => (item=None)
2026-01-06 00:15:41.212166 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:41.212181 | orchestrator |
2026-01-06 00:15:41.212194 | orchestrator | TASK [osism.services.manager : Copy ansible environment file] ******************
2026-01-06 00:15:41.815057 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:41.815170 | orchestrator |
2026-01-06 00:15:41.815188 | orchestrator | TASK [osism.services.manager : Include netbox config tasks] ********************
2026-01-06 00:15:41.879127 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:15:41.879235 | orchestrator |
2026-01-06 00:15:41.879279 | orchestrator | TASK [osism.services.manager : Include celery config tasks] ********************
2026-01-06 00:15:41.974848 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-celery.yml for testbed-manager
2026-01-06 00:15:41.974922 | orchestrator |
2026-01-06 00:15:41.974928 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_watches] ****************
2026-01-06 00:15:42.487351 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:42.487458 | orchestrator |
2026-01-06 00:15:42.487475 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_instances] **************
2026-01-06 00:15:42.906885 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:42.907001 | orchestrator |
2026-01-06 00:15:42.907017 | orchestrator | TASK [osism.services.manager : Copy celery environment files] ******************
2026-01-06 00:15:44.118750 | orchestrator | changed: [testbed-manager] => (item=conductor)
2026-01-06 00:15:44.118870 | orchestrator | changed: [testbed-manager] => (item=openstack)
2026-01-06 00:15:44.118885 | orchestrator |
2026-01-06 00:15:44.118899 | orchestrator | TASK [osism.services.manager : Copy listener environment file] *****************
2026-01-06 00:15:44.793934 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:44.794125 | orchestrator |
2026-01-06 00:15:44.794156 | orchestrator | TASK [osism.services.manager : Check for conductor.yml] ************************
2026-01-06 00:15:45.203185 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:45.203288 | orchestrator |
2026-01-06 00:15:45.203304 | orchestrator | TASK [osism.services.manager : Copy conductor configuration file] **************
2026-01-06 00:15:45.558172 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:45.558312 | orchestrator |
2026-01-06 00:15:45.558330 | orchestrator | TASK [osism.services.manager : Copy empty conductor configuration file] ********
2026-01-06 00:15:45.595577 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:15:45.595699 | orchestrator |
2026-01-06 00:15:45.595715 | orchestrator | TASK [osism.services.manager : Include wrapper config tasks] *******************
2026-01-06 00:15:45.658424 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-wrapper.yml for testbed-manager
2026-01-06 00:15:45.658534 | orchestrator |
2026-01-06 00:15:45.658551 | orchestrator | TASK [osism.services.manager : Include wrapper vars file] **********************
2026-01-06 00:15:45.713877 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:45.713979 | orchestrator |
2026-01-06 00:15:45.713995 | orchestrator | TASK [osism.services.manager : Copy wrapper scripts] ***************************
2026-01-06 00:15:47.810164 | orchestrator | changed: [testbed-manager] => (item=osism)
2026-01-06 00:15:47.810277 | orchestrator | changed: [testbed-manager] => (item=osism-update-docker)
2026-01-06 00:15:47.810293 | orchestrator | changed: [testbed-manager] => (item=osism-update-manager)
2026-01-06 00:15:47.810305 | orchestrator |
2026-01-06 00:15:47.810319 | orchestrator | TASK [osism.services.manager : Copy cilium wrapper script] *********************
2026-01-06 00:15:48.547455 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:48.547594 | orchestrator |
2026-01-06 00:15:48.547612 | orchestrator | TASK [osism.services.manager : Copy hubble wrapper script] *********************
2026-01-06 00:15:49.302576 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:49.302692 | orchestrator |
2026-01-06 00:15:49.302710 | orchestrator | TASK [osism.services.manager : Copy flux wrapper script] ***********************
2026-01-06 00:15:50.031904 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:50.032011 | orchestrator |
2026-01-06 00:15:50.032029 | orchestrator | TASK [osism.services.manager : Include scripts config tasks] *******************
2026-01-06 00:15:50.110633 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-scripts.yml for testbed-manager
2026-01-06 00:15:50.110790 | orchestrator |
2026-01-06 00:15:50.110807 | orchestrator | TASK [osism.services.manager : Include scripts vars file] **********************
2026-01-06 00:15:50.153584 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:50.153700 | orchestrator |
2026-01-06 00:15:50.153718 | orchestrator | TASK [osism.services.manager : Copy scripts] ***********************************
2026-01-06 00:15:50.893704 | orchestrator | changed: [testbed-manager] => (item=osism-include)
2026-01-06 00:15:50.893830 | orchestrator |
2026-01-06 00:15:50.893845 | orchestrator | TASK [osism.services.manager : Include service tasks] **************************
2026-01-06 00:15:50.979757 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/service.yml for testbed-manager
2026-01-06 00:15:50.979860 | orchestrator |
2026-01-06 00:15:50.979874 | orchestrator | TASK [osism.services.manager : Copy manager systemd unit file] *****************
2026-01-06 00:15:51.721166 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:51.721270 | orchestrator |
2026-01-06 00:15:51.721287 | orchestrator | TASK [osism.services.manager : Create traefik external network] ****************
2026-01-06 00:15:52.341837 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:52.341946 | orchestrator |
2026-01-06 00:15:52.341964 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb < 11.0.0] ***
2026-01-06 00:15:52.403506 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:15:52.403628 | orchestrator |
2026-01-06 00:15:52.403643 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb >= 11.0.0] ***
2026-01-06 00:15:52.458300 | orchestrator | ok: [testbed-manager]
2026-01-06 00:15:52.458398 | orchestrator |
2026-01-06 00:15:52.458413 | orchestrator | TASK [osism.services.manager : Copy docker-compose.yml file] *******************
2026-01-06 00:15:53.328217 | orchestrator | changed: [testbed-manager]
2026-01-06 00:15:53.328326 | orchestrator |
2026-01-06 00:15:53.328344 | orchestrator | TASK [osism.services.manager : Pull container images] **************************
2026-01-06 00:17:03.548041 | orchestrator | changed: [testbed-manager]
2026-01-06 00:17:03.548161 | orchestrator |
2026-01-06 00:17:03.548178 | orchestrator | TASK [osism.services.manager : Stop and disable old service docker-compose@manager] ***
2026-01-06 00:17:04.568373 | orchestrator | ok: [testbed-manager]
2026-01-06 00:17:04.568499 | orchestrator |
2026-01-06 00:17:04.568541 | orchestrator | TASK [osism.services.manager : Do a manual start of the manager service] *******
2026-01-06 00:17:04.621716 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:17:04.621881 | orchestrator |
2026-01-06 00:17:04.621899 | orchestrator | TASK [osism.services.manager : Manage manager service] *************************
2026-01-06 00:17:06.933239 | orchestrator | changed: [testbed-manager]
2026-01-06 00:17:06.933368 | orchestrator |
2026-01-06 00:17:06.933386 | orchestrator | TASK [osism.services.manager : Register that manager service was started] ******
2026-01-06 00:17:06.999598 | orchestrator | ok: [testbed-manager]
2026-01-06 00:17:06.999696 | orchestrator |
2026-01-06 00:17:06.999710 | orchestrator | TASK [osism.services.manager : Flush handlers] *********************************
2026-01-06 00:17:06.999723 | orchestrator |
2026-01-06 00:17:06.999734 | orchestrator | RUNNING HANDLER [osism.services.manager : Restart manager service] *************
2026-01-06 00:17:07.063003 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:17:07.063092 | orchestrator |
2026-01-06 00:17:07.063106 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for manager service to start] ***
2026-01-06 00:18:07.125649 | orchestrator | Pausing for 60 seconds
2026-01-06 00:18:07.125786 | orchestrator | changed: [testbed-manager]
2026-01-06 00:18:07.125830 | orchestrator |
2026-01-06 00:18:07.125845 | orchestrator | RUNNING HANDLER [osism.services.manager : Ensure that all containers are up] ***
2026-01-06 00:18:10.219298 | orchestrator | changed: [testbed-manager]
2026-01-06 00:18:10.219403 | orchestrator |
2026-01-06 00:18:10.219420 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for an healthy manager service] ***
2026-01-06 00:19:12.262713 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (50 retries left).
2026-01-06 00:19:12.262903 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (49 retries left).
2026-01-06 00:19:12.262920 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (48 retries left).
2026-01-06 00:19:12.262974 | orchestrator | changed: [testbed-manager] 2026-01-06 00:19:12.262990 | orchestrator | 2026-01-06 00:19:12.263003 | orchestrator | RUNNING HANDLER [osism.services.manager : Copy osismclient bash completion script] *** 2026-01-06 00:19:22.924289 | orchestrator | changed: [testbed-manager] 2026-01-06 00:19:22.924427 | orchestrator | 2026-01-06 00:19:22.924445 | orchestrator | TASK [osism.services.manager : Include initialize tasks] *********************** 2026-01-06 00:19:23.010767 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/initialize.yml for testbed-manager 2026-01-06 00:19:23.010932 | orchestrator | 2026-01-06 00:19:23.010958 | orchestrator | TASK [osism.services.manager : Flush handlers] ********************************* 2026-01-06 00:19:23.010977 | orchestrator | 2026-01-06 00:19:23.010995 | orchestrator | TASK [osism.services.manager : Include vault initialize tasks] ***************** 2026-01-06 00:19:23.079664 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:19:23.079764 | orchestrator | 2026-01-06 00:19:23.079778 | orchestrator | TASK [osism.services.manager : Include version verification tasks] ************* 2026-01-06 00:19:23.159177 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/verify-versions.yml for testbed-manager 2026-01-06 00:19:23.159315 | orchestrator | 2026-01-06 00:19:23.159338 | orchestrator | TASK [osism.services.manager : Deploy service manager version check script] **** 2026-01-06 00:19:23.994952 | orchestrator | changed: [testbed-manager] 2026-01-06 00:19:24.315337 | orchestrator | 2026-01-06 00:19:24.315439 | orchestrator | TASK [osism.services.manager : Execute service manager version check] ********** 2026-01-06 00:19:27.469019 | orchestrator | ok: [testbed-manager] 2026-01-06 00:19:27.469127 | orchestrator | 2026-01-06 00:19:27.469145 | orchestrator | TASK 
[osism.services.manager : Display version check results] ****************** 2026-01-06 00:19:27.557358 | orchestrator | ok: [testbed-manager] => { 2026-01-06 00:19:27.557458 | orchestrator | "version_check_result.stdout_lines": [ 2026-01-06 00:19:27.557474 | orchestrator | "=== OSISM Container Version Check ===", 2026-01-06 00:19:27.557485 | orchestrator | "Checking running containers against expected versions...", 2026-01-06 00:19:27.557496 | orchestrator | "", 2026-01-06 00:19:27.557506 | orchestrator | "Checking service: inventory_reconciler (Inventory Reconciler Service)", 2026-01-06 00:19:27.557516 | orchestrator | " Expected: registry.osism.tech/osism/inventory-reconciler:latest", 2026-01-06 00:19:27.557525 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.557535 | orchestrator | " Running: registry.osism.tech/osism/inventory-reconciler:latest", 2026-01-06 00:19:27.557544 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.557553 | orchestrator | "", 2026-01-06 00:19:27.557563 | orchestrator | "Checking service: osism-ansible (OSISM Ansible Service)", 2026-01-06 00:19:27.557572 | orchestrator | " Expected: registry.osism.tech/osism/osism-ansible:latest", 2026-01-06 00:19:27.557580 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.557589 | orchestrator | " Running: registry.osism.tech/osism/osism-ansible:latest", 2026-01-06 00:19:27.557598 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.557607 | orchestrator | "", 2026-01-06 00:19:27.557616 | orchestrator | "Checking service: osism-kubernetes (Osism-Kubernetes Service)", 2026-01-06 00:19:27.557624 | orchestrator | " Expected: registry.osism.tech/osism/osism-kubernetes:latest", 2026-01-06 00:19:27.557634 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.557643 | orchestrator | " Running: registry.osism.tech/osism/osism-kubernetes:latest", 2026-01-06 00:19:27.557652 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.557660 | orchestrator | "", 2026-01-06 00:19:27.557669 | 
orchestrator | "Checking service: ceph-ansible (Ceph-Ansible Service)", 2026-01-06 00:19:27.557700 | orchestrator | " Expected: registry.osism.tech/osism/ceph-ansible:reef", 2026-01-06 00:19:27.557709 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.557718 | orchestrator | " Running: registry.osism.tech/osism/ceph-ansible:reef", 2026-01-06 00:19:27.557727 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.557735 | orchestrator | "", 2026-01-06 00:19:27.557744 | orchestrator | "Checking service: kolla-ansible (Kolla-Ansible Service)", 2026-01-06 00:19:27.557753 | orchestrator | " Expected: registry.osism.tech/osism/kolla-ansible:2025.1", 2026-01-06 00:19:27.557761 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.557770 | orchestrator | " Running: registry.osism.tech/osism/kolla-ansible:2025.1", 2026-01-06 00:19:27.557778 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.557787 | orchestrator | "", 2026-01-06 00:19:27.557796 | orchestrator | "Checking service: osismclient (OSISM Client)", 2026-01-06 00:19:27.557805 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2026-01-06 00:19:27.557814 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.557822 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2026-01-06 00:19:27.557831 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.557840 | orchestrator | "", 2026-01-06 00:19:27.557848 | orchestrator | "Checking service: ara-server (ARA Server)", 2026-01-06 00:19:27.557857 | orchestrator | " Expected: registry.osism.tech/osism/ara-server:1.7.3", 2026-01-06 00:19:27.557866 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.557926 | orchestrator | " Running: registry.osism.tech/osism/ara-server:1.7.3", 2026-01-06 00:19:27.557937 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.557947 | orchestrator | "", 2026-01-06 00:19:27.557957 | orchestrator | "Checking service: mariadb (MariaDB for ARA)", 2026-01-06 00:19:27.557967 | orchestrator | " 
Expected: registry.osism.tech/dockerhub/library/mariadb:11.8.4", 2026-01-06 00:19:27.557982 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.557992 | orchestrator | " Running: registry.osism.tech/dockerhub/library/mariadb:11.8.4", 2026-01-06 00:19:27.558003 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.558013 | orchestrator | "", 2026-01-06 00:19:27.558067 | orchestrator | "Checking service: frontend (OSISM Frontend)", 2026-01-06 00:19:27.558076 | orchestrator | " Expected: registry.osism.tech/osism/osism-frontend:latest", 2026-01-06 00:19:27.558085 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.558094 | orchestrator | " Running: registry.osism.tech/osism/osism-frontend:latest", 2026-01-06 00:19:27.558103 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.558111 | orchestrator | "", 2026-01-06 00:19:27.558120 | orchestrator | "Checking service: redis (Redis Cache)", 2026-01-06 00:19:27.558129 | orchestrator | " Expected: registry.osism.tech/dockerhub/library/redis:7.4.7-alpine", 2026-01-06 00:19:27.558138 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.558147 | orchestrator | " Running: registry.osism.tech/dockerhub/library/redis:7.4.7-alpine", 2026-01-06 00:19:27.558155 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.558164 | orchestrator | "", 2026-01-06 00:19:27.558173 | orchestrator | "Checking service: api (OSISM API Service)", 2026-01-06 00:19:27.558181 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2026-01-06 00:19:27.558190 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.558199 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2026-01-06 00:19:27.558207 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.558216 | orchestrator | "", 2026-01-06 00:19:27.558225 | orchestrator | "Checking service: listener (OpenStack Event Listener)", 2026-01-06 00:19:27.558233 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2026-01-06 00:19:27.558242 | 
orchestrator | " Enabled: true", 2026-01-06 00:19:27.558251 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2026-01-06 00:19:27.558259 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.558268 | orchestrator | "", 2026-01-06 00:19:27.558276 | orchestrator | "Checking service: openstack (OpenStack Integration)", 2026-01-06 00:19:27.558285 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2026-01-06 00:19:27.558383 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.558393 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2026-01-06 00:19:27.558402 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.558410 | orchestrator | "", 2026-01-06 00:19:27.558425 | orchestrator | "Checking service: beat (Celery Beat Scheduler)", 2026-01-06 00:19:27.558440 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2026-01-06 00:19:27.558453 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.558465 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2026-01-06 00:19:27.558474 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.558483 | orchestrator | "", 2026-01-06 00:19:27.558491 | orchestrator | "Checking service: flower (Celery Flower Monitor)", 2026-01-06 00:19:27.558520 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2026-01-06 00:19:27.558529 | orchestrator | " Enabled: true", 2026-01-06 00:19:27.558538 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2026-01-06 00:19:27.558547 | orchestrator | " Status: ✅ MATCH", 2026-01-06 00:19:27.558555 | orchestrator | "", 2026-01-06 00:19:27.558564 | orchestrator | "=== Summary ===", 2026-01-06 00:19:27.558572 | orchestrator | "Errors (version mismatches): 0", 2026-01-06 00:19:27.558581 | orchestrator | "Warnings (expected containers not running): 0", 2026-01-06 00:19:27.558590 | orchestrator | "", 2026-01-06 00:19:27.558598 | orchestrator | "✅ All running containers match expected 
versions!" 2026-01-06 00:19:27.558607 | orchestrator | ] 2026-01-06 00:19:27.558616 | orchestrator | } 2026-01-06 00:19:27.558625 | orchestrator | 2026-01-06 00:19:27.558634 | orchestrator | TASK [osism.services.manager : Skip version check due to service configuration] *** 2026-01-06 00:19:27.608993 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:19:27.609092 | orchestrator | 2026-01-06 00:19:27.609108 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:19:27.609122 | orchestrator | testbed-manager : ok=70 changed=37 unreachable=0 failed=0 skipped=13 rescued=0 ignored=0 2026-01-06 00:19:27.609134 | orchestrator | 2026-01-06 00:19:27.738965 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2026-01-06 00:19:27.739061 | orchestrator | + deactivate 2026-01-06 00:19:27.739077 | orchestrator | + '[' -n /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin ']' 2026-01-06 00:19:27.739090 | orchestrator | + PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2026-01-06 00:19:27.739100 | orchestrator | + export PATH 2026-01-06 00:19:27.739110 | orchestrator | + unset _OLD_VIRTUAL_PATH 2026-01-06 00:19:27.739120 | orchestrator | + '[' -n '' ']' 2026-01-06 00:19:27.739130 | orchestrator | + hash -r 2026-01-06 00:19:27.739140 | orchestrator | + '[' -n '' ']' 2026-01-06 00:19:27.739150 | orchestrator | + unset VIRTUAL_ENV 2026-01-06 00:19:27.739160 | orchestrator | + unset VIRTUAL_ENV_PROMPT 2026-01-06 00:19:27.739169 | orchestrator | + '[' '!' 
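The per-service version check logged above can be sketched as a plain string comparison between the expected image reference and the image the running container reports. The function name and output format below are invented for illustration; they are not the actual osism.services.manager role code.

```shell
# Illustrative sketch of the image version check: one call per service,
# comparing expected vs. running image references.
# check_image <service> <expected-image> <running-image>
check_image() {
    service=$1; expected=$2; running=$3
    if [ "$running" = "$expected" ]; then
        printf 'Checking service: %s -> MATCH\n' "$service"
        return 0
    fi
    # A mismatch would count toward the "Errors (version mismatches)" total.
    printf 'Checking service: %s -> MISMATCH (%s != %s)\n' \
        "$service" "$running" "$expected"
    return 1
}
```

In the run above all eleven services matched, so the summary reported zero errors and zero warnings.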
'' = nondestructive ']' 2026-01-06 00:19:27.739179 | orchestrator | + unset -f deactivate 2026-01-06 00:19:27.739189 | orchestrator | + cp /home/dragon/.ssh/id_rsa.pub /opt/ansible/secrets/id_rsa.operator.pub 2026-01-06 00:19:27.745121 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2026-01-06 00:19:27.745158 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2026-01-06 00:19:27.745169 | orchestrator | + local max_attempts=60 2026-01-06 00:19:27.745181 | orchestrator | + local name=ceph-ansible 2026-01-06 00:19:27.745192 | orchestrator | + local attempt_num=1 2026-01-06 00:19:27.746433 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:19:27.779458 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:19:27.779541 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2026-01-06 00:19:27.779551 | orchestrator | + local max_attempts=60 2026-01-06 00:19:27.779561 | orchestrator | + local name=kolla-ansible 2026-01-06 00:19:27.779570 | orchestrator | + local attempt_num=1 2026-01-06 00:19:27.780783 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2026-01-06 00:19:27.824406 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:19:27.824487 | orchestrator | + wait_for_container_healthy 60 osism-ansible 2026-01-06 00:19:27.824496 | orchestrator | + local max_attempts=60 2026-01-06 00:19:27.824504 | orchestrator | + local name=osism-ansible 2026-01-06 00:19:27.824511 | orchestrator | + local attempt_num=1 2026-01-06 00:19:27.825347 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2026-01-06 00:19:27.869823 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:19:27.869963 | orchestrator | + [[ true == \t\r\u\e ]] 2026-01-06 00:19:27.869992 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2026-01-06 00:19:28.591764 | orchestrator | + docker compose 
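The xtrace above shows `wait_for_container_healthy` polling `docker inspect` for each manager container's health status. Only the successful first attempt is visible in the log, so the retry interval, the failure message, and the `DOCKER_BIN` override (added here so the loop can be exercised without a Docker daemon) are assumptions; this is a reconstruction, not the script's verbatim source.

```shell
# Reconstructed sketch of the health-wait loop traced above.
# wait_for_container_healthy <max_attempts> <container_name>
wait_for_container_healthy() {
    local max_attempts=$1
    local name=$2
    local attempt_num=1
    local health_fmt='{{.State.Health.Status}}'
    while true; do
        # The trace shows: docker inspect -f '{{.State.Health.Status}}' <name>
        status=$("${DOCKER_BIN:-docker}" inspect -f "$health_fmt" "$name")
        if [ "$status" = "healthy" ]; then
            return 0
        fi
        if [ "$attempt_num" -ge "$max_attempts" ]; then
            echo "container $name did not become healthy" >&2
            return 1
        fi
        attempt_num=$((attempt_num + 1))
        sleep 5   # assumed retry interval; not visible in the log
    done
}
```

In this run ceph-ansible, kolla-ansible, and osism-ansible were all already healthy on the first check.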
--project-directory /opt/manager ps 2026-01-06 00:19:28.767842 | orchestrator | NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS 2026-01-06 00:19:28.768003 | orchestrator | ceph-ansible registry.osism.tech/osism/ceph-ansible:reef "/entrypoint.sh osis…" ceph-ansible 2 minutes ago Up About a minute (healthy) 2026-01-06 00:19:28.768020 | orchestrator | kolla-ansible registry.osism.tech/osism/kolla-ansible:2025.1 "/entrypoint.sh osis…" kolla-ansible 2 minutes ago Up About a minute (healthy) 2026-01-06 00:19:28.768033 | orchestrator | manager-api-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" api 2 minutes ago Up 2 minutes (healthy) 192.168.16.5:8000->8000/tcp 2026-01-06 00:19:28.768047 | orchestrator | manager-ara-server-1 registry.osism.tech/osism/ara-server:1.7.3 "sh -c '/wait && /ru…" ara-server 2 minutes ago Up 2 minutes (healthy) 8000/tcp 2026-01-06 00:19:28.768059 | orchestrator | manager-beat-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" beat 2 minutes ago Up 2 minutes (healthy) 2026-01-06 00:19:28.768092 | orchestrator | manager-flower-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" flower 2 minutes ago Up 2 minutes (healthy) 2026-01-06 00:19:28.768104 | orchestrator | manager-inventory_reconciler-1 registry.osism.tech/osism/inventory-reconciler:latest "/sbin/tini -- /entr…" inventory_reconciler 2 minutes ago Up About a minute (healthy) 2026-01-06 00:19:28.768115 | orchestrator | manager-listener-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" listener 2 minutes ago Up 2 minutes (healthy) 2026-01-06 00:19:28.768126 | orchestrator | manager-mariadb-1 registry.osism.tech/dockerhub/library/mariadb:11.8.4 "docker-entrypoint.s…" mariadb 2 minutes ago Up 2 minutes (healthy) 3306/tcp 2026-01-06 00:19:28.768137 | orchestrator | manager-openstack-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" openstack 2 minutes ago Up 2 minutes (healthy) 2026-01-06 00:19:28.768148 | orchestrator | 
manager-redis-1 registry.osism.tech/dockerhub/library/redis:7.4.7-alpine "docker-entrypoint.s…" redis 2 minutes ago Up 2 minutes (healthy) 6379/tcp 2026-01-06 00:19:28.768160 | orchestrator | osism-ansible registry.osism.tech/osism/osism-ansible:latest "/entrypoint.sh osis…" osism-ansible 2 minutes ago Up About a minute (healthy) 2026-01-06 00:19:28.768171 | orchestrator | osism-frontend registry.osism.tech/osism/osism-frontend:latest "docker-entrypoint.s…" frontend 2 minutes ago Up 2 minutes 192.168.16.5:3000->3000/tcp 2026-01-06 00:19:28.768182 | orchestrator | osism-kubernetes registry.osism.tech/osism/osism-kubernetes:latest "/entrypoint.sh osis…" osism-kubernetes 2 minutes ago Up About a minute (healthy) 2026-01-06 00:19:28.768193 | orchestrator | osismclient registry.osism.tech/osism/osism:latest "/sbin/tini -- sleep…" osismclient 2 minutes ago Up 2 minutes (healthy) 2026-01-06 00:19:28.773333 | orchestrator | ++ semver latest 7.0.0 2026-01-06 00:19:28.823112 | orchestrator | + [[ -1 -ge 0 ]] 2026-01-06 00:19:28.823224 | orchestrator | + [[ latest == \l\a\t\e\s\t ]] 2026-01-06 00:19:28.823275 | orchestrator | + sed -i s/community.general.yaml/osism.commons.still_alive/ /opt/configuration/environments/ansible.cfg 2026-01-06 00:19:28.828429 | orchestrator | + osism apply resolvconf -l testbed-manager 2026-01-06 00:19:41.111811 | orchestrator | 2026-01-06 00:19:41 | INFO  | Task f0d296b5-3e17-4679-99e2-0e2f62668f91 (resolvconf) was prepared for execution. 2026-01-06 00:19:41.111925 | orchestrator | 2026-01-06 00:19:41 | INFO  | It takes a moment until task f0d296b5-3e17-4679-99e2-0e2f62668f91 (resolvconf) has been started and output is visible here. 
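The trace above runs `semver latest 7.0.0`, which prints `-1`, so the `-ge 0` guard fails and the script falls through to the explicit `latest` check. The real `semver` helper is not shown in this log; the sketch below is a stand-in that assumes `"latest"` compares below any concrete version and otherwise falls back to `sort -V`.

```shell
# Stand-in for the semver comparison seen in the trace.
# Prints -1, 0, or 1 for a < b, a == b, a > b.
semver_cmp() {
    a=$1; b=$2
    if [ "$a" = "$b" ]; then echo 0; return 0; fi
    # Assumption: the "latest" tag sorts below every numbered release,
    # matching "semver latest 7.0.0" -> -1 in the log.
    if [ "$a" = "latest" ]; then echo -1; return 0; fi
    if [ "$b" = "latest" ]; then echo 1; return 0; fi
    if [ "$(printf '%s\n' "$a" "$b" | sort -V | head -n1)" = "$a" ]; then
        echo -1
    else
        echo 1
    fi
}
```

With `-1` returned, the script takes the `latest` branch and patches `ansible.cfg` via `sed` before applying the resolvconf role.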
2026-01-06 00:19:55.551351 | orchestrator | 2026-01-06 00:19:55.551433 | orchestrator | PLAY [Apply role resolvconf] *************************************************** 2026-01-06 00:19:55.551441 | orchestrator | 2026-01-06 00:19:55.551445 | orchestrator | TASK [Gathering Facts] ********************************************************* 2026-01-06 00:19:55.551450 | orchestrator | Tuesday 06 January 2026 00:19:45 +0000 (0:00:00.137) 0:00:00.137 ******* 2026-01-06 00:19:55.551455 | orchestrator | ok: [testbed-manager] 2026-01-06 00:19:55.551460 | orchestrator | 2026-01-06 00:19:55.551465 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] *** 2026-01-06 00:19:55.551471 | orchestrator | Tuesday 06 January 2026 00:19:49 +0000 (0:00:03.761) 0:00:03.899 ******* 2026-01-06 00:19:55.551475 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:19:55.551479 | orchestrator | 2026-01-06 00:19:55.551483 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] ********************* 2026-01-06 00:19:55.551487 | orchestrator | Tuesday 06 January 2026 00:19:49 +0000 (0:00:00.069) 0:00:03.968 ******* 2026-01-06 00:19:55.551492 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager 2026-01-06 00:19:55.551497 | orchestrator | 2026-01-06 00:19:55.551509 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] *** 2026-01-06 00:19:55.551513 | orchestrator | Tuesday 06 January 2026 00:19:49 +0000 (0:00:00.080) 0:00:04.049 ******* 2026-01-06 00:19:55.551517 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager 2026-01-06 00:19:55.551521 | orchestrator | 2026-01-06 00:19:55.551526 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring 
/etc/resolv.conf] *** 2026-01-06 00:19:55.551530 | orchestrator | Tuesday 06 January 2026 00:19:49 +0000 (0:00:00.065) 0:00:04.114 ******* 2026-01-06 00:19:55.551534 | orchestrator | ok: [testbed-manager] 2026-01-06 00:19:55.551538 | orchestrator | 2026-01-06 00:19:55.551542 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] ************* 2026-01-06 00:19:55.551546 | orchestrator | Tuesday 06 January 2026 00:19:50 +0000 (0:00:01.136) 0:00:05.251 ******* 2026-01-06 00:19:55.551550 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:19:55.551554 | orchestrator | 2026-01-06 00:19:55.551557 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] ***** 2026-01-06 00:19:55.551561 | orchestrator | Tuesday 06 January 2026 00:19:50 +0000 (0:00:00.069) 0:00:05.320 ******* 2026-01-06 00:19:55.551565 | orchestrator | ok: [testbed-manager] 2026-01-06 00:19:55.551569 | orchestrator | 2026-01-06 00:19:55.551573 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] ******* 2026-01-06 00:19:55.551577 | orchestrator | Tuesday 06 January 2026 00:19:51 +0000 (0:00:00.557) 0:00:05.878 ******* 2026-01-06 00:19:55.551581 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:19:55.551585 | orchestrator | 2026-01-06 00:19:55.551589 | orchestrator | TASK [osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] *** 2026-01-06 00:19:55.551594 | orchestrator | Tuesday 06 January 2026 00:19:51 +0000 (0:00:00.093) 0:00:05.972 ******* 2026-01-06 00:19:55.551598 | orchestrator | changed: [testbed-manager] 2026-01-06 00:19:55.551602 | orchestrator | 2026-01-06 00:19:55.551606 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] ********************* 2026-01-06 00:19:55.551610 | orchestrator | Tuesday 06 January 2026 00:19:51 +0000 (0:00:00.561) 0:00:06.533 ******* 2026-01-06 00:19:55.551614 | orchestrator | changed: 
[testbed-manager] 2026-01-06 00:19:55.551633 | orchestrator | 2026-01-06 00:19:55.551637 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ******** 2026-01-06 00:19:55.551641 | orchestrator | Tuesday 06 January 2026 00:19:52 +0000 (0:00:01.169) 0:00:07.702 ******* 2026-01-06 00:19:55.551645 | orchestrator | ok: [testbed-manager] 2026-01-06 00:19:55.551649 | orchestrator | 2026-01-06 00:19:55.551652 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] *** 2026-01-06 00:19:55.551656 | orchestrator | Tuesday 06 January 2026 00:19:53 +0000 (0:00:01.043) 0:00:08.746 ******* 2026-01-06 00:19:55.551660 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager 2026-01-06 00:19:55.551664 | orchestrator | 2026-01-06 00:19:55.551668 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] ************* 2026-01-06 00:19:55.551672 | orchestrator | Tuesday 06 January 2026 00:19:54 +0000 (0:00:00.088) 0:00:08.834 ******* 2026-01-06 00:19:55.551676 | orchestrator | changed: [testbed-manager] 2026-01-06 00:19:55.551680 | orchestrator | 2026-01-06 00:19:55.551684 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:19:55.551689 | orchestrator | testbed-manager : ok=10  changed=3  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2026-01-06 00:19:55.551693 | orchestrator | 2026-01-06 00:19:55.551697 | orchestrator | 2026-01-06 00:19:55.551700 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:19:55.551704 | orchestrator | Tuesday 06 January 2026 00:19:55 +0000 (0:00:01.202) 0:00:10.037 ******* 2026-01-06 00:19:55.551708 | orchestrator | =============================================================================== 2026-01-06 00:19:55.551712 | 
orchestrator | Gathering Facts --------------------------------------------------------- 3.76s 2026-01-06 00:19:55.551716 | orchestrator | osism.commons.resolvconf : Restart systemd-resolved service ------------- 1.20s 2026-01-06 00:19:55.551720 | orchestrator | osism.commons.resolvconf : Copy configuration files --------------------- 1.17s 2026-01-06 00:19:55.551724 | orchestrator | osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf --- 1.14s 2026-01-06 00:19:55.551728 | orchestrator | osism.commons.resolvconf : Start/enable systemd-resolved service -------- 1.04s 2026-01-06 00:19:55.551731 | orchestrator | osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf --- 0.56s 2026-01-06 00:19:55.551743 | orchestrator | osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf ----- 0.56s 2026-01-06 00:19:55.551747 | orchestrator | osism.commons.resolvconf : Archive existing file /etc/resolv.conf ------- 0.09s 2026-01-06 00:19:55.551751 | orchestrator | osism.commons.resolvconf : Include distribution specific configuration tasks --- 0.09s 2026-01-06 00:19:55.551755 | orchestrator | osism.commons.resolvconf : Include resolvconf tasks --------------------- 0.08s 2026-01-06 00:19:55.551762 | orchestrator | osism.commons.resolvconf : Install package systemd-resolved ------------- 0.07s 2026-01-06 00:19:55.551766 | orchestrator | osism.commons.resolvconf : Check minimum and maximum number of name servers --- 0.07s 2026-01-06 00:19:55.551770 | orchestrator | osism.commons.resolvconf : Include distribution specific installation tasks --- 0.07s 2026-01-06 00:19:55.946233 | orchestrator | + osism apply sshconfig 2026-01-06 00:20:08.325338 | orchestrator | 2026-01-06 00:20:08 | INFO  | Task 321f334f-1331-44f7-96f1-a9dd7c223b56 (sshconfig) was prepared for execution. 
2026-01-06 00:20:08.325468 | orchestrator | 2026-01-06 00:20:08 | INFO  | It takes a moment until task 321f334f-1331-44f7-96f1-a9dd7c223b56 (sshconfig) has been started and output is visible here. 2026-01-06 00:20:20.659397 | orchestrator | 2026-01-06 00:20:20.659508 | orchestrator | PLAY [Apply role sshconfig] **************************************************** 2026-01-06 00:20:20.659519 | orchestrator | 2026-01-06 00:20:20.659527 | orchestrator | TASK [osism.commons.sshconfig : Get home directory of operator user] *********** 2026-01-06 00:20:20.659534 | orchestrator | Tuesday 06 January 2026 00:20:12 +0000 (0:00:00.162) 0:00:00.162 ******* 2026-01-06 00:20:20.659565 | orchestrator | ok: [testbed-manager] 2026-01-06 00:20:20.659572 | orchestrator | 2026-01-06 00:20:20.659579 | orchestrator | TASK [osism.commons.sshconfig : Ensure .ssh/config.d exist] ******************** 2026-01-06 00:20:20.659586 | orchestrator | Tuesday 06 January 2026 00:20:13 +0000 (0:00:00.606) 0:00:00.768 ******* 2026-01-06 00:20:20.659592 | orchestrator | changed: [testbed-manager] 2026-01-06 00:20:20.659599 | orchestrator | 2026-01-06 00:20:20.659605 | orchestrator | TASK [osism.commons.sshconfig : Ensure config for each host exist] ************* 2026-01-06 00:20:20.659612 | orchestrator | Tuesday 06 January 2026 00:20:14 +0000 (0:00:00.521) 0:00:01.290 ******* 2026-01-06 00:20:20.659618 | orchestrator | changed: [testbed-manager] => (item=testbed-manager) 2026-01-06 00:20:20.659625 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0) 2026-01-06 00:20:20.659631 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1) 2026-01-06 00:20:20.659638 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2) 2026-01-06 00:20:20.659644 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3) 2026-01-06 00:20:20.659650 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4) 2026-01-06 00:20:20.659657 | orchestrator | changed: 
[testbed-manager] => (item=testbed-node-5) 2026-01-06 00:20:20.659663 | orchestrator | 2026-01-06 00:20:20.659669 | orchestrator | TASK [osism.commons.sshconfig : Add extra config] ****************************** 2026-01-06 00:20:20.659675 | orchestrator | Tuesday 06 January 2026 00:20:19 +0000 (0:00:05.735) 0:00:07.025 ******* 2026-01-06 00:20:20.659681 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:20:20.659687 | orchestrator | 2026-01-06 00:20:20.659694 | orchestrator | TASK [osism.commons.sshconfig : Assemble ssh config] *************************** 2026-01-06 00:20:20.659701 | orchestrator | Tuesday 06 January 2026 00:20:19 +0000 (0:00:00.081) 0:00:07.106 ******* 2026-01-06 00:20:20.659707 | orchestrator | changed: [testbed-manager] 2026-01-06 00:20:20.659713 | orchestrator | 2026-01-06 00:20:20.659719 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:20:20.659727 | orchestrator | testbed-manager : ok=4  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:20:20.659734 | orchestrator | 2026-01-06 00:20:20.659741 | orchestrator | 2026-01-06 00:20:20.659747 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:20:20.659753 | orchestrator | Tuesday 06 January 2026 00:20:20 +0000 (0:00:00.582) 0:00:07.689 ******* 2026-01-06 00:20:20.659760 | orchestrator | =============================================================================== 2026-01-06 00:20:20.659766 | orchestrator | osism.commons.sshconfig : Ensure config for each host exist ------------- 5.74s 2026-01-06 00:20:20.659772 | orchestrator | osism.commons.sshconfig : Get home directory of operator user ----------- 0.61s 2026-01-06 00:20:20.659779 | orchestrator | osism.commons.sshconfig : Assemble ssh config --------------------------- 0.58s 2026-01-06 00:20:20.659785 | orchestrator | osism.commons.sshconfig : Ensure .ssh/config.d exist 
-------------------- 0.52s 2026-01-06 00:20:20.659791 | orchestrator | osism.commons.sshconfig : Add extra config ------------------------------ 0.08s 2026-01-06 00:20:20.947960 | orchestrator | + osism apply known-hosts 2026-01-06 00:20:33.005871 | orchestrator | 2026-01-06 00:20:33 | INFO  | Task 40142451-6987-4445-a48e-522a965bab0f (known-hosts) was prepared for execution. 2026-01-06 00:20:33.006138 | orchestrator | 2026-01-06 00:20:33 | INFO  | It takes a moment until task 40142451-6987-4445-a48e-522a965bab0f (known-hosts) has been started and output is visible here. 2026-01-06 00:20:49.699131 | orchestrator | 2026-01-06 00:20:49.699241 | orchestrator | PLAY [Apply role known_hosts] ************************************************** 2026-01-06 00:20:49.699257 | orchestrator | 2026-01-06 00:20:49.699269 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname] *** 2026-01-06 00:20:49.699282 | orchestrator | Tuesday 06 January 2026 00:20:37 +0000 (0:00:00.166) 0:00:00.166 ******* 2026-01-06 00:20:49.699294 | orchestrator | ok: [testbed-manager] => (item=testbed-manager) 2026-01-06 00:20:49.699324 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3) 2026-01-06 00:20:49.699335 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4) 2026-01-06 00:20:49.699346 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5) 2026-01-06 00:20:49.699357 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0) 2026-01-06 00:20:49.699372 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1) 2026-01-06 00:20:49.699390 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2) 2026-01-06 00:20:49.699409 | orchestrator | 2026-01-06 00:20:49.699442 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname] *** 2026-01-06 00:20:49.699463 | orchestrator | Tuesday 06 January 2026 00:20:43 +0000 (0:00:06.029) 0:00:06.195 ******* 2026-01-06 
00:20:49.699483 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager) 2026-01-06 00:20:49.699498 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3) 2026-01-06 00:20:49.699509 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4) 2026-01-06 00:20:49.699519 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5) 2026-01-06 00:20:49.699530 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2026-01-06 00:20:49.699541 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2026-01-06 00:20:49.699552 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2026-01-06 00:20:49.699563 | orchestrator | 2026-01-06 00:20:49.699574 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:20:49.699584 | orchestrator | Tuesday 06 January 2026 00:20:43 +0000 (0:00:00.161) 0:00:06.357 ******* 2026-01-06 00:20:49.699596 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGTU78xcf8kOjfDc9oqrQImFsmVq4cIvZ72dTKWFfZ2/ZVaDt3jhoboztP3TRUI4q0EOUGm3+MK6nLjkHiUn1vQ=) 2026-01-06 00:20:49.699614 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1IXGARsLW8ay6GOrIfTe+DCX0t5Ae8vszhjx5LJsCGWb7+YFfGWRvtAG09gJZNyeg0c9B4eFi0I/NBARiQ1EU3cAJqKcec/zxnGWYiEh2mdCvQf9hFGj36Mq0JJoRuPzA/JQboPjYUsynRmX5X37vKp+IdoudcAJac22Euhea0gD95hVdg/td//5wGo70xFR2taiX9iWSmGT2j7BEtrCp1FuAknoJa6lbETGnZ4CeFG1JbTUuigwXS/3ehUOcas5kigVCdbYcnETz9BATrTy2SNqjONEsKO6iksf4GDLhz6DOkvLctHLj5mDqGW8nRKIF3Uhl/fvTD1e3bfKeAJVT59bPPS+gL1RYgSKQCLrkiLSmByv9GiP9Vtjcf5+JUucNVySHLm6sH4xQWIpDg1l3A7c/zAd6MHUrOPlFcpf/+vd0EhJZYni2YVHGPRJn95iGN4As7IaPsXDWBNVf/bExGgygAQhvoWGR6/G0ccsfq3J42/StUIKssY2jxkqtke0=) 2026-01-06 00:20:49.699629 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEhEOo+/pxbPjRbdjmSdlIi8KyfE9bQ8pOoyOTIFXYxC) 2026-01-06 00:20:49.699642 | orchestrator | 2026-01-06 00:20:49.699653 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:20:49.699664 | orchestrator | Tuesday 06 January 2026 00:20:44 +0000 (0:00:01.165) 0:00:07.522 ******* 2026-01-06 00:20:49.699675 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMtws5OlKBQhMYg4Ry7UsNr3h9KZoFceoMtM5eZIVPDkwd7xg7byhpL1V/vouhXAMOwcSzLY9gWR1OXBUcbBEhE=) 2026-01-06 00:20:49.699724 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCv0LHPsWj4OW4XIkQJCvQvJiC492LB2+JM4yt3udN2ZHsfhguU/isXiBpg0nuCeclQc9gtQ0dVjIeNFEAgqoVq8OnPsCfuVM7YV1ahZVuhJmz/g4jsPET90nChFqBgnX//1aKSOzmdnloLVawfROIbv56MAa61Uxn9LDCpFFiBYs8/wmuw+sk5NeXvQPDDQB4NsOG14q/uJIFL2nDPWD17fsC98GCMr1/zI/2FO9rqHKJs7hNCeqSyv4g48cyX7aS52Bjqs4Uqw/qvZMrv2p0pIvDlhB8czdy9xyWxmYYQ8Ixm+NaxOAvNcpeuHwqWeH9ST7iGBr8rU6cUIBnS33e1ExgKcsIQV+bYeF9G7mQ5DmFHAJNzgMsu3rz5FD03e2Z/0NJELbOuppFSDUO3HNrQ4fnZT+DOI0dd9H9pGJct1RZvS0dMDAosi6ZzQvpXI2v0hKep7HVvYE5Bn6fQEelVH2Z5fcRy3keXq12jG1KF5VDTyIMVFQXXIQRxjYrGdhk=) 2026-01-06 00:20:49.699738 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAS8LTa+Z2io9VaiGb6aA59SyFUBq4xJdwZPtIGEAZZ+) 2026-01-06 00:20:49.699749 | orchestrator | 2026-01-06 00:20:49.699760 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:20:49.699771 | orchestrator | Tuesday 06 January 2026 00:20:45 +0000 (0:00:01.057) 0:00:08.580 ******* 2026-01-06 00:20:49.699781 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBoyNlFRSO9DGx+MJQqs+qMZ1eK/q8FTLlFXL40+hwbbSiptJ+Zn5zZtWPTDYx1BGqBY3ab7NTLloyHF5SttCfI=) 2026-01-06 00:20:49.699860 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4LDzUNCPtxvHr5oLPJJbxVJJclJ2yPHuiT9QQrTFLSR/GanTcUhsEx4wJXyWTpAhZmYKgksOmwB0L/5Ib/RDO6D8SNYuU4hviUtFvFHAXIZRT5jeTwl1Z86qntF7+MlUCZcg8xxj9stblY8BQoxhlKU5c6nLIo0t5H/cJjE2/Sda87j/tqMb7tF8dInVMfiskFlxKOaPkmaI8tOi8uXCr+HRflr1Tb/d85Hd1J+hBOSefEFGppX/+MYziY/JbkWU+BORWq7bdSil/8R4WW9y9yciSb1Qvr32ZxbGMHQ4QcQSLsnTBKfooa6cxLY569rd+fQd7/P91ePZo61I6BMeVja1Q40hYjup+x2pB3Ui7bFN4X9+kEZ8l3DOXiGm/LuDEH0K4Q/wL119IO4jTmbBxpLomQEYa4XCkg9oqGGX4QirIUwXhYTTgk0GUazewc9AwN432ko5Z3nDpNu9og1ZwF4NJZ8FHzjIkH5XRzKIMcWiLzFfJ9UCcO2nOdAjFfQE=) 2026-01-06 00:20:49.699872 | orchestrator | changed: [testbed-manager] => 
(item=testbed-node-4 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAcNjk9r9I4TPW+lrE1lKiNz6aLvGzw7zHJ+sXb5oCnp) 2026-01-06 00:20:49.699883 | orchestrator | 2026-01-06 00:20:49.699894 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:20:49.699904 | orchestrator | Tuesday 06 January 2026 00:20:46 +0000 (0:00:01.013) 0:00:09.593 ******* 2026-01-06 00:20:49.699920 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1BVLsgC67+S1KmrBsfXmeurQwFMOygH2vN29aHgsOwIkNz16n+oAjUk414WLQWglJgCFidW+qG0/ZFC8ye6EsDunbtFOS+E5Dv/wqX2w5FpODb7u3N6nM9O5KGrqaBsoElzP5MfUurY99DQpz2g1C/escE845fBPddcSaCI4XJ3mfJG6wD0ANOE1ygtQPCBp0RYnJxJ/WDV5hYQYQ9ouNZ8l4ZTlyLLcQUrPio9bmuheNsMzqDQILfxbkQUcMktllBfbw+RETbEvj7/H7NjhubgzWfbSYFvziNgKVon49adjiTiB2WCb2iOZmZHJAaVovt1udsEXnlTbItuJQmhfaPeO41qJ6pvm2jcTdAVoV1SViDr0yO8qeHVil7KYoKND7Hrw+oVfhiiaXpr3RTVHKBanldeXlnQyJk2YCYDJQLGFJPxbtuBZe9Ko/acnxi80Y+/eZIIPLScXdoCV7YIVVBAD8ySrEQRSXpbseHS6mF2XySnD2gxcz76tBWldlgNs=) 2026-01-06 00:20:49.699962 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNl73D8VK+R0aeNKRQrtp0hQSN7N8lWzfq4WUBrrmXSjSHRmlKUdgzc3yv4veJQQAVhmZU3thbOp87D2p1o2uuc=) 2026-01-06 00:20:49.699984 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAVAj7h2j38+GebqNyOzUBnKsPp1QBDtFA7PHVd59Ac5) 2026-01-06 00:20:49.700004 | orchestrator | 2026-01-06 00:20:49.700022 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:20:49.700041 | orchestrator | Tuesday 06 January 2026 00:20:47 +0000 (0:00:01.045) 0:00:10.639 ******* 2026-01-06 00:20:49.700054 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDinF55bwfFfq574sooFF15w8z0esL8rn+MU73f0og/svRP2a5W/Rl/Z5jqLD+OxDfyFWc8IvPMTWjVUPCPzLePKb/PLJhRn7vee8bgNKuOwllLf4gcdu0K4fXM0DSKovOGQSE0GM6ZfnAt3Ih2ahdSHjyMFl8O/IuikF93Y2IMWswNUXjkJHnyue7bxCYgNsk7DJCJNKZwR+j7rbfjpbmmTn1UOF/ARWB/5K8yHlpSCjlmYswrGA5fxTjlsT65FgNTTx5kGgc48o5KzXjYO2+52R9W3peWiQHqjOfnTuPMvNzNE9DT6c7jbUCV9H6ZyRE+NLQFoKjq2mKYI6Rr7QF+AxJKeqcr1ozrenZrLY0H4NVm8Zk7xBDY8WlFgBQgWOMcI5/RAH9Mcle4jYgYA/1H3VqbviG9sS7G4/JuyzP6T8a8eGFsmM+3QC/JZboQMYnhsDrVkGTm5AlYBkCRRE1hwSZIXSvBXG6JDU/uvyyJD2AA6SLTFH/KbZ8Ii2Mfavc=) 2026-01-06 00:20:49.700073 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCkM0CUZtI76b4kKdy0H37YXeeMFXQTxYi8pKe3FSiZAC5RQyAZtkCdqfsWjgRjacsYYcSnF0QhIKAOzVSE+nGM=) 2026-01-06 00:20:49.700084 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBUnIU4xhYNkmGXNix5oY1gjDC6by3sZT9GOdc9b5Hlt) 2026-01-06 00:20:49.700095 | orchestrator | 2026-01-06 00:20:49.700106 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:20:49.700117 | orchestrator | Tuesday 06 January 2026 00:20:48 +0000 (0:00:01.048) 0:00:11.687 ******* 2026-01-06 00:20:49.700138 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCnwHEXUXaejjglpspT2mvxThaWvLEt/ccG5C1XGCdce2UHWiuzIAOK2Jt0FGATYeWUBxBB1Wsn0eEqaMgSUJ9zUb+q1q/rr2Yp+kfPAlAfdMrFg1+20j1r8TvoMz+272OxlLxdWIId3boCKWv1sfCcbgf8y1A4AugLkWqzA7xeOKSK4K4IHJmrP5W+AMteMWyMwd89jOZ+InsO9ieNF9hX1p9r3Sy6wXgJGlY49aunxx2RGriX4FPpUx31owQDBhDZ6ORiNDePsz2yUHvTLAS49U/RgAgStPmL59Ah5ULM4sM0EgzJ1ARHbdCWJWOAatKus7N19FiHF4rlyVPqCU3b/o+sU7SQq/UrAo9bPiZKuG4YIpQrGVaYOqtI3qV6r/ttR0ZklnOD57cJAInLy6JlBxxe9mmecINvVjtONN1upUgfifLswR/h++KBLikT4B7K2zaCiCa5ginUKROhJPWQGExDyMvN09W81UxHxIo7eP3m6ehMS3XYY37La6m1kNk=) 2026-01-06 00:21:00.529429 | orchestrator | changed: [testbed-manager] => 
(item=testbed-node-1 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHLoHD39XOIMmoIZFI2ke8Dio6ti/y8cqwc972MFWiq7+cqlEydaEMz65mNYD1AOrbqRbI4PwvACbe8056JGh+c=) 2026-01-06 00:21:01.449159 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIFDsOwtMbFy2kvs56aE85JjTvK/rNf0ORPpJbhW1XFC) 2026-01-06 00:21:01.449255 | orchestrator | 2026-01-06 00:21:01.449273 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:21:01.449287 | orchestrator | Tuesday 06 January 2026 00:20:49 +0000 (0:00:01.059) 0:00:12.747 ******* 2026-01-06 00:21:01.449302 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDBwP/WInNAbwRdlgiGLkkjG/b2IGSFz+usU2xWaI0Jz7gSefZtqInWcBXMoO0NerKBuFXiowrIAtc1a2OzN2ysuGdaEvg74n7a5MxHXlTFBuKQ/gIMU2Iq+r91ahYp5au+zF+CGiRMhGgJBqxKWj/nQOKx+j42fBfE4ThdhCl9tJ/lRznitj7xT9y/LfzuSrr0MtIBh1+s7YMRB7nUuQitWbMal79+C1Z5pOLgR2yY8+FIGDY7QEb5a4AGhusYVtv7ROIiujqOXBlX5allc7vIOspaloewHBmwB0Gs6pIT9p7E8HmVivFm+B8eb/SHB6VWcIIzUrABaPEBhSTajUAV7bbEwGzOsbfkkGu+ZWpKshcpuNLqB4g9g5arYP+jHbXcMyEaBUtBrfH5VfjuPIlxfHqbO6ZKP/oTSHAexN2cVJstEbJgTRBa8jXYpTGAlHlK6k05B81gmjPgla6YYH+hOyZkfevDcUY5l9f7LlovY+mO/FPKTLI0wAmf4/2r8zs=) 2026-01-06 00:21:01.449319 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJtWAzzs0+03k3pAssPYBQYqF93qn99c+x4Na064/6mTIdPY1UxZTC+UAe4D4gxuyuCoD1Ys7Px/PDRUGCmW5ZQ=) 2026-01-06 00:21:01.449331 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHmqbuUxrqYNrFJRDNRli8O4PHxbmdEtFHWxJyUumG8d) 2026-01-06 00:21:01.449343 | orchestrator | 2026-01-06 00:21:01.449355 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host] *** 2026-01-06 00:21:01.449368 | orchestrator | Tuesday 06 January 2026 00:20:50 +0000 
(0:00:01.065) 0:00:13.812 *******
2026-01-06 00:21:01.449380 | orchestrator | ok: [testbed-manager] => (item=testbed-manager)
2026-01-06 00:21:01.449392 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3)
2026-01-06 00:21:01.449403 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4)
2026-01-06 00:21:01.449440 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5)
2026-01-06 00:21:01.449452 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0)
2026-01-06 00:21:01.449493 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1)
2026-01-06 00:21:01.449505 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2)
2026-01-06 00:21:01.449516 | orchestrator |
2026-01-06 00:21:01.449527 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host] ***
2026-01-06 00:21:01.449540 | orchestrator | Tuesday 06 January 2026 00:20:56 +0000 (0:00:05.335) 0:00:19.148 *******
2026-01-06 00:21:01.449553 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager)
2026-01-06 00:21:01.449566 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3)
2026-01-06 00:21:01.449577 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4)
2026-01-06 00:21:01.449588 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5)
2026-01-06 00:21:01.449599 | orchestrator | included:
/usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2026-01-06 00:21:01.449610 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2026-01-06 00:21:01.449621 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2026-01-06 00:21:01.449632 | orchestrator | 2026-01-06 00:21:01.449643 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:21:01.449654 | orchestrator | Tuesday 06 January 2026 00:20:56 +0000 (0:00:00.184) 0:00:19.333 ******* 2026-01-06 00:21:01.449665 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEhEOo+/pxbPjRbdjmSdlIi8KyfE9bQ8pOoyOTIFXYxC) 2026-01-06 00:21:01.449718 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1IXGARsLW8ay6GOrIfTe+DCX0t5Ae8vszhjx5LJsCGWb7+YFfGWRvtAG09gJZNyeg0c9B4eFi0I/NBARiQ1EU3cAJqKcec/zxnGWYiEh2mdCvQf9hFGj36Mq0JJoRuPzA/JQboPjYUsynRmX5X37vKp+IdoudcAJac22Euhea0gD95hVdg/td//5wGo70xFR2taiX9iWSmGT2j7BEtrCp1FuAknoJa6lbETGnZ4CeFG1JbTUuigwXS/3ehUOcas5kigVCdbYcnETz9BATrTy2SNqjONEsKO6iksf4GDLhz6DOkvLctHLj5mDqGW8nRKIF3Uhl/fvTD1e3bfKeAJVT59bPPS+gL1RYgSKQCLrkiLSmByv9GiP9Vtjcf5+JUucNVySHLm6sH4xQWIpDg1l3A7c/zAd6MHUrOPlFcpf/+vd0EhJZYni2YVHGPRJn95iGN4As7IaPsXDWBNVf/bExGgygAQhvoWGR6/G0ccsfq3J42/StUIKssY2jxkqtke0=) 2026-01-06 00:21:01.449731 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGTU78xcf8kOjfDc9oqrQImFsmVq4cIvZ72dTKWFfZ2/ZVaDt3jhoboztP3TRUI4q0EOUGm3+MK6nLjkHiUn1vQ=) 2026-01-06 
00:21:01.449742 | orchestrator | 2026-01-06 00:21:01.449753 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:21:01.449763 | orchestrator | Tuesday 06 January 2026 00:20:57 +0000 (0:00:01.073) 0:00:20.407 ******* 2026-01-06 00:21:01.449856 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCv0LHPsWj4OW4XIkQJCvQvJiC492LB2+JM4yt3udN2ZHsfhguU/isXiBpg0nuCeclQc9gtQ0dVjIeNFEAgqoVq8OnPsCfuVM7YV1ahZVuhJmz/g4jsPET90nChFqBgnX//1aKSOzmdnloLVawfROIbv56MAa61Uxn9LDCpFFiBYs8/wmuw+sk5NeXvQPDDQB4NsOG14q/uJIFL2nDPWD17fsC98GCMr1/zI/2FO9rqHKJs7hNCeqSyv4g48cyX7aS52Bjqs4Uqw/qvZMrv2p0pIvDlhB8czdy9xyWxmYYQ8Ixm+NaxOAvNcpeuHwqWeH9ST7iGBr8rU6cUIBnS33e1ExgKcsIQV+bYeF9G7mQ5DmFHAJNzgMsu3rz5FD03e2Z/0NJELbOuppFSDUO3HNrQ4fnZT+DOI0dd9H9pGJct1RZvS0dMDAosi6ZzQvpXI2v0hKep7HVvYE5Bn6fQEelVH2Z5fcRy3keXq12jG1KF5VDTyIMVFQXXIQRxjYrGdhk=) 2026-01-06 00:21:01.449879 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMtws5OlKBQhMYg4Ry7UsNr3h9KZoFceoMtM5eZIVPDkwd7xg7byhpL1V/vouhXAMOwcSzLY9gWR1OXBUcbBEhE=) 2026-01-06 00:21:01.449890 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAS8LTa+Z2io9VaiGb6aA59SyFUBq4xJdwZPtIGEAZZ+) 2026-01-06 00:21:01.449901 | orchestrator | 2026-01-06 00:21:01.449912 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:21:01.449923 | orchestrator | Tuesday 06 January 2026 00:20:58 +0000 (0:00:01.049) 0:00:21.456 ******* 2026-01-06 00:21:01.449934 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBoyNlFRSO9DGx+MJQqs+qMZ1eK/q8FTLlFXL40+hwbbSiptJ+Zn5zZtWPTDYx1BGqBY3ab7NTLloyHF5SttCfI=) 2026-01-06 00:21:01.449965 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAcNjk9r9I4TPW+lrE1lKiNz6aLvGzw7zHJ+sXb5oCnp) 2026-01-06 00:21:01.449976 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4LDzUNCPtxvHr5oLPJJbxVJJclJ2yPHuiT9QQrTFLSR/GanTcUhsEx4wJXyWTpAhZmYKgksOmwB0L/5Ib/RDO6D8SNYuU4hviUtFvFHAXIZRT5jeTwl1Z86qntF7+MlUCZcg8xxj9stblY8BQoxhlKU5c6nLIo0t5H/cJjE2/Sda87j/tqMb7tF8dInVMfiskFlxKOaPkmaI8tOi8uXCr+HRflr1Tb/d85Hd1J+hBOSefEFGppX/+MYziY/JbkWU+BORWq7bdSil/8R4WW9y9yciSb1Qvr32ZxbGMHQ4QcQSLsnTBKfooa6cxLY569rd+fQd7/P91ePZo61I6BMeVja1Q40hYjup+x2pB3Ui7bFN4X9+kEZ8l3DOXiGm/LuDEH0K4Q/wL119IO4jTmbBxpLomQEYa4XCkg9oqGGX4QirIUwXhYTTgk0GUazewc9AwN432ko5Z3nDpNu9og1ZwF4NJZ8FHzjIkH5XRzKIMcWiLzFfJ9UCcO2nOdAjFfQE=) 2026-01-06 00:21:01.449988 | orchestrator | 2026-01-06 00:21:01.449998 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:21:01.450009 | orchestrator | Tuesday 06 January 2026 00:20:59 +0000 (0:00:01.061) 0:00:22.518 ******* 2026-01-06 00:21:01.450080 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1BVLsgC67+S1KmrBsfXmeurQwFMOygH2vN29aHgsOwIkNz16n+oAjUk414WLQWglJgCFidW+qG0/ZFC8ye6EsDunbtFOS+E5Dv/wqX2w5FpODb7u3N6nM9O5KGrqaBsoElzP5MfUurY99DQpz2g1C/escE845fBPddcSaCI4XJ3mfJG6wD0ANOE1ygtQPCBp0RYnJxJ/WDV5hYQYQ9ouNZ8l4ZTlyLLcQUrPio9bmuheNsMzqDQILfxbkQUcMktllBfbw+RETbEvj7/H7NjhubgzWfbSYFvziNgKVon49adjiTiB2WCb2iOZmZHJAaVovt1udsEXnlTbItuJQmhfaPeO41qJ6pvm2jcTdAVoV1SViDr0yO8qeHVil7KYoKND7Hrw+oVfhiiaXpr3RTVHKBanldeXlnQyJk2YCYDJQLGFJPxbtuBZe9Ko/acnxi80Y+/eZIIPLScXdoCV7YIVVBAD8ySrEQRSXpbseHS6mF2XySnD2gxcz76tBWldlgNs=) 2026-01-06 00:21:01.450093 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNl73D8VK+R0aeNKRQrtp0hQSN7N8lWzfq4WUBrrmXSjSHRmlKUdgzc3yv4veJQQAVhmZU3thbOp87D2p1o2uuc=) 2026-01-06 00:21:01.450120 | orchestrator | changed: [testbed-manager] => 
(item=192.168.16.15 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAVAj7h2j38+GebqNyOzUBnKsPp1QBDtFA7PHVd59Ac5) 2026-01-06 00:21:04.893071 | orchestrator | 2026-01-06 00:21:04.893204 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:21:04.893224 | orchestrator | Tuesday 06 January 2026 00:21:00 +0000 (0:00:01.055) 0:00:23.574 ******* 2026-01-06 00:21:04.893239 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCkM0CUZtI76b4kKdy0H37YXeeMFXQTxYi8pKe3FSiZAC5RQyAZtkCdqfsWjgRjacsYYcSnF0QhIKAOzVSE+nGM=) 2026-01-06 00:21:04.893257 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDinF55bwfFfq574sooFF15w8z0esL8rn+MU73f0og/svRP2a5W/Rl/Z5jqLD+OxDfyFWc8IvPMTWjVUPCPzLePKb/PLJhRn7vee8bgNKuOwllLf4gcdu0K4fXM0DSKovOGQSE0GM6ZfnAt3Ih2ahdSHjyMFl8O/IuikF93Y2IMWswNUXjkJHnyue7bxCYgNsk7DJCJNKZwR+j7rbfjpbmmTn1UOF/ARWB/5K8yHlpSCjlmYswrGA5fxTjlsT65FgNTTx5kGgc48o5KzXjYO2+52R9W3peWiQHqjOfnTuPMvNzNE9DT6c7jbUCV9H6ZyRE+NLQFoKjq2mKYI6Rr7QF+AxJKeqcr1ozrenZrLY0H4NVm8Zk7xBDY8WlFgBQgWOMcI5/RAH9Mcle4jYgYA/1H3VqbviG9sS7G4/JuyzP6T8a8eGFsmM+3QC/JZboQMYnhsDrVkGTm5AlYBkCRRE1hwSZIXSvBXG6JDU/uvyyJD2AA6SLTFH/KbZ8Ii2Mfavc=) 2026-01-06 00:21:04.893302 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBUnIU4xhYNkmGXNix5oY1gjDC6by3sZT9GOdc9b5Hlt) 2026-01-06 00:21:04.893316 | orchestrator | 2026-01-06 00:21:04.893328 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:21:04.893339 | orchestrator | Tuesday 06 January 2026 00:21:01 +0000 (0:00:01.058) 0:00:24.633 ******* 2026-01-06 00:21:04.893367 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCnwHEXUXaejjglpspT2mvxThaWvLEt/ccG5C1XGCdce2UHWiuzIAOK2Jt0FGATYeWUBxBB1Wsn0eEqaMgSUJ9zUb+q1q/rr2Yp+kfPAlAfdMrFg1+20j1r8TvoMz+272OxlLxdWIId3boCKWv1sfCcbgf8y1A4AugLkWqzA7xeOKSK4K4IHJmrP5W+AMteMWyMwd89jOZ+InsO9ieNF9hX1p9r3Sy6wXgJGlY49aunxx2RGriX4FPpUx31owQDBhDZ6ORiNDePsz2yUHvTLAS49U/RgAgStPmL59Ah5ULM4sM0EgzJ1ARHbdCWJWOAatKus7N19FiHF4rlyVPqCU3b/o+sU7SQq/UrAo9bPiZKuG4YIpQrGVaYOqtI3qV6r/ttR0ZklnOD57cJAInLy6JlBxxe9mmecINvVjtONN1upUgfifLswR/h++KBLikT4B7K2zaCiCa5ginUKROhJPWQGExDyMvN09W81UxHxIo7eP3m6ehMS3XYY37La6m1kNk=) 2026-01-06 00:21:04.893379 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHLoHD39XOIMmoIZFI2ke8Dio6ti/y8cqwc972MFWiq7+cqlEydaEMz65mNYD1AOrbqRbI4PwvACbe8056JGh+c=) 2026-01-06 00:21:04.893391 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIFDsOwtMbFy2kvs56aE85JjTvK/rNf0ORPpJbhW1XFC) 2026-01-06 00:21:04.893402 | orchestrator | 2026-01-06 00:21:04.893413 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-01-06 00:21:04.893424 | orchestrator | Tuesday 06 January 2026 00:21:02 +0000 (0:00:01.048) 0:00:25.681 ******* 2026-01-06 00:21:04.893436 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDBwP/WInNAbwRdlgiGLkkjG/b2IGSFz+usU2xWaI0Jz7gSefZtqInWcBXMoO0NerKBuFXiowrIAtc1a2OzN2ysuGdaEvg74n7a5MxHXlTFBuKQ/gIMU2Iq+r91ahYp5au+zF+CGiRMhGgJBqxKWj/nQOKx+j42fBfE4ThdhCl9tJ/lRznitj7xT9y/LfzuSrr0MtIBh1+s7YMRB7nUuQitWbMal79+C1Z5pOLgR2yY8+FIGDY7QEb5a4AGhusYVtv7ROIiujqOXBlX5allc7vIOspaloewHBmwB0Gs6pIT9p7E8HmVivFm+B8eb/SHB6VWcIIzUrABaPEBhSTajUAV7bbEwGzOsbfkkGu+ZWpKshcpuNLqB4g9g5arYP+jHbXcMyEaBUtBrfH5VfjuPIlxfHqbO6ZKP/oTSHAexN2cVJstEbJgTRBa8jXYpTGAlHlK6k05B81gmjPgla6YYH+hOyZkfevDcUY5l9f7LlovY+mO/FPKTLI0wAmf4/2r8zs=) 2026-01-06 00:21:04.893447 | orchestrator | changed: [testbed-manager] => 
(item=192.168.16.12 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJtWAzzs0+03k3pAssPYBQYqF93qn99c+x4Na064/6mTIdPY1UxZTC+UAe4D4gxuyuCoD1Ys7Px/PDRUGCmW5ZQ=)
2026-01-06 00:21:04.893459 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHmqbuUxrqYNrFJRDNRli8O4PHxbmdEtFHWxJyUumG8d)
2026-01-06 00:21:04.893470 | orchestrator |
2026-01-06 00:21:04.893481 | orchestrator | TASK [osism.commons.known_hosts : Write static known_hosts entries] ************
2026-01-06 00:21:04.893492 | orchestrator | Tuesday 06 January 2026 00:21:03 +0000 (0:00:01.074) 0:00:26.755 *******
2026-01-06 00:21:04.893503 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)
2026-01-06 00:21:04.893515 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)
2026-01-06 00:21:04.893526 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)
2026-01-06 00:21:04.893537 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)
2026-01-06 00:21:04.893550 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)
2026-01-06 00:21:04.893564 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)
2026-01-06 00:21:04.893577 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)
2026-01-06 00:21:04.893590 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:21:04.893603 | orchestrator |
2026-01-06 00:21:04.893635 | orchestrator | TASK [osism.commons.known_hosts : Write extra known_hosts entries] *************
2026-01-06 00:21:04.893657 | orchestrator | Tuesday 06 January 2026 00:21:03 +0000 (0:00:00.171) 0:00:26.927 *******
2026-01-06 00:21:04.893670 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:21:04.893682 | orchestrator |
2026-01-06 00:21:04.893696 | orchestrator | TASK [osism.commons.known_hosts : Delete known_hosts entries] ******************
2026-01-06 00:21:04.893708 | orchestrator | Tuesday 06 January 2026
00:21:03 +0000 (0:00:00.060) 0:00:26.988 *******
2026-01-06 00:21:04.893721 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:21:04.893734 | orchestrator |
2026-01-06 00:21:04.893747 | orchestrator | TASK [osism.commons.known_hosts : Set file permissions] ************************
2026-01-06 00:21:04.893758 | orchestrator | Tuesday 06 January 2026 00:21:03 +0000 (0:00:00.064) 0:00:27.053 *******
2026-01-06 00:21:04.893769 | orchestrator | changed: [testbed-manager]
2026-01-06 00:21:04.893780 | orchestrator |
2026-01-06 00:21:04.893791 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:21:04.893802 | orchestrator | testbed-manager : ok=31  changed=15  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0
2026-01-06 00:21:04.893815 | orchestrator |
2026-01-06 00:21:04.893826 | orchestrator |
2026-01-06 00:21:04.893837 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:21:04.893848 | orchestrator | Tuesday 06 January 2026 00:21:04 +0000 (0:00:00.702) 0:00:27.756 *******
2026-01-06 00:21:04.893859 | orchestrator | ===============================================================================
2026-01-06 00:21:04.893870 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname --- 6.03s
2026-01-06 00:21:04.893881 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host --- 5.34s
2026-01-06 00:21:04.893892 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.17s
2026-01-06 00:21:04.893903 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.07s
2026-01-06 00:21:04.893914 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.07s
2026-01-06 00:21:04.893925 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.07s
2026-01-06 00:21:04.893936 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.06s
2026-01-06 00:21:04.893970 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.06s
2026-01-06 00:21:04.893982 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.06s
2026-01-06 00:21:04.893993 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.06s
2026-01-06 00:21:04.894004 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.06s
2026-01-06 00:21:04.894081 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.05s
2026-01-06 00:21:04.894110 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.05s
2026-01-06 00:21:04.894122 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.05s
2026-01-06 00:21:04.894135 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.05s
2026-01-06 00:21:04.894155 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.01s
2026-01-06 00:21:04.894174 | orchestrator | osism.commons.known_hosts : Set file permissions ------------------------ 0.70s
2026-01-06 00:21:04.894194 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host --- 0.18s
2026-01-06 00:21:04.894214 | orchestrator | osism.commons.known_hosts : Write static known_hosts entries ------------ 0.17s
2026-01-06 00:21:04.894233 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname --- 0.16s
2026-01-06 00:21:05.204405 | orchestrator | + osism apply squid
2026-01-06 00:21:17.257705 | orchestrator | 2026-01-06 00:21:17 | INFO  | Task 57e261c0-b604-4207-b898-9f6172a3c607 (squid) was prepared for execution.
2026-01-06 00:21:17.257802 | orchestrator | 2026-01-06 00:21:17 | INFO  | It takes a moment until task 57e261c0-b604-4207-b898-9f6172a3c607 (squid) has been started and output is visible here.
2026-01-06 00:23:11.712883 | orchestrator |
2026-01-06 00:23:11.713073 | orchestrator | PLAY [Apply role squid] ********************************************************
2026-01-06 00:23:11.713093 | orchestrator |
2026-01-06 00:23:11.713105 | orchestrator | TASK [osism.services.squid : Include install tasks] ****************************
2026-01-06 00:23:11.713117 | orchestrator | Tuesday 06 January 2026 00:21:21 +0000 (0:00:00.177) 0:00:00.177 *******
2026-01-06 00:23:11.713129 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/squid/tasks/install-Debian-family.yml for testbed-manager
2026-01-06 00:23:11.713141 | orchestrator |
2026-01-06 00:23:11.713153 | orchestrator | TASK [osism.services.squid : Install required packages] ************************
2026-01-06 00:23:11.713164 | orchestrator | Tuesday 06 January 2026 00:21:21 +0000 (0:00:00.087) 0:00:00.264 *******
2026-01-06 00:23:11.713175 | orchestrator | ok: [testbed-manager]
2026-01-06 00:23:11.713187 | orchestrator |
2026-01-06 00:23:11.713198 | orchestrator | TASK [osism.services.squid : Create required directories] **********************
2026-01-06 00:23:11.713209 | orchestrator | Tuesday 06 January 2026 00:21:22 +0000 (0:00:01.454) 0:00:01.719 *******
2026-01-06 00:23:11.713220 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration)
2026-01-06 00:23:11.713231 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration/conf.d)
2026-01-06 00:23:11.713242 | orchestrator | ok: [testbed-manager] => (item=/opt/squid)
2026-01-06 00:23:11.713253 | orchestrator |
2026-01-06 00:23:11.713264 | orchestrator | TASK [osism.services.squid : Copy squid configuration files] *******************
2026-01-06 00:23:11.713275 | orchestrator | Tuesday 06 January 2026 00:21:24 +0000 (0:00:01.163) 0:00:02.882 *******
2026-01-06 00:23:11.713286 | orchestrator | changed: [testbed-manager] => (item=osism.conf)
2026-01-06 00:23:11.713297 | orchestrator |
2026-01-06 00:23:11.713307 | orchestrator | TASK [osism.services.squid : Remove osism_allow_list.conf configuration file] ***
2026-01-06 00:23:11.713318 | orchestrator | Tuesday 06 January 2026 00:21:25 +0000 (0:00:01.058) 0:00:03.940 *******
2026-01-06 00:23:11.713329 | orchestrator | ok: [testbed-manager]
2026-01-06 00:23:11.713340 | orchestrator |
2026-01-06 00:23:11.713351 | orchestrator | TASK [osism.services.squid : Copy docker-compose.yml file] *********************
2026-01-06 00:23:11.713361 | orchestrator | Tuesday 06 January 2026 00:21:25 +0000 (0:00:00.343) 0:00:04.284 *******
2026-01-06 00:23:11.713372 | orchestrator | changed: [testbed-manager]
2026-01-06 00:23:11.713383 | orchestrator |
2026-01-06 00:23:11.713394 | orchestrator | TASK [osism.services.squid : Manage squid service] *****************************
2026-01-06 00:23:11.713407 | orchestrator | Tuesday 06 January 2026 00:21:26 +0000 (0:00:00.935) 0:00:05.219 *******
2026-01-06 00:23:11.713420 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage squid service (10 retries left).
2026-01-06 00:23:11.713433 | orchestrator | ok: [testbed-manager]
2026-01-06 00:23:11.713445 | orchestrator |
2026-01-06 00:23:11.713458 | orchestrator | RUNNING HANDLER [osism.services.squid : Restart squid service] *****************
2026-01-06 00:23:11.713472 | orchestrator | Tuesday 06 January 2026 00:21:58 +0000 (0:00:32.102) 0:00:37.322 *******
2026-01-06 00:23:11.713485 | orchestrator | changed: [testbed-manager]
2026-01-06 00:23:11.713497 | orchestrator |
2026-01-06 00:23:11.713510 | orchestrator | RUNNING HANDLER [osism.services.squid : Wait for squid service to start] *******
2026-01-06 00:23:11.713523 | orchestrator | Tuesday 06 January 2026 00:22:10 +0000 (0:00:12.020) 0:00:49.342 *******
2026-01-06 00:23:11.713536 | orchestrator | Pausing for 60 seconds
2026-01-06 00:23:11.713549 | orchestrator | changed: [testbed-manager]
2026-01-06 00:23:11.713562 | orchestrator |
2026-01-06 00:23:11.713575 | orchestrator | RUNNING HANDLER [osism.services.squid : Register that squid service was restarted] ***
2026-01-06 00:23:11.713588 | orchestrator | Tuesday 06 January 2026 00:23:10 +0000 (0:01:00.104) 0:01:49.446 *******
2026-01-06 00:23:11.713601 | orchestrator | ok: [testbed-manager]
2026-01-06 00:23:11.713614 | orchestrator |
2026-01-06 00:23:11.713628 | orchestrator | RUNNING HANDLER [osism.services.squid : Wait for an healthy squid service] *****
2026-01-06 00:23:11.713675 | orchestrator | Tuesday 06 January 2026 00:23:10 +0000 (0:00:00.073) 0:01:49.519 *******
2026-01-06 00:23:11.713688 | orchestrator | changed: [testbed-manager]
2026-01-06 00:23:11.713700 | orchestrator |
2026-01-06 00:23:11.713714 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:23:11.713727 | orchestrator | testbed-manager : ok=11  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:23:11.713740 | orchestrator |
2026-01-06 00:23:11.713753 | orchestrator |
2026-01-06 00:23:11.713764 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:23:11.713775 | orchestrator | Tuesday 06 January 2026 00:23:11 +0000 (0:00:00.622) 0:01:50.142 *******
2026-01-06 00:23:11.713786 | orchestrator | ===============================================================================
2026-01-06 00:23:11.713796 | orchestrator | osism.services.squid : Wait for squid service to start ----------------- 60.10s
2026-01-06 00:23:11.713807 | orchestrator | osism.services.squid : Manage squid service ---------------------------- 32.10s
2026-01-06 00:23:11.713818 | orchestrator | osism.services.squid : Restart squid service --------------------------- 12.02s
2026-01-06 00:23:11.713828 | orchestrator | osism.services.squid : Install required packages ------------------------ 1.45s
2026-01-06 00:23:11.713839 | orchestrator | osism.services.squid : Create required directories ---------------------- 1.16s
2026-01-06 00:23:11.713849 | orchestrator | osism.services.squid : Copy squid configuration files ------------------- 1.06s
2026-01-06 00:23:11.713860 | orchestrator | osism.services.squid : Copy docker-compose.yml file --------------------- 0.94s
2026-01-06 00:23:11.713871 | orchestrator | osism.services.squid : Wait for an healthy squid service ---------------- 0.62s
2026-01-06 00:23:11.713881 | orchestrator | osism.services.squid : Remove osism_allow_list.conf configuration file --- 0.34s
2026-01-06 00:23:11.713892 | orchestrator | osism.services.squid : Include install tasks ---------------------------- 0.09s
2026-01-06 00:23:11.713903 | orchestrator | osism.services.squid : Register that squid service was restarted -------- 0.07s
2026-01-06 00:23:12.119702 | orchestrator | + [[ latest != \l\a\t\e\s\t ]]
2026-01-06 00:23:12.119787 | orchestrator | + /opt/configuration/scripts/set-kolla-namespace.sh kolla
2026-01-06 00:23:12.127391 | orchestrator | + set -e
2026-01-06 00:23:12.127440 | orchestrator | + NAMESPACE=kolla
2026-01-06 00:23:12.127446 | orchestrator | + sed -i 's#docker_namespace: .*#docker_namespace: kolla#g' /opt/configuration/inventory/group_vars/all/kolla.yml
2026-01-06 00:23:12.132661 | orchestrator | ++ semver latest 9.0.0
2026-01-06 00:23:12.199221 | orchestrator | + [[ -1 -lt 0 ]]
2026-01-06 00:23:12.199324 | orchestrator | + [[ latest != \l\a\t\e\s\t ]]
2026-01-06 00:23:12.199734 | orchestrator | + osism apply operator -u ubuntu -l testbed-nodes
2026-01-06 00:23:24.296872 | orchestrator | 2026-01-06 00:23:24 | INFO  | Task ba78b76a-153e-434c-a63c-37e929f0d2ef (operator) was prepared for execution.
2026-01-06 00:23:24.296985 | orchestrator | 2026-01-06 00:23:24 | INFO  | It takes a moment until task ba78b76a-153e-434c-a63c-37e929f0d2ef (operator) has been started and output is visible here.
2026-01-06 00:23:40.897166 | orchestrator |
2026-01-06 00:23:40.897302 | orchestrator | PLAY [Make ssh pipelining working] *********************************************
2026-01-06 00:23:40.897327 | orchestrator |
2026-01-06 00:23:40.897339 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-01-06 00:23:40.897351 | orchestrator | Tuesday 06 January 2026 00:23:28 +0000 (0:00:00.142) 0:00:00.142 *******
2026-01-06 00:23:40.897364 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:23:40.897376 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:23:40.897387 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:23:40.897398 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:23:40.897409 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:23:40.897425 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:23:40.897436 | orchestrator |
2026-01-06 00:23:40.897447 | orchestrator | TASK [Do not require tty for all users] ****************************************
2026-01-06 00:23:40.897458 | orchestrator | Tuesday 06 January 2026 00:23:31 +0000 (0:00:03.359) 0:00:03.502 *******
2026-01-06 00:23:40.897500 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:23:40.897512 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:23:40.897523 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:23:40.897534 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:23:40.897544 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:23:40.897555 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:23:40.897571 | orchestrator |
2026-01-06 00:23:40.897587 | orchestrator | PLAY [Apply role operator] *****************************************************
2026-01-06 00:23:40.897598 | orchestrator |
2026-01-06 00:23:40.897608 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] *****
2026-01-06 00:23:40.897619 | orchestrator | Tuesday 06 January 2026 00:23:32 +0000 (0:00:00.827) 0:00:04.329 *******
2026-01-06 00:23:40.897630 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:23:40.897643 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:23:40.897655 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:23:40.897668 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:23:40.897680 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:23:40.897693 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:23:40.897705 | orchestrator |
2026-01-06 00:23:40.897718 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] ***
2026-01-06 00:23:40.897731 | orchestrator | Tuesday 06 January 2026 00:23:32 +0000 (0:00:00.186) 0:00:04.493 *******
2026-01-06 00:23:40.897744 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:23:40.897758 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:23:40.897770 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:23:40.897784 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:23:40.897814 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:23:40.897826 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:23:40.897839 | orchestrator |
2026-01-06 00:23:40.897852 | orchestrator | TASK [osism.commons.operator : Create operator group] **************************
2026-01-06 00:23:40.897865 | orchestrator | Tuesday 06 January 2026 00:23:33 +0000 (0:00:00.186) 0:00:04.679 *******
2026-01-06 00:23:40.897878 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:23:40.897892 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:23:40.897909 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:23:40.897922 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:23:40.897935 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:23:40.897949 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:23:40.897960 | orchestrator |
2026-01-06 00:23:40.897971 | orchestrator | TASK [osism.commons.operator : Create user] ************************************
2026-01-06 00:23:40.897981 | orchestrator | Tuesday 06 January 2026 00:23:33 +0000 (0:00:00.687) 0:00:05.367 *******
2026-01-06 00:23:40.897992 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:23:40.898003 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:23:40.898013 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:23:40.898111 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:23:40.898122 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:23:40.898134 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:23:40.898144 | orchestrator |
2026-01-06 00:23:40.898155 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ******************
2026-01-06 00:23:40.898166 | orchestrator | Tuesday 06 January 2026 00:23:34 +0000 (0:00:01.225) 0:00:06.204 *******
2026-01-06 00:23:40.898177 | orchestrator | changed: [testbed-node-0] => (item=adm)
2026-01-06 00:23:40.898189 | orchestrator | changed: [testbed-node-3] => (item=adm)
2026-01-06 00:23:40.898200 | orchestrator | changed: [testbed-node-2] => (item=adm)
2026-01-06 00:23:40.898211 | orchestrator | changed: [testbed-node-4] => (item=adm)
2026-01-06 00:23:40.898221 | orchestrator | changed: [testbed-node-1] => (item=adm)
2026-01-06 00:23:40.898232 | orchestrator | changed: [testbed-node-5] => (item=adm)
2026-01-06 00:23:40.898243 | orchestrator | changed: [testbed-node-3] => (item=sudo)
2026-01-06 00:23:40.898254 | orchestrator | changed: [testbed-node-0] => (item=sudo)
2026-01-06 00:23:40.898264 | orchestrator | changed: [testbed-node-2] => (item=sudo)
2026-01-06 00:23:40.898275 | orchestrator | changed: [testbed-node-4] => (item=sudo)
2026-01-06 00:23:40.898295 | orchestrator | changed: [testbed-node-1] => (item=sudo)
2026-01-06 00:23:40.898306 | orchestrator | changed: [testbed-node-5] => (item=sudo)
2026-01-06 00:23:40.898317 | orchestrator |
2026-01-06 00:23:40.898327 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] *************************
2026-01-06 00:23:40.898338 | orchestrator | Tuesday 06 January 2026 00:23:35 +0000 (0:00:01.225) 0:00:07.429 *******
2026-01-06 00:23:40.898349 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:23:40.898360 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:23:40.898370 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:23:40.898381 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:23:40.898392 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:23:40.898402 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:23:40.898413 | orchestrator |
2026-01-06 00:23:40.898424 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc configuration file] ***
2026-01-06 00:23:40.898437 | orchestrator | Tuesday 06 January 2026 00:23:37 +0000 (0:00:01.348) 0:00:08.777 *******
2026-01-06 00:23:40.898447 | orchestrator | [WARNING]: Module remote_tmp /root/.ansible/tmp did not exist and was created
2026-01-06 00:23:40.898458 | orchestrator | with a mode of 0700, this may cause issues when running as another user. To
2026-01-06 00:23:40.898469 | orchestrator | avoid this, create the remote_tmp dir with the correct permissions manually
2026-01-06 00:23:40.898480 | orchestrator | changed: [testbed-node-0] => (item=export LANGUAGE=C.UTF-8)
2026-01-06 00:23:40.898512 | orchestrator | changed: [testbed-node-1] => (item=export LANGUAGE=C.UTF-8)
2026-01-06 00:23:40.898523 | orchestrator | changed: [testbed-node-3] => (item=export LANGUAGE=C.UTF-8)
2026-01-06 00:23:40.898534 | orchestrator | changed: [testbed-node-2] => (item=export LANGUAGE=C.UTF-8)
2026-01-06 00:23:40.898545 | orchestrator | changed: [testbed-node-4] => (item=export LANGUAGE=C.UTF-8)
2026-01-06 00:23:40.898556 | orchestrator | changed: [testbed-node-5] => (item=export LANGUAGE=C.UTF-8)
2026-01-06 00:23:40.898566 | orchestrator | changed: [testbed-node-1] => (item=export LANG=C.UTF-8)
2026-01-06 00:23:40.898577 | orchestrator | changed: [testbed-node-0] => (item=export LANG=C.UTF-8)
2026-01-06 00:23:40.898588 | orchestrator | changed: [testbed-node-2] => (item=export LANG=C.UTF-8)
2026-01-06 00:23:40.898598 | orchestrator | changed: [testbed-node-3] => (item=export LANG=C.UTF-8)
2026-01-06 00:23:40.898609 | orchestrator | changed: [testbed-node-5] => (item=export LANG=C.UTF-8)
2026-01-06 00:23:40.898620 | orchestrator | changed: [testbed-node-4] => (item=export LANG=C.UTF-8)
2026-01-06 00:23:40.898630 | orchestrator | changed: [testbed-node-1] => (item=export LC_ALL=C.UTF-8)
2026-01-06 00:23:40.898641 | orchestrator | changed: [testbed-node-0] => (item=export LC_ALL=C.UTF-8)
2026-01-06 00:23:40.898652 | orchestrator | changed: [testbed-node-2] => (item=export LC_ALL=C.UTF-8)
2026-01-06 00:23:40.898662 | orchestrator | changed: [testbed-node-4] => (item=export LC_ALL=C.UTF-8)
2026-01-06 00:23:40.898673 | orchestrator | changed: [testbed-node-5] => (item=export LC_ALL=C.UTF-8)
2026-01-06 00:23:40.898684 | orchestrator | changed: [testbed-node-3] => (item=export LC_ALL=C.UTF-8)
2026-01-06 00:23:40.898694 | orchestrator |
2026-01-06 00:23:40.898705 | orchestrator | TASK [osism.commons.operator : Set custom environment variables in .bashrc configuration file] ***
2026-01-06 00:23:40.898717 | orchestrator | Tuesday 06 January 2026 00:23:38 +0000 (0:00:01.239) 0:00:10.017 *******
2026-01-06 00:23:40.898728 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:23:40.898739 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:23:40.898750 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:23:40.898760 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:23:40.898771 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:23:40.898782 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:23:40.898792 | orchestrator |
2026-01-06 00:23:40.898803 | orchestrator | TASK [osism.commons.operator : Set custom PS1 prompt in .bashrc configuration file] ***
2026-01-06 00:23:40.898814 | orchestrator | Tuesday 06 January 2026 00:23:38 +0000 (0:00:00.174) 0:00:10.191 *******
2026-01-06 00:23:40.898832 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:23:40.898843 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:23:40.898854 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:23:40.898865 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:23:40.898875 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:23:40.898886 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:23:40.898897 | orchestrator |
2026-01-06 00:23:40.898908 | orchestrator | TASK [osism.commons.operator : Create .ssh directory] **************************
2026-01-06 00:23:40.898918 | orchestrator | Tuesday 06 January 2026 00:23:38 +0000 (0:00:00.202) 0:00:10.394 *******
2026-01-06 00:23:40.898929 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:23:40.898940 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:23:40.898950 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:23:40.898961 | orchestrator | changed: [testbed-node-4]
2026-01-06
00:23:40.898972 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:23:40.898982 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:23:40.898993 | orchestrator | 2026-01-06 00:23:40.899004 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************ 2026-01-06 00:23:40.899015 | orchestrator | Tuesday 06 January 2026 00:23:39 +0000 (0:00:00.622) 0:00:11.017 ******* 2026-01-06 00:23:40.899025 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:23:40.899065 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:23:40.899075 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:23:40.899086 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:23:40.899097 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:23:40.899108 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:23:40.899119 | orchestrator | 2026-01-06 00:23:40.899129 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************ 2026-01-06 00:23:40.899140 | orchestrator | Tuesday 06 January 2026 00:23:39 +0000 (0:00:00.184) 0:00:11.201 ******* 2026-01-06 00:23:40.899151 | orchestrator | changed: [testbed-node-1] => (item=None) 2026-01-06 00:23:40.899162 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:23:40.899173 | orchestrator | changed: [testbed-node-4] => (item=None) 2026-01-06 00:23:40.899184 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-01-06 00:23:40.899195 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:23:40.899205 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:23:40.899216 | orchestrator | changed: [testbed-node-2] => (item=None) 2026-01-06 00:23:40.899226 | orchestrator | changed: [testbed-node-3] => (item=None) 2026-01-06 00:23:40.899237 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:23:40.899248 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:23:40.899258 | orchestrator | changed: [testbed-node-5] => (item=None) 2026-01-06 
00:23:40.899269 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:23:40.899280 | orchestrator | 2026-01-06 00:23:40.899291 | orchestrator | TASK [osism.commons.operator : Delete ssh authorized keys] ********************* 2026-01-06 00:23:40.899301 | orchestrator | Tuesday 06 January 2026 00:23:40 +0000 (0:00:00.838) 0:00:12.040 ******* 2026-01-06 00:23:40.899312 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:23:40.899323 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:23:40.899334 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:23:40.899344 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:23:40.899355 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:23:40.899366 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:23:40.899376 | orchestrator | 2026-01-06 00:23:40.899387 | orchestrator | TASK [osism.commons.operator : Set authorized GitHub accounts] ***************** 2026-01-06 00:23:40.899398 | orchestrator | Tuesday 06 January 2026 00:23:40 +0000 (0:00:00.245) 0:00:12.285 ******* 2026-01-06 00:23:40.899409 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:23:40.899420 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:23:40.899430 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:23:40.899441 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:23:40.899460 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:23:42.293511 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:23:42.293663 | orchestrator | 2026-01-06 00:23:42.293680 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] ************** 2026-01-06 00:23:42.293694 | orchestrator | Tuesday 06 January 2026 00:23:40 +0000 (0:00:00.202) 0:00:12.488 ******* 2026-01-06 00:23:42.293706 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:23:42.293717 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:23:42.293728 | orchestrator | skipping: [testbed-node-2] 2026-01-06 
00:23:42.293738 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:23:42.293749 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:23:42.293760 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:23:42.293771 | orchestrator | 2026-01-06 00:23:42.293782 | orchestrator | TASK [osism.commons.operator : Set password] *********************************** 2026-01-06 00:23:42.293793 | orchestrator | Tuesday 06 January 2026 00:23:41 +0000 (0:00:00.181) 0:00:12.669 ******* 2026-01-06 00:23:42.293804 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:23:42.293815 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:23:42.293825 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:23:42.293836 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:23:42.293847 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:23:42.293857 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:23:42.293868 | orchestrator | 2026-01-06 00:23:42.293879 | orchestrator | TASK [osism.commons.operator : Unset & lock password] ************************** 2026-01-06 00:23:42.293890 | orchestrator | Tuesday 06 January 2026 00:23:41 +0000 (0:00:00.694) 0:00:13.363 ******* 2026-01-06 00:23:42.293901 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:23:42.293911 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:23:42.293922 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:23:42.293933 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:23:42.293943 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:23:42.293954 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:23:42.293965 | orchestrator | 2026-01-06 00:23:42.293976 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:23:42.293989 | orchestrator | testbed-node-0 : ok=12  changed=8  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-01-06 00:23:42.294098 | orchestrator | testbed-node-1 : ok=12  changed=8 
 unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-01-06 00:23:42.294117 | orchestrator | testbed-node-2 : ok=12  changed=8  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-01-06 00:23:42.294136 | orchestrator | testbed-node-3 : ok=12  changed=8  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-01-06 00:23:42.294150 | orchestrator | testbed-node-4 : ok=12  changed=8  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-01-06 00:23:42.294163 | orchestrator | testbed-node-5 : ok=12  changed=8  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-01-06 00:23:42.294175 | orchestrator | 2026-01-06 00:23:42.294187 | orchestrator | 2026-01-06 00:23:42.294200 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:23:42.294213 | orchestrator | Tuesday 06 January 2026 00:23:41 +0000 (0:00:00.236) 0:00:13.599 ******* 2026-01-06 00:23:42.294226 | orchestrator | =============================================================================== 2026-01-06 00:23:42.294239 | orchestrator | Gathering Facts --------------------------------------------------------- 3.36s 2026-01-06 00:23:42.294251 | orchestrator | osism.commons.operator : Copy user sudoers file ------------------------- 1.35s 2026-01-06 00:23:42.294265 | orchestrator | osism.commons.operator : Set language variables in .bashrc configuration file --- 1.24s 2026-01-06 00:23:42.294280 | orchestrator | osism.commons.operator : Add user to additional groups ------------------ 1.23s 2026-01-06 00:23:42.294301 | orchestrator | osism.commons.operator : Set ssh authorized keys ------------------------ 0.84s 2026-01-06 00:23:42.294314 | orchestrator | osism.commons.operator : Create user ------------------------------------ 0.84s 2026-01-06 00:23:42.294326 | orchestrator | Do not require tty for all users ---------------------------------------- 0.83s 2026-01-06 00:23:42.294338 | orchestrator | osism.commons.operator : Set password 
----------------------------------- 0.69s 2026-01-06 00:23:42.294351 | orchestrator | osism.commons.operator : Create operator group -------------------------- 0.69s 2026-01-06 00:23:42.294364 | orchestrator | osism.commons.operator : Create .ssh directory -------------------------- 0.62s 2026-01-06 00:23:42.294377 | orchestrator | osism.commons.operator : Delete ssh authorized keys --------------------- 0.25s 2026-01-06 00:23:42.294393 | orchestrator | osism.commons.operator : Unset & lock password -------------------------- 0.24s 2026-01-06 00:23:42.294412 | orchestrator | osism.commons.operator : Set custom PS1 prompt in .bashrc configuration file --- 0.20s 2026-01-06 00:23:42.294437 | orchestrator | osism.commons.operator : Set authorized GitHub accounts ----------------- 0.20s 2026-01-06 00:23:42.294461 | orchestrator | osism.commons.operator : Set operator_groups variable to default value --- 0.19s 2026-01-06 00:23:42.294478 | orchestrator | osism.commons.operator : Check number of SSH authorized keys ------------ 0.18s 2026-01-06 00:23:42.294494 | orchestrator | osism.commons.operator : Delete authorized GitHub accounts -------------- 0.18s 2026-01-06 00:23:42.294512 | orchestrator | osism.commons.operator : Set custom environment variables in .bashrc configuration file --- 0.17s 2026-01-06 00:23:42.294529 | orchestrator | osism.commons.operator : Gather variables for each operating system ----- 0.16s 2026-01-06 00:23:42.695605 | orchestrator | + osism apply --environment custom facts 2026-01-06 00:23:44.746591 | orchestrator | 2026-01-06 00:23:44 | INFO  | Trying to run play facts in environment custom 2026-01-06 00:23:54.930405 | orchestrator | 2026-01-06 00:23:54 | INFO  | Task 0127a425-75f7-4aef-8574-59d991536684 (facts) was prepared for execution. 2026-01-06 00:23:54.930519 | orchestrator | 2026-01-06 00:23:54 | INFO  | It takes a moment until task 0127a425-75f7-4aef-8574-59d991536684 (facts) has been started and output is visible here. 
2026-01-06 00:24:39.011303 | orchestrator |
2026-01-06 00:24:39.011434 | orchestrator | PLAY [Copy custom network devices fact] ****************************************
2026-01-06 00:24:39.011447 | orchestrator |
2026-01-06 00:24:39.011455 | orchestrator | TASK [Create custom facts directory] *******************************************
2026-01-06 00:24:39.011464 | orchestrator | Tuesday 06 January 2026 00:23:59 +0000 (0:00:00.085) 0:00:00.085 *******
2026-01-06 00:24:39.011471 | orchestrator | ok: [testbed-manager]
2026-01-06 00:24:39.011481 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:24:39.011489 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:24:39.011496 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:24:39.011504 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:24:39.011523 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:24:39.011626 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:24:39.011652 | orchestrator |
2026-01-06 00:24:39.011660 | orchestrator | TASK [Copy fact file] **********************************************************
2026-01-06 00:24:39.011676 | orchestrator | Tuesday 06 January 2026 00:24:00 +0000 (0:00:01.436) 0:00:01.522 *******
2026-01-06 00:24:39.011685 | orchestrator | ok: [testbed-manager]
2026-01-06 00:24:39.011692 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:24:39.011700 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:24:39.011708 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:24:39.011715 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:24:39.011723 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:24:39.011730 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:24:39.011737 | orchestrator |
2026-01-06 00:24:39.011745 | orchestrator | PLAY [Copy custom ceph devices facts] ******************************************
2026-01-06 00:24:39.011752 | orchestrator |
2026-01-06 00:24:39.011760 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] ***
2026-01-06 00:24:39.011791 | orchestrator | Tuesday 06 January 2026 00:24:01 +0000 (0:00:01.224) 0:00:02.746 *******
2026-01-06 00:24:39.011799 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:24:39.011806 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:24:39.011813 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:24:39.011821 | orchestrator |
2026-01-06 00:24:39.011828 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] ***
2026-01-06 00:24:39.011850 | orchestrator | Tuesday 06 January 2026 00:24:01 +0000 (0:00:00.092) 0:00:02.838 *******
2026-01-06 00:24:39.011857 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:24:39.011866 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:24:39.011875 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:24:39.011883 | orchestrator |
2026-01-06 00:24:39.011892 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ******************
2026-01-06 00:24:39.011901 | orchestrator | Tuesday 06 January 2026 00:24:02 +0000 (0:00:00.196) 0:00:03.035 *******
2026-01-06 00:24:39.011910 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:24:39.011919 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:24:39.011928 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:24:39.011936 | orchestrator |
2026-01-06 00:24:39.011945 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] ***
2026-01-06 00:24:39.011953 | orchestrator | Tuesday 06 January 2026 00:24:02 +0000 (0:00:00.219) 0:00:03.255 *******
2026-01-06 00:24:39.011964 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:24:39.011975 | orchestrator |
2026-01-06 00:24:39.011983 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] *****
2026-01-06 00:24:39.011993 | orchestrator | Tuesday 06 January 2026 00:24:02 +0000 (0:00:00.144) 0:00:03.399 *******
2026-01-06 00:24:39.012002 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:24:39.012010 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:24:39.012019 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:24:39.012028 | orchestrator |
2026-01-06 00:24:39.012037 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] *************
2026-01-06 00:24:39.012045 | orchestrator | Tuesday 06 January 2026 00:24:02 +0000 (0:00:00.461) 0:00:03.861 *******
2026-01-06 00:24:39.012072 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:24:39.012082 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:24:39.012090 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:24:39.012099 | orchestrator |
2026-01-06 00:24:39.012107 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] ***************
2026-01-06 00:24:39.012116 | orchestrator | Tuesday 06 January 2026 00:24:02 +0000 (0:00:00.132) 0:00:03.993 *******
2026-01-06 00:24:39.012124 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:24:39.012133 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:24:39.012141 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:24:39.012150 | orchestrator |
2026-01-06 00:24:39.012158 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] *********************
2026-01-06 00:24:39.012167 | orchestrator | Tuesday 06 January 2026 00:24:04 +0000 (0:00:01.104) 0:00:05.098 *******
2026-01-06 00:24:39.012175 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:24:39.012184 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:24:39.012192 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:24:39.012200 | orchestrator |
2026-01-06 00:24:39.012209 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] *********************
2026-01-06 00:24:39.012218 | orchestrator | Tuesday 06 January 2026 00:24:04 +0000 (0:00:00.518) 0:00:05.616 *******
2026-01-06 00:24:39.012226 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:24:39.012235 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:24:39.012243 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:24:39.012250 | orchestrator |
2026-01-06 00:24:39.012257 | orchestrator | TASK [osism.commons.repository : Update package cache] *************************
2026-01-06 00:24:39.012264 | orchestrator | Tuesday 06 January 2026 00:24:05 +0000 (0:00:01.056) 0:00:06.672 *******
2026-01-06 00:24:39.012271 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:24:39.012285 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:24:39.012292 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:24:39.012299 | orchestrator |
2026-01-06 00:24:39.012307 | orchestrator | TASK [Install required packages (RedHat)] **************************************
2026-01-06 00:24:39.012314 | orchestrator | Tuesday 06 January 2026 00:24:21 +0000 (0:00:15.875) 0:00:22.548 *******
2026-01-06 00:24:39.012321 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:24:39.012328 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:24:39.012335 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:24:39.012342 | orchestrator |
2026-01-06 00:24:39.012350 | orchestrator | TASK [Install required packages (Debian)] **************************************
2026-01-06 00:24:39.012371 | orchestrator | Tuesday 06 January 2026 00:24:21 +0000 (0:00:00.092) 0:00:22.640 *******
2026-01-06 00:24:39.012379 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:24:39.012386 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:24:39.012393 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:24:39.012400 | orchestrator |
2026-01-06 00:24:39.012408 | orchestrator | TASK [Create custom facts directory] *******************************************
2026-01-06 00:24:39.012415 | orchestrator | Tuesday 06 January 2026 00:24:29 +0000 (0:00:07.789) 0:00:30.430 *******
2026-01-06 00:24:39.012422 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:24:39.012429 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:24:39.012437 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:24:39.012444 | orchestrator |
2026-01-06 00:24:39.012451 | orchestrator | TASK [Copy fact files] *********************************************************
2026-01-06 00:24:39.012458 | orchestrator | Tuesday 06 January 2026 00:24:29 +0000 (0:00:00.518) 0:00:30.949 *******
2026-01-06 00:24:39.012466 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_devices)
2026-01-06 00:24:39.012473 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_devices)
2026-01-06 00:24:39.012480 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_devices)
2026-01-06 00:24:39.012488 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_devices_all)
2026-01-06 00:24:39.012495 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_devices_all)
2026-01-06 00:24:39.012502 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_devices_all)
2026-01-06 00:24:39.012509 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_osd_devices)
2026-01-06 00:24:39.012516 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_osd_devices)
2026-01-06 00:24:39.012523 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_osd_devices)
2026-01-06 00:24:39.012531 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_osd_devices_all)
2026-01-06 00:24:39.012538 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_osd_devices_all)
2026-01-06 00:24:39.012546 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_osd_devices_all)
2026-01-06 00:24:39.012553 | orchestrator |
2026-01-06 00:24:39.012560 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] *****
2026-01-06 00:24:39.012568 | orchestrator | Tuesday 06 January 2026 00:24:33 +0000 (0:00:03.803) 0:00:34.752 *******
2026-01-06 00:24:39.012575 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:24:39.012582 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:24:39.012590 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:24:39.012597 | orchestrator |
2026-01-06 00:24:39.012605 | orchestrator | PLAY [Gather facts for all hosts] **********************************************
2026-01-06 00:24:39.012612 | orchestrator |
2026-01-06 00:24:39.012620 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2026-01-06 00:24:39.012627 | orchestrator | Tuesday 06 January 2026 00:24:35 +0000 (0:00:01.447) 0:00:36.200 *******
2026-01-06 00:24:39.012634 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:24:39.012641 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:24:39.012648 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:24:39.012655 | orchestrator | ok: [testbed-manager]
2026-01-06 00:24:39.012662 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:24:39.012675 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:24:39.012683 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:24:39.012690 | orchestrator |
2026-01-06 00:24:39.012732 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:24:39.012741 | orchestrator | testbed-manager : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:24:39.012750 | orchestrator | testbed-node-0 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:24:39.012758 | orchestrator | testbed-node-1 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:24:39.012766 | orchestrator | testbed-node-2 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:24:39.012773 | orchestrator | testbed-node-3 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-01-06 00:24:39.012781 | orchestrator | testbed-node-4 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-01-06 00:24:39.012788 | orchestrator | testbed-node-5 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-01-06 00:24:39.012795 | orchestrator |
2026-01-06 00:24:39.012803 | orchestrator |
2026-01-06 00:24:39.012810 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:24:39.012817 | orchestrator | Tuesday 06 January 2026 00:24:38 +0000 (0:00:03.778) 0:00:39.979 *******
2026-01-06 00:24:39.012825 | orchestrator | ===============================================================================
2026-01-06 00:24:39.012832 | orchestrator | osism.commons.repository : Update package cache ------------------------ 15.88s
2026-01-06 00:24:39.012840 | orchestrator | Install required packages (Debian) -------------------------------------- 7.79s
2026-01-06 00:24:39.012847 | orchestrator | Copy fact files --------------------------------------------------------- 3.80s
2026-01-06 00:24:39.012854 | orchestrator | Gathers facts about hosts ----------------------------------------------- 3.78s
2026-01-06 00:24:39.012861 | orchestrator | osism.commons.repository : Force update of package cache ---------------- 1.45s
2026-01-06 00:24:39.012869 | orchestrator | Create custom facts directory ------------------------------------------- 1.44s
2026-01-06 00:24:39.012880 | orchestrator | Copy fact file ---------------------------------------------------------- 1.22s
2026-01-06 00:24:39.284365 | orchestrator | osism.commons.repository : Copy 99osism apt configuration --------------- 1.10s
2026-01-06 00:24:39.284480 | orchestrator | osism.commons.repository : Copy ubuntu.sources file --------------------- 1.06s
2026-01-06 00:24:39.284494 | orchestrator | Create custom facts directory ------------------------------------------- 0.52s
2026-01-06 00:24:39.284506 | orchestrator | osism.commons.repository : Remove sources.list file --------------------- 0.52s
2026-01-06 00:24:39.284517 | orchestrator | osism.commons.repository : Create /etc/apt/sources.list.d directory ----- 0.46s
2026-01-06 00:24:39.284528 | orchestrator | osism.commons.repository : Set repositories to default ------------------ 0.22s
2026-01-06 00:24:39.284538 | orchestrator | osism.commons.repository : Set repository_default fact to default value --- 0.20s
2026-01-06 00:24:39.284549 | orchestrator | osism.commons.repository : Include distribution specific repository tasks --- 0.14s
2026-01-06 00:24:39.284561 | orchestrator | osism.commons.repository : Include tasks for Ubuntu < 24.04 ------------- 0.13s
2026-01-06 00:24:39.284572 | orchestrator | Install required packages (RedHat) -------------------------------------- 0.09s
2026-01-06 00:24:39.284582 | orchestrator | osism.commons.repository : Gather variables for each operating system --- 0.09s
2026-01-06 00:24:39.581302 | orchestrator | + osism apply bootstrap
2026-01-06 00:24:51.770327 | orchestrator | 2026-01-06 00:24:51 | INFO  | Task 719bfb66-6a82-42bf-af06-32ffb1e809a9 (bootstrap) was prepared for execution.
2026-01-06 00:24:51.770505 | orchestrator | 2026-01-06 00:24:51 | INFO  | It takes a moment until task 719bfb66-6a82-42bf-af06-32ffb1e809a9 (bootstrap) has been started and output is visible here.
2026-01-06 00:25:07.323538 | orchestrator |
2026-01-06 00:25:07.323646 | orchestrator | PLAY [Group hosts based on state bootstrap] ************************************
2026-01-06 00:25:07.323663 | orchestrator |
2026-01-06 00:25:07.323690 | orchestrator | TASK [Group hosts based on state bootstrap] ************************************
2026-01-06 00:25:07.323703 | orchestrator | Tuesday 06 January 2026 00:24:55 +0000 (0:00:00.112) 0:00:00.112 *******
2026-01-06 00:25:07.323715 | orchestrator | ok: [testbed-manager]
2026-01-06 00:25:07.323727 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:25:07.323738 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:25:07.323749 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:25:07.323761 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:25:07.323772 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:25:07.323783 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:25:07.323794 | orchestrator |
2026-01-06 00:25:07.323806 | orchestrator | PLAY [Gather facts for all hosts] **********************************************
2026-01-06 00:25:07.323817 | orchestrator |
2026-01-06 00:25:07.323828 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2026-01-06 00:25:07.323840 | orchestrator | Tuesday 06 January 2026 00:24:56 +0000 (0:00:00.184) 0:00:00.296 *******
2026-01-06 00:25:07.323851 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:25:07.323862 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:25:07.323874 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:25:07.323886 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:25:07.323897 | orchestrator | ok: [testbed-manager]
2026-01-06 00:25:07.323908 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:25:07.323919 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:25:07.323930 | orchestrator |
2026-01-06 00:25:07.323942 | orchestrator | PLAY [Gather facts for all hosts (if using --limit)] ***************************
2026-01-06 00:25:07.323953 | orchestrator |
2026-01-06 00:25:07.323964 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2026-01-06 00:25:07.323976 | orchestrator | Tuesday 06 January 2026 00:24:59 +0000 (0:00:03.716) 0:00:04.013 *******
2026-01-06 00:25:07.323987 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)
2026-01-06 00:25:07.323999 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)
2026-01-06 00:25:07.324010 | orchestrator | skipping: [testbed-node-3] => (item=testbed-manager)
2026-01-06 00:25:07.324021 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-01-06 00:25:07.324033 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)
2026-01-06 00:25:07.324044 | orchestrator | skipping: [testbed-node-4] => (item=testbed-manager)
2026-01-06 00:25:07.324055 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-01-06 00:25:07.324093 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)
2026-01-06 00:25:07.324108 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-01-06 00:25:07.324122 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)
2026-01-06 00:25:07.324135 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-3)
2026-01-06 00:25:07.324148 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2026-01-06 00:25:07.324161 | orchestrator | skipping: [testbed-node-5] => (item=testbed-manager)
2026-01-06 00:25:07.324174 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)
2026-01-06 00:25:07.324187 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2026-01-06 00:25:07.324200 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)
2026-01-06 00:25:07.324213 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2026-01-06 00:25:07.324227 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-4)
2026-01-06 00:25:07.324240 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:25:07.324254 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:25:07.324288 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-3)
2026-01-06 00:25:07.324302 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-5)
2026-01-06 00:25:07.324316 | orchestrator | skipping: [testbed-node-0] => (item=testbed-manager)
2026-01-06 00:25:07.324329 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)
2026-01-06 00:25:07.324342 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-3)
2026-01-06 00:25:07.324354 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-4)
2026-01-06 00:25:07.324367 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)
2026-01-06 00:25:07.324379 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-4)
2026-01-06 00:25:07.324392 | orchestrator | skipping: [testbed-node-1] => (item=testbed-manager)
2026-01-06 00:25:07.324404 | orchestrator | skipping: [testbed-node-2] => (item=testbed-manager)
2026-01-06 00:25:07.324417 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)
2026-01-06 00:25:07.324430 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:25:07.324443 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-5)
2026-01-06 00:25:07.324455 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-5)
2026-01-06 00:25:07.324466 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-3)
2026-01-06 00:25:07.324476 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-3)
2026-01-06 00:25:07.324487 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)
2026-01-06 00:25:07.324498 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)
2026-01-06 00:25:07.324509 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-4)
2026-01-06 00:25:07.324520 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-4)
2026-01-06 00:25:07.324531 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)
2026-01-06 00:25:07.324541 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)
2026-01-06 00:25:07.324552 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-5)
2026-01-06 00:25:07.324563 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-5)
2026-01-06 00:25:07.324575 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)
2026-01-06 00:25:07.324586 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:25:07.324612 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)
2026-01-06 00:25:07.324624 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)
2026-01-06 00:25:07.324635 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)
2026-01-06 00:25:07.324647 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:25:07.324658 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)
2026-01-06 00:25:07.324668 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)
2026-01-06 00:25:07.324679 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)
2026-01-06 00:25:07.324690 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)
2026-01-06 00:25:07.324701 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:25:07.324712 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:25:07.324723 | orchestrator |
2026-01-06 00:25:07.324734 | orchestrator | PLAY [Apply bootstrap roles part 1] ********************************************
2026-01-06 00:25:07.324745 | orchestrator |
2026-01-06 00:25:07.324757 | orchestrator | TASK [osism.commons.hostname : Set hostname] ***********************************
2026-01-06 00:25:07.324768 | orchestrator | Tuesday 06 January 2026 00:25:00 +0000
(0:00:00.408) 0:00:04.422 ******* 2026-01-06 00:25:07.324779 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:07.324790 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:07.324800 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:07.324811 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:07.324822 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:07.324833 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:07.324844 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:07.324855 | orchestrator | 2026-01-06 00:25:07.324866 | orchestrator | TASK [osism.commons.hostname : Copy /etc/hostname] ***************************** 2026-01-06 00:25:07.324884 | orchestrator | Tuesday 06 January 2026 00:25:01 +0000 (0:00:01.237) 0:00:05.660 ******* 2026-01-06 00:25:07.324895 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:07.324906 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:07.324917 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:07.324928 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:07.324939 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:07.324950 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:07.324961 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:07.324972 | orchestrator | 2026-01-06 00:25:07.324983 | orchestrator | TASK [osism.commons.hosts : Include type specific tasks] *********************** 2026-01-06 00:25:07.324994 | orchestrator | Tuesday 06 January 2026 00:25:02 +0000 (0:00:01.186) 0:00:06.846 ******* 2026-01-06 00:25:07.325006 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/hosts/tasks/type-template.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:25:07.325019 | orchestrator | 2026-01-06 00:25:07.325030 | orchestrator | TASK [osism.commons.hosts : Copy /etc/hosts file] ****************************** 2026-01-06 00:25:07.325041 | orchestrator 
| Tuesday 06 January 2026 00:25:02 +0000 (0:00:00.261) 0:00:07.108 ******* 2026-01-06 00:25:07.325052 | orchestrator | changed: [testbed-manager] 2026-01-06 00:25:07.325063 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:25:07.325124 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:25:07.325136 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:25:07.325147 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:25:07.325158 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:25:07.325168 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:25:07.325179 | orchestrator | 2026-01-06 00:25:07.325191 | orchestrator | TASK [osism.commons.proxy : Include distribution specific tasks] *************** 2026-01-06 00:25:07.325202 | orchestrator | Tuesday 06 January 2026 00:25:04 +0000 (0:00:02.028) 0:00:09.136 ******* 2026-01-06 00:25:07.325213 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:25:07.325225 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/proxy/tasks/Debian-family.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:25:07.325238 | orchestrator | 2026-01-06 00:25:07.325249 | orchestrator | TASK [osism.commons.proxy : Configure proxy parameters for apt] **************** 2026-01-06 00:25:07.325261 | orchestrator | Tuesday 06 January 2026 00:25:05 +0000 (0:00:00.246) 0:00:09.382 ******* 2026-01-06 00:25:07.325272 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:25:07.325283 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:25:07.325294 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:25:07.325305 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:25:07.325315 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:25:07.325326 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:25:07.325337 | orchestrator | 2026-01-06 00:25:07.325348 | orchestrator | TASK [osism.commons.proxy : 
Set system wide settings in environment file] ****** 2026-01-06 00:25:07.325359 | orchestrator | Tuesday 06 January 2026 00:25:06 +0000 (0:00:00.950) 0:00:10.333 ******* 2026-01-06 00:25:07.325378 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:25:07.325390 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:25:07.325401 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:25:07.325412 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:25:07.325423 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:25:07.325433 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:25:07.325444 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:25:07.325455 | orchestrator | 2026-01-06 00:25:07.325466 | orchestrator | TASK [osism.commons.proxy : Remove system wide settings in environment file] *** 2026-01-06 00:25:07.325477 | orchestrator | Tuesday 06 January 2026 00:25:06 +0000 (0:00:00.596) 0:00:10.929 ******* 2026-01-06 00:25:07.325488 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:25:07.325506 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:25:07.325517 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:25:07.325528 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:25:07.325539 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:25:07.325550 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:25:07.325561 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:07.325572 | orchestrator | 2026-01-06 00:25:07.325583 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] *** 2026-01-06 00:25:07.325595 | orchestrator | Tuesday 06 January 2026 00:25:07 +0000 (0:00:00.418) 0:00:11.348 ******* 2026-01-06 00:25:07.325606 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:25:07.325616 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:25:07.325635 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:25:19.023881 | orchestrator | skipping: 
[testbed-node-5] 2026-01-06 00:25:19.024063 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:25:19.024159 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:25:19.024179 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:25:19.024198 | orchestrator | 2026-01-06 00:25:19.024221 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] ********************* 2026-01-06 00:25:19.024243 | orchestrator | Tuesday 06 January 2026 00:25:07 +0000 (0:00:00.263) 0:00:11.611 ******* 2026-01-06 00:25:19.024265 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:25:19.024301 | orchestrator | 2026-01-06 00:25:19.024322 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] *** 2026-01-06 00:25:19.024344 | orchestrator | Tuesday 06 January 2026 00:25:07 +0000 (0:00:00.301) 0:00:11.912 ******* 2026-01-06 00:25:19.024367 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:25:19.024389 | orchestrator | 2026-01-06 00:25:19.024412 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf] *** 2026-01-06 00:25:19.024434 | orchestrator | Tuesday 06 January 2026 00:25:08 +0000 (0:00:00.316) 0:00:12.229 ******* 2026-01-06 00:25:19.024456 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:19.024479 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:19.024501 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:19.024523 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:19.024545 | orchestrator | ok: [testbed-manager] 2026-01-06 
00:25:19.024564 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:19.024586 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:19.024610 | orchestrator | 2026-01-06 00:25:19.024631 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] ************* 2026-01-06 00:25:19.024652 | orchestrator | Tuesday 06 January 2026 00:25:09 +0000 (0:00:01.283) 0:00:13.512 ******* 2026-01-06 00:25:19.024671 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:25:19.024693 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:25:19.024714 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:25:19.024732 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:25:19.024750 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:25:19.024768 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:25:19.024786 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:25:19.024803 | orchestrator | 2026-01-06 00:25:19.024821 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] ***** 2026-01-06 00:25:19.024838 | orchestrator | Tuesday 06 January 2026 00:25:09 +0000 (0:00:00.208) 0:00:13.721 ******* 2026-01-06 00:25:19.024857 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:19.024874 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:19.024892 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:19.024911 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:19.024930 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:19.024983 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:19.025004 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:19.025022 | orchestrator | 2026-01-06 00:25:19.025042 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] ******* 2026-01-06 00:25:19.025061 | orchestrator | Tuesday 06 January 2026 00:25:10 +0000 (0:00:00.506) 0:00:14.227 ******* 2026-01-06 00:25:19.025110 | orchestrator | skipping: 
[testbed-manager] 2026-01-06 00:25:19.025129 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:25:19.025149 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:25:19.025167 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:25:19.025183 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:25:19.025199 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:25:19.025214 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:25:19.025230 | orchestrator | 2026-01-06 00:25:19.025248 | orchestrator | TASK [osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] *** 2026-01-06 00:25:19.025269 | orchestrator | Tuesday 06 January 2026 00:25:10 +0000 (0:00:00.233) 0:00:14.461 ******* 2026-01-06 00:25:19.025288 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:19.025306 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:25:19.025325 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:25:19.025336 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:25:19.025347 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:25:19.025357 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:25:19.025368 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:25:19.025379 | orchestrator | 2026-01-06 00:25:19.025390 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] ********************* 2026-01-06 00:25:19.025400 | orchestrator | Tuesday 06 January 2026 00:25:10 +0000 (0:00:00.630) 0:00:15.091 ******* 2026-01-06 00:25:19.025411 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:19.025422 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:25:19.025432 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:25:19.025443 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:25:19.025454 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:25:19.025464 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:25:19.025475 | orchestrator | changed: 
[testbed-node-5] 2026-01-06 00:25:19.025485 | orchestrator | 2026-01-06 00:25:19.025496 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ******** 2026-01-06 00:25:19.025507 | orchestrator | Tuesday 06 January 2026 00:25:11 +0000 (0:00:01.030) 0:00:16.121 ******* 2026-01-06 00:25:19.025518 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:19.025528 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:19.025539 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:19.025550 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:19.025561 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:19.025571 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:19.025581 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:19.025592 | orchestrator | 2026-01-06 00:25:19.025603 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] *** 2026-01-06 00:25:19.025614 | orchestrator | Tuesday 06 January 2026 00:25:12 +0000 (0:00:01.050) 0:00:17.172 ******* 2026-01-06 00:25:19.025662 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:25:19.025675 | orchestrator | 2026-01-06 00:25:19.025687 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] ************* 2026-01-06 00:25:19.025697 | orchestrator | Tuesday 06 January 2026 00:25:13 +0000 (0:00:00.259) 0:00:17.432 ******* 2026-01-06 00:25:19.025708 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:25:19.025719 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:25:19.025729 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:25:19.025740 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:25:19.025751 | orchestrator | changed: [testbed-node-4] 2026-01-06 
00:25:19.025774 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:25:19.025785 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:25:19.025795 | orchestrator | 2026-01-06 00:25:19.025806 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] *** 2026-01-06 00:25:19.025817 | orchestrator | Tuesday 06 January 2026 00:25:14 +0000 (0:00:01.157) 0:00:18.589 ******* 2026-01-06 00:25:19.025828 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:19.025839 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:19.025849 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:19.025860 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:19.025871 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:19.025882 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:19.025892 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:19.025903 | orchestrator | 2026-01-06 00:25:19.025914 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] *** 2026-01-06 00:25:19.025925 | orchestrator | Tuesday 06 January 2026 00:25:14 +0000 (0:00:00.238) 0:00:18.827 ******* 2026-01-06 00:25:19.025936 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:19.025947 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:19.025958 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:19.025968 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:19.025979 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:19.025990 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:19.026001 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:19.026011 | orchestrator | 2026-01-06 00:25:19.026139 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ****************** 2026-01-06 00:25:19.026160 | orchestrator | Tuesday 06 January 2026 00:25:14 +0000 (0:00:00.242) 0:00:19.070 ******* 2026-01-06 00:25:19.026176 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:19.026187 | 
orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:19.026197 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:19.026248 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:19.026259 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:19.026270 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:19.026280 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:19.026291 | orchestrator | 2026-01-06 00:25:19.026302 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2026-01-06 00:25:19.026313 | orchestrator | Tuesday 06 January 2026 00:25:15 +0000 (0:00:00.215) 0:00:19.285 ******* 2026-01-06 00:25:19.026325 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:25:19.026339 | orchestrator | 2026-01-06 00:25:19.026350 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2026-01-06 00:25:19.026360 | orchestrator | Tuesday 06 January 2026 00:25:15 +0000 (0:00:00.335) 0:00:19.621 ******* 2026-01-06 00:25:19.026387 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:19.026406 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:19.026417 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:19.026428 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:19.026439 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:19.026450 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:19.026461 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:19.026471 | orchestrator | 2026-01-06 00:25:19.026482 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2026-01-06 00:25:19.026493 | orchestrator | Tuesday 06 January 2026 00:25:15 +0000 (0:00:00.543) 0:00:20.164 ******* 2026-01-06 00:25:19.026504 | orchestrator | 
skipping: [testbed-manager] 2026-01-06 00:25:19.026515 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:25:19.026525 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:25:19.026536 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:25:19.026547 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:25:19.026557 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:25:19.026568 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:25:19.026589 | orchestrator | 2026-01-06 00:25:19.026600 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2026-01-06 00:25:19.026614 | orchestrator | Tuesday 06 January 2026 00:25:16 +0000 (0:00:00.213) 0:00:20.378 ******* 2026-01-06 00:25:19.026630 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:19.026641 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:19.026652 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:19.026663 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:19.026673 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:25:19.026684 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:25:19.026695 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:25:19.026705 | orchestrator | 2026-01-06 00:25:19.026716 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] ********************* 2026-01-06 00:25:19.026727 | orchestrator | Tuesday 06 January 2026 00:25:17 +0000 (0:00:01.119) 0:00:21.498 ******* 2026-01-06 00:25:19.026738 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:19.026748 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:19.026759 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:19.026770 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:19.026780 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:19.026791 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:19.026802 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:19.026813 | orchestrator | 
2026-01-06 00:25:19.026823 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] ********************* 2026-01-06 00:25:19.026834 | orchestrator | Tuesday 06 January 2026 00:25:17 +0000 (0:00:00.564) 0:00:22.062 ******* 2026-01-06 00:25:19.026845 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:19.026855 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:19.026866 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:19.026877 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:19.026899 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:25:59.376650 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:25:59.379294 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:25:59.379326 | orchestrator | 2026-01-06 00:25:59.379338 | orchestrator | TASK [osism.commons.repository : Update package cache] ************************* 2026-01-06 00:25:59.379364 | orchestrator | Tuesday 06 January 2026 00:25:19 +0000 (0:00:01.135) 0:00:23.197 ******* 2026-01-06 00:25:59.379372 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:59.379390 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:59.379398 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:59.379406 | orchestrator | changed: [testbed-manager] 2026-01-06 00:25:59.379414 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:25:59.379421 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:25:59.379429 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:25:59.379437 | orchestrator | 2026-01-06 00:25:59.379444 | orchestrator | TASK [osism.services.rsyslog : Gather variables for each operating system] ***** 2026-01-06 00:25:59.379452 | orchestrator | Tuesday 06 January 2026 00:25:34 +0000 (0:00:15.860) 0:00:39.058 ******* 2026-01-06 00:25:59.379460 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:59.379468 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:59.379476 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:59.379483 | orchestrator 
| ok: [testbed-node-5] 2026-01-06 00:25:59.379490 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:59.379497 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:59.379505 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:59.379512 | orchestrator | 2026-01-06 00:25:59.379519 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_user variable to default value] ***** 2026-01-06 00:25:59.379527 | orchestrator | Tuesday 06 January 2026 00:25:35 +0000 (0:00:00.226) 0:00:39.285 ******* 2026-01-06 00:25:59.379534 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:59.379540 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:59.379548 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:59.379555 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:59.379563 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:59.379571 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:59.379579 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:59.379612 | orchestrator | 2026-01-06 00:25:59.379620 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_workdir variable to default value] *** 2026-01-06 00:25:59.379628 | orchestrator | Tuesday 06 January 2026 00:25:35 +0000 (0:00:00.221) 0:00:39.506 ******* 2026-01-06 00:25:59.379636 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:59.379643 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:59.379650 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:59.379657 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:59.379665 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:59.379672 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:59.379679 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:59.379686 | orchestrator | 2026-01-06 00:25:59.379694 | orchestrator | TASK [osism.services.rsyslog : Include distribution specific install tasks] **** 2026-01-06 00:25:59.379702 | orchestrator | Tuesday 06 January 2026 00:25:35 +0000 (0:00:00.232) 0:00:39.738 ******* 2026-01-06 
00:25:59.379730 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:25:59.379741 | orchestrator | 2026-01-06 00:25:59.379749 | orchestrator | TASK [osism.services.rsyslog : Install rsyslog package] ************************ 2026-01-06 00:25:59.379756 | orchestrator | Tuesday 06 January 2026 00:25:35 +0000 (0:00:00.299) 0:00:40.037 ******* 2026-01-06 00:25:59.379763 | orchestrator | ok: [testbed-manager] 2026-01-06 00:25:59.379771 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:25:59.379778 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:25:59.379785 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:25:59.379793 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:25:59.379800 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:25:59.379808 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:25:59.379815 | orchestrator | 2026-01-06 00:25:59.379823 | orchestrator | TASK [osism.services.rsyslog : Copy rsyslog.conf configuration file] *********** 2026-01-06 00:25:59.379849 | orchestrator | Tuesday 06 January 2026 00:25:37 +0000 (0:00:01.701) 0:00:41.739 ******* 2026-01-06 00:25:59.379856 | orchestrator | changed: [testbed-manager] 2026-01-06 00:25:59.379863 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:25:59.379871 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:25:59.379879 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:25:59.379886 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:25:59.379893 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:25:59.379901 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:25:59.379908 | orchestrator | 2026-01-06 00:25:59.379916 | orchestrator | TASK [osism.services.rsyslog : Manage rsyslog service] ************************* 2026-01-06 00:25:59.379924 | 
orchestrator | Tuesday 06 January 2026 00:25:38 +0000 (0:00:01.084) 0:00:42.824 *******
2026-01-06 00:25:59.379931 | orchestrator | ok: [testbed-manager]
2026-01-06 00:25:59.379939 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:25:59.379946 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:25:59.379954 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:25:59.379962 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:25:59.379969 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:25:59.379976 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:25:59.379984 | orchestrator |
2026-01-06 00:25:59.379991 | orchestrator | TASK [osism.services.rsyslog : Include fluentd tasks] **************************
2026-01-06 00:25:59.379999 | orchestrator | Tuesday 06 January 2026 00:25:39 +0000 (0:00:00.797) 0:00:43.622 *******
2026-01-06 00:25:59.380008 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/fluentd.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:25:59.380027 | orchestrator |
2026-01-06 00:25:59.380035 | orchestrator | TASK [osism.services.rsyslog : Forward syslog message to local fluentd daemon] ***
2026-01-06 00:25:59.380054 | orchestrator | Tuesday 06 January 2026 00:25:39 +0000 (0:00:00.289) 0:00:43.912 *******
2026-01-06 00:25:59.380061 | orchestrator | changed: [testbed-manager]
2026-01-06 00:25:59.380076 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:25:59.380135 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:25:59.380145 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:25:59.380153 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:25:59.380160 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:25:59.380168 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:25:59.380176 | orchestrator |
2026-01-06 00:25:59.380237 | orchestrator | TASK [osism.services.rsyslog : Include additional log server tasks] ************
2026-01-06 00:25:59.380246 | orchestrator | Tuesday 06 January 2026 00:25:40 +0000 (0:00:01.075) 0:00:44.987 *******
2026-01-06 00:25:59.380254 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:25:59.380261 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:25:59.380269 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:25:59.380276 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:25:59.380284 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:25:59.380291 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:25:59.380299 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:25:59.380306 | orchestrator |
2026-01-06 00:25:59.380313 | orchestrator | TASK [osism.services.rsyslog : Include logrotate tasks] ************************
2026-01-06 00:25:59.380321 | orchestrator | Tuesday 06 January 2026 00:25:41 +0000 (0:00:00.272) 0:00:45.260 *******
2026-01-06 00:25:59.380328 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/logrotate.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:25:59.380335 | orchestrator |
2026-01-06 00:25:59.380343 | orchestrator | TASK [osism.services.rsyslog : Ensure logrotate package is installed] **********
2026-01-06 00:25:59.380351 | orchestrator | Tuesday 06 January 2026 00:25:41 +0000 (0:00:00.274) 0:00:45.535 *******
2026-01-06 00:25:59.380358 | orchestrator | ok: [testbed-manager]
2026-01-06 00:25:59.380366 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:25:59.380373 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:25:59.380381 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:25:59.380388 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:25:59.380396 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:25:59.380403 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:25:59.380411 | orchestrator |
2026-01-06 00:25:59.380418 | orchestrator | TASK [osism.services.rsyslog : Configure logrotate for rsyslog] ****************
2026-01-06 00:25:59.380425 | orchestrator | Tuesday 06 January 2026 00:25:43 +0000 (0:00:01.665) 0:00:47.201 *******
2026-01-06 00:25:59.380432 | orchestrator | changed: [testbed-manager]
2026-01-06 00:25:59.380440 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:25:59.380447 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:25:59.380454 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:25:59.380462 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:25:59.380469 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:25:59.380476 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:25:59.380484 | orchestrator |
2026-01-06 00:25:59.380491 | orchestrator | TASK [osism.commons.systohc : Install util-linux-extra package] ****************
2026-01-06 00:25:59.380498 | orchestrator | Tuesday 06 January 2026 00:25:44 +0000 (0:00:01.097) 0:00:48.298 *******
2026-01-06 00:25:59.380505 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:25:59.380513 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:25:59.380520 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:25:59.380527 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:25:59.380535 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:25:59.380542 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:25:59.380550 | orchestrator | changed: [testbed-manager]
2026-01-06 00:25:59.380557 | orchestrator |
2026-01-06 00:25:59.380564 | orchestrator | TASK [osism.commons.systohc : Sync hardware clock] *****************************
2026-01-06 00:25:59.380572 | orchestrator | Tuesday 06 January 2026 00:25:56 +0000 (0:00:12.640) 0:01:00.939 *******
2026-01-06 00:25:59.380579 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:25:59.380594 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:25:59.380602 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:25:59.380609 | orchestrator | ok: [testbed-manager]
2026-01-06 00:25:59.380616 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:25:59.380623 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:25:59.380631 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:25:59.380638 | orchestrator |
2026-01-06 00:25:59.380645 | orchestrator | TASK [osism.commons.configfs : Start sys-kernel-config mount] ******************
2026-01-06 00:25:59.380653 | orchestrator | Tuesday 06 January 2026 00:25:57 +0000 (0:00:00.900) 0:01:01.839 *******
2026-01-06 00:25:59.380660 | orchestrator | ok: [testbed-manager]
2026-01-06 00:25:59.380668 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:25:59.380676 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:25:59.380683 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:25:59.380690 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:25:59.380697 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:25:59.380705 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:25:59.380713 | orchestrator |
2026-01-06 00:25:59.380720 | orchestrator | TASK [osism.commons.packages : Gather variables for each operating system] *****
2026-01-06 00:25:59.380728 | orchestrator | Tuesday 06 January 2026 00:25:58 +0000 (0:00:00.902) 0:01:02.742 *******
2026-01-06 00:25:59.380735 | orchestrator | ok: [testbed-manager]
2026-01-06 00:25:59.380743 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:25:59.380750 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:25:59.380757 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:25:59.380765 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:25:59.380772 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:25:59.380780 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:25:59.380787 | orchestrator |
2026-01-06 00:25:59.380795 | orchestrator | TASK [osism.commons.packages : Set required_packages_distribution variable to default value] ***
2026-01-06 00:25:59.380803 | orchestrator | Tuesday 06 January 2026 00:25:58 +0000 (0:00:00.251) 0:01:02.993 *******
2026-01-06 00:25:59.380811 | orchestrator | ok: [testbed-manager]
2026-01-06 00:25:59.380818 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:25:59.380825 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:25:59.380833 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:25:59.380840 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:25:59.380847 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:25:59.380854 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:25:59.380862 | orchestrator |
2026-01-06 00:25:59.380869 | orchestrator | TASK [osism.commons.packages : Include distribution specific package tasks] ****
2026-01-06 00:25:59.380876 | orchestrator | Tuesday 06 January 2026 00:25:59 +0000 (0:00:00.313) 0:01:03.236 *******
2026-01-06 00:25:59.380884 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/packages/tasks/package-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:25:59.380893 | orchestrator |
2026-01-06 00:25:59.380929 | orchestrator | TASK [osism.commons.packages : Install needrestart package] ********************
2026-01-06 00:28:26.007708 | orchestrator | Tuesday 06 January 2026 00:25:59 +0000 (0:00:00.313) 0:01:03.550 *******
2026-01-06 00:28:26.009334 | orchestrator | ok: [testbed-manager]
2026-01-06 00:28:26.009358 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:28:26.009368 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:28:26.009379 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:28:26.009389 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:28:26.009399 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:28:26.009409 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:28:26.009419 | orchestrator |
2026-01-06 00:28:26.009430 | orchestrator | TASK [osism.commons.packages : Set needrestart mode] ***************************
2026-01-06 00:28:26.009441 | orchestrator | Tuesday 06 January 2026 00:26:01 +0000 (0:00:01.831) 0:01:05.382 *******
2026-01-06 00:28:26.009451 | orchestrator | changed: [testbed-manager]
2026-01-06 00:28:26.009475 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:28:26.009485 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:28:26.009495 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:28:26.009548 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:28:26.009559 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:28:26.009569 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:28:26.009578 | orchestrator |
2026-01-06 00:28:26.009588 | orchestrator | TASK [osism.commons.packages : Set apt_cache_valid_time variable to default value] ***
2026-01-06 00:28:26.009600 | orchestrator | Tuesday 06 January 2026 00:26:01 +0000 (0:00:00.572) 0:01:05.955 *******
2026-01-06 00:28:26.009610 | orchestrator | ok: [testbed-manager]
2026-01-06 00:28:26.009619 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:28:26.009629 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:28:26.009639 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:28:26.009648 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:28:26.009658 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:28:26.009668 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:28:26.009677 | orchestrator |
2026-01-06 00:28:26.009687 | orchestrator | TASK [osism.commons.packages : Update package cache] ***************************
2026-01-06 00:28:26.009697 | orchestrator | Tuesday 06 January 2026 00:26:01 +0000 (0:00:00.227) 0:01:06.182 *******
2026-01-06 00:28:26.009707 | orchestrator | ok: [testbed-manager]
2026-01-06 00:28:26.009717 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:28:26.009726 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:28:26.009736 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:28:26.009746 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:28:26.009755 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:28:26.009765 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:28:26.009775 | orchestrator |
2026-01-06 00:28:26.009785 | orchestrator | TASK [osism.commons.packages : Download upgrade packages] **********************
2026-01-06 00:28:26.009794 | orchestrator | Tuesday 06 January 2026 00:26:03 +0000 (0:00:01.377) 0:01:07.560 *******
2026-01-06 00:28:26.009804 | orchestrator | changed: [testbed-manager]
2026-01-06 00:28:26.009814 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:28:26.009823 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:28:26.009833 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:28:26.009843 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:28:26.009852 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:28:26.009862 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:28:26.009872 | orchestrator |
2026-01-06 00:28:26.009882 | orchestrator | TASK [osism.commons.packages : Upgrade packages] *******************************
2026-01-06 00:28:26.009891 | orchestrator | Tuesday 06 January 2026 00:26:05 +0000 (0:00:01.867) 0:01:09.427 *******
2026-01-06 00:28:26.009901 | orchestrator | ok: [testbed-manager]
2026-01-06 00:28:26.009911 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:28:26.009920 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:28:26.009930 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:28:26.009940 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:28:26.009950 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:28:26.009959 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:28:26.009969 | orchestrator |
2026-01-06 00:28:26.009979 | orchestrator | TASK [osism.commons.packages : Download required packages] *********************
2026-01-06 00:28:26.009989 | orchestrator | Tuesday 06 January 2026 00:26:07 +0000 (0:00:02.383) 0:01:11.810 *******
2026-01-06 00:28:26.009998 | orchestrator | ok: [testbed-manager]
2026-01-06 00:28:26.010008 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:28:26.010052 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:28:26.010064 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:28:26.010073 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:28:26.010083 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:28:26.010093 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:28:26.010102 | orchestrator |
2026-01-06 00:28:26.010113 | orchestrator | TASK [osism.commons.packages : Install required packages] **********************
2026-01-06 00:28:26.010123 | orchestrator | Tuesday 06 January 2026 00:26:48 +0000 (0:00:41.027) 0:01:52.837 *******
2026-01-06 00:28:26.010165 | orchestrator | changed: [testbed-manager]
2026-01-06 00:28:26.010182 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:28:26.010200 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:28:26.010228 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:28:26.010238 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:28:26.010248 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:28:26.010258 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:28:26.010267 | orchestrator |
2026-01-06 00:28:26.010277 | orchestrator | TASK [osism.commons.packages : Remove useless packages from the cache] *********
2026-01-06 00:28:26.010287 | orchestrator | Tuesday 06 January 2026 00:28:10 +0000 (0:01:21.466) 0:03:14.304 *******
2026-01-06 00:28:26.010297 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:28:26.010306 | orchestrator | ok: [testbed-manager]
2026-01-06 00:28:26.010316 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:28:26.010325 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:28:26.010335 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:28:26.010345 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:28:26.010354 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:28:26.010364 | orchestrator |
2026-01-06 00:28:26.010374 | orchestrator | TASK [osism.commons.packages : Remove dependencies that are no longer required] ***
2026-01-06 00:28:26.010384 | orchestrator | Tuesday 06 January 2026 00:28:12 +0000 (0:00:01.999) 0:03:16.303 *******
2026-01-06 00:28:26.010394 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:28:26.010403 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:28:26.010413 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:28:26.010423 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:28:26.010432 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:28:26.010442 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:28:26.010452 | orchestrator | changed: [testbed-manager]
2026-01-06 00:28:26.010461 | orchestrator |
2026-01-06 00:28:26.010471 | orchestrator | TASK [osism.commons.sysctl : Include sysctl tasks] *****************************
2026-01-06 00:28:26.010481 | orchestrator | Tuesday 06 January 2026 00:28:24 +0000 (0:00:12.557) 0:03:28.861 *******
2026-01-06 00:28:26.010557 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'elasticsearch', 'value': [{'name': 'vm.max_map_count', 'value': 262144}]})
2026-01-06 00:28:26.010632 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'rabbitmq', 'value': [{'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}, {'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}, {'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}, {'name': 'net.core.wmem_max', 'value': 16777216}, {'name': 'net.core.rmem_max', 'value': 16777216}, {'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}, {'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}, {'name': 'net.core.somaxconn', 'value': 4096}, {'name': 'net.ipv4.tcp_syncookies', 'value': 0}, {'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}]})
2026-01-06 00:28:26.010653 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'generic', 'value': [{'name': 'vm.swappiness', 'value': 1}]})
2026-01-06 00:28:26.010698 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'compute', 'value': [{'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}]})
2026-01-06 00:28:26.010715 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'network', 'value': [{'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}]})
2026-01-06 00:28:26.010744 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'k3s_node', 'value': [{'name': 'fs.inotify.max_user_instances', 'value': 1024}]})
2026-01-06 00:28:26.010761 | orchestrator |
2026-01-06 00:28:26.010796 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on elasticsearch] ***********
2026-01-06 00:28:26.010808 | orchestrator | Tuesday 06 January 2026 00:28:25 +0000 (0:00:00.472) 0:03:29.334 *******
2026-01-06 00:28:26.010818 | orchestrator | skipping: [testbed-manager] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-01-06 00:28:26.010828 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:28:26.010838 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-01-06 00:28:26.010847 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:28:26.010857 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-01-06 00:28:26.010866 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-01-06 00:28:26.010876 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:28:26.010886 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:28:26.010896 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-01-06 00:28:26.010905 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-01-06 00:28:26.010915 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-01-06 00:28:26.010925 | orchestrator |
2026-01-06 00:28:26.010934 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on rabbitmq] ****************
2026-01-06 00:28:26.010954 | orchestrator | Tuesday 06 January 2026 00:28:25 +0000 (0:00:00.733) 0:03:30.068 *******
2026-01-06 00:28:26.010964 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})
2026-01-06 00:28:26.010987 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})
2026-01-06 00:28:26.010997 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})
2026-01-06 00:28:26.011007 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.wmem_max', 'value': 16777216})
2026-01-06 00:28:26.011017 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.rmem_max', 'value': 16777216})
2026-01-06 00:28:26.011040 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})
2026-01-06 00:28:31.788881 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})
2026-01-06 00:28:31.789040 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-01-06 00:28:31.789057 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-01-06 00:28:31.789069 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})
2026-01-06 00:28:31.789082 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-01-06 00:28:31.789093 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})
2026-01-06 00:28:31.789104 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})
2026-01-06 00:28:31.789115 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.wmem_max', 'value': 16777216})
2026-01-06 00:28:31.789126 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.rmem_max', 'value': 16777216})
2026-01-06 00:28:31.789180 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})
2026-01-06 00:28:31.789248 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})
2026-01-06 00:28:31.789261 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:28:31.789273 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-01-06 00:28:31.789284 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-01-06 00:28:31.789295 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-01-06 00:28:31.789305 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})
2026-01-06 00:28:31.789316 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})
2026-01-06 00:28:31.789327 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})
2026-01-06 00:28:31.789338 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:28:31.789349 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.wmem_max', 'value': 16777216})
2026-01-06 00:28:31.789360 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.rmem_max', 'value': 16777216})
2026-01-06 00:28:31.789370 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})
2026-01-06 00:28:31.789381 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})
2026-01-06 00:28:31.789392 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-01-06 00:28:31.789403 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-01-06 00:28:31.789413 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-01-06 00:28:31.789424 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})
2026-01-06 00:28:31.789435 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:28:31.789446 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})
2026-01-06 00:28:31.789456 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})
2026-01-06 00:28:31.789467 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.wmem_max', 'value': 16777216})
2026-01-06 00:28:31.789477 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.rmem_max', 'value': 16777216})
2026-01-06 00:28:31.789488 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})
2026-01-06 00:28:31.789499 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})
2026-01-06 00:28:31.789509 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-01-06 00:28:31.789520 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-01-06 00:28:31.789530 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-01-06 00:28:31.789541 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:28:31.789552 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})
2026-01-06 00:28:31.789563 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})
2026-01-06 00:28:31.789573 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})
2026-01-06 00:28:31.789584 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})
2026-01-06 00:28:31.789595 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})
2026-01-06 00:28:31.789646 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})
2026-01-06 00:28:31.789667 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})
2026-01-06 00:28:31.789678 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})
2026-01-06 00:28:31.789689 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})
2026-01-06 00:28:31.789700 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.wmem_max', 'value': 16777216})
2026-01-06 00:28:31.789719 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.wmem_max', 'value': 16777216})
2026-01-06 00:28:31.789738 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.wmem_max', 'value': 16777216})
2026-01-06 00:28:31.789759 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.rmem_max', 'value': 16777216})
2026-01-06 00:28:31.789779 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.rmem_max', 'value': 16777216})
2026-01-06 00:28:31.789795 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.rmem_max', 'value': 16777216})
2026-01-06 00:28:31.789805 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})
2026-01-06 00:28:31.789816 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})
2026-01-06 00:28:31.789826 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})
2026-01-06 00:28:31.789837 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})
2026-01-06 00:28:31.789848 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})
2026-01-06 00:28:31.789859 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})
2026-01-06 00:28:31.789872 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-01-06 00:28:31.789891 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-01-06 00:28:31.789910 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-01-06 00:28:31.789929 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-01-06 00:28:31.789947 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-01-06 00:28:31.789966 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-01-06 00:28:31.789985 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-01-06 00:28:31.790002 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-01-06 00:28:31.790013 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-01-06 00:28:31.790095 | orchestrator |
2026-01-06 00:28:31.790118 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on generic] *****************
2026-01-06 00:28:31.790163 | orchestrator | Tuesday 06 January 2026 00:28:30 +0000 (0:00:04.803) 0:03:34.871 *******
2026-01-06 00:28:31.790175 | orchestrator | changed: [testbed-manager] => (item={'name': 'vm.swappiness', 'value': 1})
2026-01-06 00:28:31.790186 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.swappiness', 'value': 1})
2026-01-06 00:28:31.790197 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.swappiness', 'value': 1})
2026-01-06 00:28:31.790207 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.swappiness', 'value': 1})
2026-01-06 00:28:31.790218 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.swappiness', 'value': 1})
2026-01-06 00:28:31.790229 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.swappiness', 'value': 1})
2026-01-06 00:28:31.790239 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.swappiness', 'value': 1})
2026-01-06 00:28:31.790260 | orchestrator |
2026-01-06 00:28:31.790271 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on compute] *****************
2026-01-06 00:28:31.790282 | orchestrator | Tuesday 06 January 2026 00:28:31 +0000 (0:00:00.579) 0:03:35.451 *******
2026-01-06 00:28:31.790292 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:31.790303 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:28:31.790314 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:31.790332 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:31.790351 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:28:31.790446 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:28:31.790466 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:31.790483 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:28:31.790495 | orchestrator | changed: [testbed-node-3] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:31.790513 | orchestrator | changed: [testbed-node-4] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:31.790537 | orchestrator | changed: [testbed-node-5] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:46.737281 | orchestrator |
2026-01-06 00:28:46.737410 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on network] *****************
2026-01-06 00:28:46.737428 | orchestrator | Tuesday 06 January 2026 00:28:31 +0000 (0:00:00.510) 0:03:35.961 *******
2026-01-06 00:28:46.737441 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:46.737454 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:46.737465 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:28:46.737477 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:46.737489 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:28:46.737500 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:46.737511 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:28:46.737522 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:28:46.737532 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:46.737543 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:46.737554 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-01-06 00:28:46.737565 | orchestrator |
2026-01-06 00:28:46.737576 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on k3s_node] ****************
2026-01-06 00:28:46.737587 | orchestrator | Tuesday 06 January 2026 00:28:32 +0000 (0:00:00.661) 0:03:36.623 *******
2026-01-06 00:28:46.737598 | orchestrator | skipping: [testbed-manager] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-01-06 00:28:46.737608 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:28:46.737623 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-01-06 00:28:46.737648 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-01-06 00:28:46.737677 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:28:46.737695 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:28:46.737714 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-01-06 00:28:46.737733 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:28:46.737753 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-01-06 00:28:46.737813 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-01-06 00:28:46.737834 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-01-06 00:28:46.737854 | orchestrator |
2026-01-06 00:28:46.737874 | orchestrator | TASK [osism.commons.limits : Include limits tasks] *****************************
2026-01-06 00:28:46.737893 | orchestrator | Tuesday 06 January 2026 00:28:33 +0000 (0:00:00.338) 0:03:37.255 *******
2026-01-06 00:28:46.737913 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:28:46.737933 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:28:46.737953 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:28:46.737974 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:28:46.737993 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:28:46.738013 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:28:46.738088 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:28:46.738110 | orchestrator |
2026-01-06 00:28:46.738129 | orchestrator | TASK [osism.commons.services : Populate service facts] *************************
2026-01-06 00:28:46.738179 | orchestrator | Tuesday 06 January 2026 00:28:33 +0000 (0:00:00.338) 0:03:37.594 *******
2026-01-06 00:28:46.738197 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:28:46.738215 | orchestrator | ok: [testbed-manager]
2026-01-06 00:28:46.738231 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:28:46.738248 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:28:46.738265 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:28:46.738282 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:28:46.738300 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:28:46.738318 | orchestrator |
2026-01-06 00:28:46.738337 | orchestrator | TASK [osism.commons.services : Check services] *********************************
2026-01-06 00:28:46.738355 | orchestrator | Tuesday 06 January 2026 00:28:39 +0000 (0:00:05.968) 0:03:43.562 *******
2026-01-06 00:28:46.738374 | orchestrator | skipping: [testbed-manager] => (item=nscd)
2026-01-06 00:28:46.738393 | orchestrator | skipping: [testbed-node-3] => (item=nscd)
2026-01-06 00:28:46.738413 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:28:46.738431 | orchestrator | skipping: [testbed-node-4] => (item=nscd)
2026-01-06 00:28:46.738449 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:28:46.738468 | orchestrator | skipping: [testbed-node-5] => (item=nscd)
2026-01-06 00:28:46.738486 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:28:46.738504 | orchestrator | skipping: [testbed-node-0] => (item=nscd)
2026-01-06 00:28:46.738523 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:28:46.738541 | orchestrator | skipping: [testbed-node-1] => (item=nscd)
2026-01-06 00:28:46.738559 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:28:46.738578 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:28:46.738596 | orchestrator | skipping: [testbed-node-2] => (item=nscd)
2026-01-06 00:28:46.738615 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:28:46.738634 | orchestrator |
2026-01-06 00:28:46.738652 | orchestrator | TASK [osism.commons.services : Start/enable required services] *****************
2026-01-06 00:28:46.738671 | orchestrator | Tuesday 06 January 2026 00:28:39 +0000 (0:00:00.359) 0:03:43.922 *******
2026-01-06 00:28:46.738690 | orchestrator | ok: [testbed-manager] => (item=cron)
2026-01-06 00:28:46.738709 | orchestrator | ok: [testbed-node-3] => (item=cron)
2026-01-06 00:28:46.738728 | orchestrator | ok: [testbed-node-4] => (item=cron)
2026-01-06 00:28:46.738772 | orchestrator | ok: [testbed-node-5] => (item=cron)
2026-01-06 00:28:46.738791 | orchestrator | ok: [testbed-node-0] => (item=cron)
2026-01-06 00:28:46.738808 | orchestrator | ok: [testbed-node-1] => (item=cron)
2026-01-06 00:28:46.738827 | orchestrator | ok: [testbed-node-2] => (item=cron)
2026-01-06 00:28:46.738845 | orchestrator |
2026-01-06 00:28:46.738862 | orchestrator | TASK [osism.commons.motd : Include distribution specific configure tasks] ******
2026-01-06 00:28:46.738874 | orchestrator | Tuesday 06 January 2026 00:28:40 +0000 (0:00:01.053) 0:03:44.975 *******
2026-01-06 00:28:46.738936 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/motd/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:28:46.738978 | orchestrator |
2026-01-06 00:28:46.738990 | orchestrator | TASK [osism.commons.motd : Remove update-motd package] *************************
2026-01-06 00:28:46.739001 | orchestrator | Tuesday 06 January 2026 00:28:41 +0000 (0:00:00.581) 0:03:45.556 *******
2026-01-06 00:28:46.739012 | orchestrator | ok: [testbed-manager]
2026-01-06 00:28:46.739023 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:28:46.739033 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:28:46.739044 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:28:46.739055 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:28:46.739066 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:28:46.739076 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:28:46.739087 | orchestrator |
2026-01-06 00:28:46.739098 | orchestrator | TASK [osism.commons.motd : Check if /etc/default/motd-news exists] *************
2026-01-06 00:28:46.739109 | orchestrator | Tuesday 06 January 2026 00:28:42 +0000 (0:00:01.464) 0:03:47.021
******* 2026-01-06 00:28:46.739119 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:28:46.739130 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:28:46.739171 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:28:46.739182 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:28:46.739192 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:28:46.739203 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:28:46.739213 | orchestrator | ok: [testbed-manager] 2026-01-06 00:28:46.739224 | orchestrator | 2026-01-06 00:28:46.739235 | orchestrator | TASK [osism.commons.motd : Disable the dynamic motd-news service] ************** 2026-01-06 00:28:46.739252 | orchestrator | Tuesday 06 January 2026 00:28:44 +0000 (0:00:01.393) 0:03:48.415 ******* 2026-01-06 00:28:46.739270 | orchestrator | changed: [testbed-manager] 2026-01-06 00:28:46.739289 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:28:46.739307 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:28:46.739325 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:28:46.739344 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:28:46.739364 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:28:46.739382 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:28:46.739400 | orchestrator | 2026-01-06 00:28:46.739436 | orchestrator | TASK [osism.commons.motd : Get all configuration files in /etc/pam.d] ********** 2026-01-06 00:28:46.739455 | orchestrator | Tuesday 06 January 2026 00:28:44 +0000 (0:00:00.658) 0:03:49.075 ******* 2026-01-06 00:28:46.739472 | orchestrator | ok: [testbed-manager] 2026-01-06 00:28:46.739488 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:28:46.739504 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:28:46.739521 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:28:46.739539 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:28:46.739554 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:28:46.739572 | orchestrator | ok: [testbed-node-2] 2026-01-06 
00:28:46.739590 | orchestrator | 2026-01-06 00:28:46.739608 | orchestrator | TASK [osism.commons.motd : Remove pam_motd.so rule] **************************** 2026-01-06 00:28:46.739625 | orchestrator | Tuesday 06 January 2026 00:28:45 +0000 (0:00:00.757) 0:03:49.833 ******* 2026-01-06 00:28:46.739647 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1767657896.3393033, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:46.739669 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1767657909.20879, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:46.739711 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1767657905.938133, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': 
True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:46.739799 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1767657899.8904555, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:51.681733 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1767657898.9308984, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:51.681862 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1767657903.4009075, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:51.681879 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': 
False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1767657909.3468742, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:51.681892 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:51.681924 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:51.681969 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 
1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:51.681999 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:51.682176 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:51.682197 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': 
False}) 2026-01-06 00:28:51.682208 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2026-01-06 00:28:51.682221 | orchestrator | 2026-01-06 00:28:51.682237 | orchestrator | TASK [osism.commons.motd : Copy motd file] ************************************* 2026-01-06 00:28:51.682252 | orchestrator | Tuesday 06 January 2026 00:28:46 +0000 (0:00:01.080) 0:03:50.913 ******* 2026-01-06 00:28:51.682266 | orchestrator | changed: [testbed-manager] 2026-01-06 00:28:51.682280 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:28:51.682292 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:28:51.682305 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:28:51.682318 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:28:51.682330 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:28:51.682344 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:28:51.682356 | orchestrator | 2026-01-06 00:28:51.682369 | orchestrator | TASK [osism.commons.motd : Copy issue file] ************************************ 2026-01-06 00:28:51.682390 | orchestrator | Tuesday 06 January 2026 00:28:47 +0000 (0:00:01.127) 0:03:52.041 ******* 2026-01-06 00:28:51.682401 | orchestrator | changed: [testbed-manager] 2026-01-06 00:28:51.682412 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:28:51.682423 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:28:51.682434 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:28:51.682445 | orchestrator | changed: 
[testbed-node-1] 2026-01-06 00:28:51.682456 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:28:51.682467 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:28:51.682478 | orchestrator | 2026-01-06 00:28:51.682488 | orchestrator | TASK [osism.commons.motd : Copy issue.net file] ******************************** 2026-01-06 00:28:51.682499 | orchestrator | Tuesday 06 January 2026 00:28:48 +0000 (0:00:01.129) 0:03:53.170 ******* 2026-01-06 00:28:51.682510 | orchestrator | changed: [testbed-manager] 2026-01-06 00:28:51.682521 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:28:51.682532 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:28:51.682543 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:28:51.682554 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:28:51.682564 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:28:51.682575 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:28:51.682586 | orchestrator | 2026-01-06 00:28:51.682597 | orchestrator | TASK [osism.commons.motd : Configure SSH to print the motd] ******************** 2026-01-06 00:28:51.682608 | orchestrator | Tuesday 06 January 2026 00:28:50 +0000 (0:00:01.167) 0:03:54.337 ******* 2026-01-06 00:28:51.682619 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:28:51.682630 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:28:51.682641 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:28:51.682652 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:28:51.682662 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:28:51.682673 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:28:51.682684 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:28:51.682695 | orchestrator | 2026-01-06 00:28:51.682706 | orchestrator | TASK [osism.commons.motd : Configure SSH to not print the motd] **************** 2026-01-06 00:28:51.682717 | orchestrator | Tuesday 06 January 2026 00:28:50 +0000 (0:00:00.313) 0:03:54.651 
******* 2026-01-06 00:28:51.682728 | orchestrator | ok: [testbed-manager] 2026-01-06 00:28:51.682740 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:28:51.682751 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:28:51.682762 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:28:51.682780 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:28:51.682792 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:28:51.682802 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:28:51.682813 | orchestrator | 2026-01-06 00:28:51.682824 | orchestrator | TASK [osism.services.rng : Include distribution specific install tasks] ******** 2026-01-06 00:28:51.682835 | orchestrator | Tuesday 06 January 2026 00:28:51 +0000 (0:00:00.751) 0:03:55.403 ******* 2026-01-06 00:28:51.682848 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rng/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:28:51.682862 | orchestrator | 2026-01-06 00:28:51.682873 | orchestrator | TASK [osism.services.rng : Install rng package] ******************************** 2026-01-06 00:28:51.682892 | orchestrator | Tuesday 06 January 2026 00:28:51 +0000 (0:00:00.456) 0:03:55.859 ******* 2026-01-06 00:30:11.931411 | orchestrator | ok: [testbed-manager] 2026-01-06 00:30:11.932394 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:30:11.932411 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:30:11.932420 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:30:11.932428 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:30:11.932436 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:30:11.932444 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:30:11.932452 | orchestrator | 2026-01-06 00:30:11.932462 | orchestrator | TASK [osism.services.rng : Remove haveged package] ***************************** 2026-01-06 00:30:11.932512 | 
orchestrator | Tuesday 06 January 2026 00:29:00 +0000 (0:00:08.684) 0:04:04.544 ******* 2026-01-06 00:30:11.932522 | orchestrator | ok: [testbed-manager] 2026-01-06 00:30:11.932530 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:30:11.932538 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:30:11.932546 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:30:11.932554 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:30:11.932562 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:30:11.932569 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:30:11.932577 | orchestrator | 2026-01-06 00:30:11.932586 | orchestrator | TASK [osism.services.rng : Manage rng service] ********************************* 2026-01-06 00:30:11.932594 | orchestrator | Tuesday 06 January 2026 00:29:01 +0000 (0:00:01.232) 0:04:05.776 ******* 2026-01-06 00:30:11.932602 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:30:11.932609 | orchestrator | ok: [testbed-manager] 2026-01-06 00:30:11.932617 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:30:11.932625 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:30:11.932633 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:30:11.932641 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:30:11.932649 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:30:11.932657 | orchestrator | 2026-01-06 00:30:11.932665 | orchestrator | TASK [osism.commons.cleanup : Gather variables for each operating system] ****** 2026-01-06 00:30:11.932708 | orchestrator | Tuesday 06 January 2026 00:29:02 +0000 (0:00:01.142) 0:04:06.919 ******* 2026-01-06 00:30:11.932717 | orchestrator | ok: [testbed-manager] 2026-01-06 00:30:11.932725 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:30:11.932733 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:30:11.932741 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:30:11.932749 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:30:11.932756 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:30:11.932764 | 
orchestrator | ok: [testbed-node-2] 2026-01-06 00:30:11.932772 | orchestrator | 2026-01-06 00:30:11.932780 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_packages_distribution variable to default value] *** 2026-01-06 00:30:11.932798 | orchestrator | Tuesday 06 January 2026 00:29:03 +0000 (0:00:00.295) 0:04:07.215 ******* 2026-01-06 00:30:11.932807 | orchestrator | ok: [testbed-manager] 2026-01-06 00:30:11.932814 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:30:11.932822 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:30:11.932830 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:30:11.932838 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:30:11.932846 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:30:11.932854 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:30:11.932861 | orchestrator | 2026-01-06 00:30:11.932869 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_services_distribution variable to default value] *** 2026-01-06 00:30:11.932878 | orchestrator | Tuesday 06 January 2026 00:29:03 +0000 (0:00:00.310) 0:04:07.525 ******* 2026-01-06 00:30:11.932886 | orchestrator | ok: [testbed-manager] 2026-01-06 00:30:11.932893 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:30:11.932901 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:30:11.932909 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:30:11.932917 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:30:11.932925 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:30:11.932932 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:30:11.932940 | orchestrator | 2026-01-06 00:30:11.932948 | orchestrator | TASK [osism.commons.cleanup : Populate service facts] ************************** 2026-01-06 00:30:11.932956 | orchestrator | Tuesday 06 January 2026 00:29:03 +0000 (0:00:00.310) 0:04:07.836 ******* 2026-01-06 00:30:11.932964 | orchestrator | ok: [testbed-manager] 2026-01-06 00:30:11.932972 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:30:11.932980 | 
orchestrator | ok: [testbed-node-3] 2026-01-06 00:30:11.932988 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:30:11.932996 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:30:11.933004 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:30:11.933011 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:30:11.933019 | orchestrator | 2026-01-06 00:30:11.933027 | orchestrator | TASK [osism.commons.cleanup : Include distribution specific timer tasks] ******* 2026-01-06 00:30:11.933050 | orchestrator | Tuesday 06 January 2026 00:29:09 +0000 (0:00:05.946) 0:04:13.782 ******* 2026-01-06 00:30:11.933872 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/timers-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:30:11.933897 | orchestrator | 2026-01-06 00:30:11.933906 | orchestrator | TASK [osism.commons.cleanup : Disable apt-daily timers] ************************ 2026-01-06 00:30:11.933914 | orchestrator | Tuesday 06 January 2026 00:29:10 +0000 (0:00:00.410) 0:04:14.193 ******* 2026-01-06 00:30:11.933922 | orchestrator | skipping: [testbed-manager] => (item=apt-daily-upgrade)  2026-01-06 00:30:11.933930 | orchestrator | skipping: [testbed-manager] => (item=apt-daily)  2026-01-06 00:30:11.933937 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily-upgrade)  2026-01-06 00:30:11.933944 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily)  2026-01-06 00:30:11.933951 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:30:11.933958 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily-upgrade)  2026-01-06 00:30:11.933965 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily)  2026-01-06 00:30:11.933972 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:30:11.933978 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily-upgrade)  2026-01-06 00:30:11.933985 | 
orchestrator | skipping: [testbed-node-5] => (item=apt-daily)  2026-01-06 00:30:11.933992 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:30:11.933998 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily-upgrade)  2026-01-06 00:30:11.934005 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily)  2026-01-06 00:30:11.934012 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:30:11.934045 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily-upgrade)  2026-01-06 00:30:11.934052 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily)  2026-01-06 00:30:11.934079 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:30:11.934087 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:30:11.934093 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily-upgrade)  2026-01-06 00:30:11.934100 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily)  2026-01-06 00:30:11.934107 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:30:11.934114 | orchestrator | 2026-01-06 00:30:11.934121 | orchestrator | TASK [osism.commons.cleanup : Include service tasks] *************************** 2026-01-06 00:30:11.934127 | orchestrator | Tuesday 06 January 2026 00:29:10 +0000 (0:00:00.391) 0:04:14.584 ******* 2026-01-06 00:30:11.934134 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/services-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:30:11.934141 | orchestrator | 2026-01-06 00:30:11.934166 | orchestrator | TASK [osism.commons.cleanup : Cleanup services] ******************************** 2026-01-06 00:30:11.934174 | orchestrator | Tuesday 06 January 2026 00:29:10 +0000 (0:00:00.405) 0:04:14.990 ******* 2026-01-06 00:30:11.934180 | orchestrator | skipping: [testbed-manager] => (item=ModemManager.service)  2026-01-06 00:30:11.934187 | orchestrator | 
skipping: [testbed-node-3] => (item=ModemManager.service)  2026-01-06 00:30:11.934194 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:30:11.934201 | orchestrator | skipping: [testbed-node-4] => (item=ModemManager.service)  2026-01-06 00:30:11.934207 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:30:11.934214 | orchestrator | skipping: [testbed-node-5] => (item=ModemManager.service)  2026-01-06 00:30:11.934221 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:30:11.934227 | orchestrator | skipping: [testbed-node-0] => (item=ModemManager.service)  2026-01-06 00:30:11.934234 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:30:11.934240 | orchestrator | skipping: [testbed-node-1] => (item=ModemManager.service)  2026-01-06 00:30:11.934247 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:30:11.934264 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:30:11.934270 | orchestrator | skipping: [testbed-node-2] => (item=ModemManager.service)  2026-01-06 00:30:11.934277 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:30:11.934283 | orchestrator | 2026-01-06 00:30:11.934302 | orchestrator | TASK [osism.commons.cleanup : Include packages tasks] ************************** 2026-01-06 00:30:11.934309 | orchestrator | Tuesday 06 January 2026 00:29:11 +0000 (0:00:00.326) 0:04:15.317 ******* 2026-01-06 00:30:11.934316 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/packages-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:30:11.934323 | orchestrator | 2026-01-06 00:30:11.934330 | orchestrator | TASK [osism.commons.cleanup : Cleanup installed packages] ********************** 2026-01-06 00:30:11.934336 | orchestrator | Tuesday 06 January 2026 00:29:11 +0000 (0:00:00.422) 0:04:15.739 ******* 2026-01-06 00:30:11.934343 | orchestrator | changed: [testbed-node-4] 
2026-01-06 00:30:11.934349 | orchestrator | changed: [testbed-manager]
2026-01-06 00:30:11.934356 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:30:11.934363 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:30:11.934369 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:30:11.934376 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:30:11.934382 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:30:11.934389 | orchestrator |
2026-01-06 00:30:11.934395 | orchestrator | TASK [osism.commons.cleanup : Remove cloudinit package] ************************
2026-01-06 00:30:11.934402 | orchestrator | Tuesday 06 January 2026 00:29:47 +0000 (0:00:35.796) 0:04:51.535 *******
2026-01-06 00:30:11.934408 | orchestrator | changed: [testbed-manager]
2026-01-06 00:30:11.934415 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:30:11.934422 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:30:11.934428 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:30:11.934435 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:30:11.934441 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:30:11.934448 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:30:11.934454 | orchestrator |
2026-01-06 00:30:11.934461 | orchestrator | TASK [osism.commons.cleanup : Uninstall unattended-upgrades package] ***********
2026-01-06 00:30:11.934467 | orchestrator | Tuesday 06 January 2026 00:29:55 +0000 (0:00:08.558) 0:05:00.093 *******
2026-01-06 00:30:11.934474 | orchestrator | changed: [testbed-manager]
2026-01-06 00:30:11.934480 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:30:11.934487 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:30:11.934494 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:30:11.934500 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:30:11.934507 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:30:11.934513 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:30:11.934520 | orchestrator |
2026-01-06 00:30:11.934526 | orchestrator | TASK [osism.commons.cleanup : Remove useless packages from the cache] **********
2026-01-06 00:30:11.934533 | orchestrator | Tuesday 06 January 2026 00:30:04 +0000 (0:00:08.317) 0:05:08.411 *******
2026-01-06 00:30:11.934542 | orchestrator | ok: [testbed-manager]
2026-01-06 00:30:11.934549 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:30:11.934556 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:30:11.934562 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:30:11.934569 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:30:11.934575 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:30:11.934582 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:30:11.934588 | orchestrator |
2026-01-06 00:30:11.934595 | orchestrator | TASK [osism.commons.cleanup : Remove dependencies that are no longer required] ***
2026-01-06 00:30:11.934602 | orchestrator | Tuesday 06 January 2026 00:30:06 +0000 (0:00:01.949) 0:05:10.360 *******
2026-01-06 00:30:11.934609 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:30:11.934615 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:30:11.934622 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:30:11.934628 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:30:11.934639 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:30:11.934646 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:30:11.934652 | orchestrator | changed: [testbed-manager]
2026-01-06 00:30:11.934659 | orchestrator |
2026-01-06 00:30:11.934671 | orchestrator | TASK [osism.commons.cleanup : Include cloudinit tasks] *************************
2026-01-06 00:30:23.624287 | orchestrator | Tuesday 06 January 2026 00:30:11 +0000 (0:00:05.731) 0:05:16.092 *******
2026-01-06 00:30:23.625625 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/cloudinit.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:30:23.626847 | orchestrator |
2026-01-06 00:30:23.626864 | orchestrator | TASK [osism.commons.cleanup : Remove cloud-init configuration directory] *******
2026-01-06 00:30:23.626871 | orchestrator | Tuesday 06 January 2026 00:30:12 +0000 (0:00:00.566) 0:05:16.659 *******
2026-01-06 00:30:23.626878 | orchestrator | changed: [testbed-manager]
2026-01-06 00:30:23.626894 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:30:23.626900 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:30:23.626906 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:30:23.626912 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:30:23.626918 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:30:23.626924 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:30:23.626930 | orchestrator |
2026-01-06 00:30:23.626936 | orchestrator | TASK [osism.commons.timezone : Install tzdata package] *************************
2026-01-06 00:30:23.626942 | orchestrator | Tuesday 06 January 2026 00:30:13 +0000 (0:00:00.751) 0:05:17.410 *******
2026-01-06 00:30:23.626948 | orchestrator | ok: [testbed-manager]
2026-01-06 00:30:23.626954 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:30:23.626960 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:30:23.626966 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:30:23.626971 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:30:23.626977 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:30:23.626982 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:30:23.626988 | orchestrator |
2026-01-06 00:30:23.626994 | orchestrator | TASK [osism.commons.timezone : Set timezone to UTC] ****************************
2026-01-06 00:30:23.627000 | orchestrator | Tuesday 06 January 2026 00:30:15 +0000 (0:00:01.886) 0:05:19.297 *******
2026-01-06 00:30:23.627005 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:30:23.627012 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:30:23.627017 | orchestrator | changed: [testbed-manager]
2026-01-06 00:30:23.627023 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:30:23.627029 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:30:23.627035 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:30:23.627041 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:30:23.627046 | orchestrator |
2026-01-06 00:30:23.627052 | orchestrator | TASK [osism.commons.timezone : Create /etc/adjtime file] ***********************
2026-01-06 00:30:23.627057 | orchestrator | Tuesday 06 January 2026 00:30:15 +0000 (0:00:00.847) 0:05:20.144 *******
2026-01-06 00:30:23.627062 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:30:23.627067 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:30:23.627072 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:30:23.627076 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:30:23.627081 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:30:23.627086 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:30:23.627091 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:30:23.627096 | orchestrator |
2026-01-06 00:30:23.627101 | orchestrator | TASK [osism.commons.timezone : Ensure UTC in /etc/adjtime] *********************
2026-01-06 00:30:23.627106 | orchestrator | Tuesday 06 January 2026 00:30:16 +0000 (0:00:00.295) 0:05:20.440 *******
2026-01-06 00:30:23.627110 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:30:23.627115 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:30:23.627120 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:30:23.627125 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:30:23.627129 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:30:23.627203 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:30:23.627210 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:30:23.627215 | orchestrator |
2026-01-06 00:30:23.627219 | orchestrator | TASK [osism.services.docker : Gather variables for each operating system] ******
2026-01-06 00:30:23.627224 | orchestrator | Tuesday 06 January 2026 00:30:16 +0000 (0:00:00.422) 0:05:20.862 *******
2026-01-06 00:30:23.627229 | orchestrator | ok: [testbed-manager]
2026-01-06 00:30:23.627234 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:30:23.627239 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:30:23.627243 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:30:23.627248 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:30:23.627253 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:30:23.627257 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:30:23.627262 | orchestrator |
2026-01-06 00:30:23.627267 | orchestrator | TASK [osism.services.docker : Set docker_version variable to default value] ****
2026-01-06 00:30:23.627272 | orchestrator | Tuesday 06 January 2026 00:30:16 +0000 (0:00:00.315) 0:05:21.178 *******
2026-01-06 00:30:23.627276 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:30:23.627281 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:30:23.627286 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:30:23.627290 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:30:23.627295 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:30:23.627300 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:30:23.627304 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:30:23.627309 | orchestrator |
2026-01-06 00:30:23.627314 | orchestrator | TASK [osism.services.docker : Set docker_cli_version variable to default value] ***
2026-01-06 00:30:23.627334 | orchestrator | Tuesday 06 January 2026 00:30:17 +0000 (0:00:00.318) 0:05:21.496 *******
2026-01-06 00:30:23.627339 | orchestrator | ok: [testbed-manager]
2026-01-06 00:30:23.627344 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:30:23.627348 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:30:23.627353 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:30:23.627358 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:30:23.627362 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:30:23.627367 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:30:23.627372 | orchestrator |
2026-01-06 00:30:23.627376 | orchestrator | TASK [osism.services.docker : Print used docker version] ***********************
2026-01-06 00:30:23.627381 | orchestrator | Tuesday 06 January 2026 00:30:17 +0000 (0:00:00.302) 0:05:21.799 *******
2026-01-06 00:30:23.627386 | orchestrator | ok: [testbed-manager] =>
2026-01-06 00:30:23.627391 | orchestrator |   docker_version: 5:27.5.1
2026-01-06 00:30:23.627395 | orchestrator | ok: [testbed-node-3] =>
2026-01-06 00:30:23.627400 | orchestrator |   docker_version: 5:27.5.1
2026-01-06 00:30:23.627405 | orchestrator | ok: [testbed-node-4] =>
2026-01-06 00:30:23.627410 | orchestrator |   docker_version: 5:27.5.1
2026-01-06 00:30:23.627414 | orchestrator | ok: [testbed-node-5] =>
2026-01-06 00:30:23.627419 | orchestrator |   docker_version: 5:27.5.1
2026-01-06 00:30:23.627451 | orchestrator | ok: [testbed-node-0] =>
2026-01-06 00:30:23.627456 | orchestrator |   docker_version: 5:27.5.1
2026-01-06 00:30:23.627461 | orchestrator | ok: [testbed-node-1] =>
2026-01-06 00:30:23.627466 | orchestrator |   docker_version: 5:27.5.1
2026-01-06 00:30:23.627471 | orchestrator | ok: [testbed-node-2] =>
2026-01-06 00:30:23.627475 | orchestrator |   docker_version: 5:27.5.1
2026-01-06 00:30:23.627480 | orchestrator |
2026-01-06 00:30:23.627485 | orchestrator | TASK [osism.services.docker : Print used docker cli version] *******************
2026-01-06 00:30:23.627490 | orchestrator | Tuesday 06 January 2026 00:30:17 +0000 (0:00:00.333) 0:05:22.133 *******
2026-01-06 00:30:23.627494 | orchestrator | ok: [testbed-manager] =>
2026-01-06 00:30:23.627499 | orchestrator |   docker_cli_version: 5:27.5.1
2026-01-06 00:30:23.627504 | orchestrator | ok: [testbed-node-3] =>
2026-01-06 00:30:23.627509 | orchestrator |   docker_cli_version: 5:27.5.1
2026-01-06 00:30:23.627513 | orchestrator | ok: [testbed-node-4] =>
2026-01-06 00:30:23.627518 | orchestrator |   docker_cli_version: 5:27.5.1
2026-01-06 00:30:23.627523 | orchestrator | ok: [testbed-node-5] =>
2026-01-06 00:30:23.628656 | orchestrator |   docker_cli_version: 5:27.5.1
2026-01-06 00:30:23.628674 | orchestrator | ok: [testbed-node-0] =>
2026-01-06 00:30:23.628680 | orchestrator |   docker_cli_version: 5:27.5.1
2026-01-06 00:30:23.628685 | orchestrator | ok: [testbed-node-1] =>
2026-01-06 00:30:23.628689 | orchestrator |   docker_cli_version: 5:27.5.1
2026-01-06 00:30:23.628694 | orchestrator | ok: [testbed-node-2] =>
2026-01-06 00:30:23.628698 | orchestrator |   docker_cli_version: 5:27.5.1
2026-01-06 00:30:23.628703 | orchestrator |
2026-01-06 00:30:23.628708 | orchestrator | TASK [osism.services.docker : Include block storage tasks] *********************
2026-01-06 00:30:23.628713 | orchestrator | Tuesday 06 January 2026 00:30:18 +0000 (0:00:00.329) 0:05:22.462 *******
2026-01-06 00:30:23.628718 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:30:23.628722 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:30:23.628727 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:30:23.628732 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:30:23.628736 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:30:23.628741 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:30:23.628745 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:30:23.628749 | orchestrator |
2026-01-06 00:30:23.628754 | orchestrator | TASK [osism.services.docker : Include zram storage tasks] **********************
2026-01-06 00:30:23.628758 | orchestrator | Tuesday 06 January 2026 00:30:18 +0000 (0:00:00.283) 0:05:22.746 *******
2026-01-06 00:30:23.628763 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:30:23.628767 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:30:23.628772 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:30:23.628776 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:30:23.628781 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:30:23.628785 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:30:23.628790 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:30:23.628794 | orchestrator |
2026-01-06 00:30:23.628799 | orchestrator | TASK [osism.services.docker : Include docker install tasks] ********************
2026-01-06 00:30:23.628803 | orchestrator | Tuesday 06 January 2026 00:30:18 +0000 (0:00:00.305) 0:05:23.052 *******
2026-01-06 00:30:23.628811 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/install-docker-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:30:23.628818 | orchestrator |
2026-01-06 00:30:23.628822 | orchestrator | TASK [osism.services.docker : Remove old architecture-dependent repository] ****
2026-01-06 00:30:23.628827 | orchestrator | Tuesday 06 January 2026 00:30:19 +0000 (0:00:00.421) 0:05:23.473 *******
2026-01-06 00:30:23.628831 | orchestrator | ok: [testbed-manager]
2026-01-06 00:30:23.628837 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:30:23.628841 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:30:23.628846 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:30:23.628850 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:30:23.628855 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:30:23.628859 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:30:23.628864 | orchestrator |
2026-01-06 00:30:23.628868 | orchestrator | TASK [osism.services.docker : Gather package facts] ****************************
2026-01-06 00:30:23.628873 | orchestrator | Tuesday 06 January 2026 00:30:20 +0000 (0:00:01.018) 0:05:24.492 *******
2026-01-06 00:30:23.628881 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:30:23.628888 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:30:23.628895 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:30:23.628902 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:30:23.628909 | orchestrator | ok: [testbed-manager]
2026-01-06 00:30:23.628916 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:30:23.628923 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:30:23.628930 | orchestrator |
2026-01-06 00:30:23.628937 | orchestrator | TASK [osism.services.docker : Check whether packages are installed that should not be installed] ***
2026-01-06 00:30:23.628945 | orchestrator | Tuesday 06 January 2026 00:30:23 +0000 (0:00:02.874) 0:05:27.367 *******
2026-01-06 00:30:23.628960 | orchestrator | skipping: [testbed-manager] => (item=containerd)
2026-01-06 00:30:23.630084 | orchestrator | skipping: [testbed-manager] => (item=docker.io)
2026-01-06 00:30:23.630108 | orchestrator | skipping: [testbed-manager] => (item=docker-engine)
2026-01-06 00:30:23.630121 | orchestrator | skipping: [testbed-node-3] => (item=containerd)
2026-01-06 00:30:23.630126 | orchestrator | skipping: [testbed-node-3] => (item=docker.io)
2026-01-06 00:30:23.630131 | orchestrator | skipping: [testbed-node-3] => (item=docker-engine)
2026-01-06 00:30:23.630136 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:30:23.630141 | orchestrator | skipping: [testbed-node-4] => (item=containerd)
2026-01-06 00:30:23.630145 | orchestrator | skipping: [testbed-node-4] => (item=docker.io)
2026-01-06 00:30:23.630187 | orchestrator | skipping: [testbed-node-4] => (item=docker-engine)
2026-01-06 00:30:23.630192 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:30:23.630197 | orchestrator | skipping: [testbed-node-5] => (item=containerd)
2026-01-06 00:30:23.630202 | orchestrator | skipping: [testbed-node-5] => (item=docker.io)
2026-01-06 00:30:23.630206 | orchestrator | skipping: [testbed-node-5] => (item=docker-engine)
2026-01-06 00:30:23.630211 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:30:23.630216 | orchestrator | skipping: [testbed-node-0] => (item=containerd)
2026-01-06 00:30:23.630232 | orchestrator | skipping: [testbed-node-0] => (item=docker.io)
2026-01-06 00:31:28.002238 | orchestrator | skipping: [testbed-node-0] => (item=docker-engine)
2026-01-06 00:31:28.002430 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:31:28.002474 | orchestrator | skipping: [testbed-node-1] => (item=containerd)
2026-01-06 00:31:28.002487 | orchestrator | skipping: [testbed-node-1] => (item=docker.io)
2026-01-06 00:31:28.002498 | orchestrator | skipping: [testbed-node-1] => (item=docker-engine)
2026-01-06 00:31:28.002508 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:31:28.002518 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:31:28.002529 | orchestrator | skipping: [testbed-node-2] => (item=containerd)
2026-01-06 00:31:28.002538 | orchestrator | skipping: [testbed-node-2] => (item=docker.io)
2026-01-06 00:31:28.002548 | orchestrator | skipping: [testbed-node-2] => (item=docker-engine)
2026-01-06 00:31:28.002558 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:31:28.002568 | orchestrator |
2026-01-06 00:31:28.002579 | orchestrator | TASK [osism.services.docker : Install apt-transport-https package] *************
2026-01-06 00:31:28.002591 | orchestrator | Tuesday 06 January 2026 00:30:23 +0000 (0:00:00.643) 0:05:28.010 *******
2026-01-06 00:31:28.002601 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:28.002611 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:28.002621 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:28.002630 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:28.002640 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:28.002649 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:28.002659 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:28.002669 | orchestrator |
2026-01-06 00:31:28.002678 | orchestrator | TASK [osism.services.docker : Add repository gpg key] **************************
2026-01-06 00:31:28.002688 | orchestrator | Tuesday 06 January 2026 00:30:31 +0000 (0:00:07.437) 0:05:35.448 *******
2026-01-06 00:31:28.002698 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:28.002707 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:28.002717 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:28.002727 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:28.002736 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:28.002746 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:28.002756 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:28.002765 | orchestrator |
2026-01-06 00:31:28.002775 | orchestrator | TASK [osism.services.docker : Add repository] **********************************
2026-01-06 00:31:28.002785 | orchestrator | Tuesday 06 January 2026 00:30:32 +0000 (0:00:01.073) 0:05:36.521 *******
2026-01-06 00:31:28.002794 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:28.002829 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:28.002839 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:28.002849 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:28.002858 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:28.002868 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:28.002878 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:28.002887 | orchestrator |
2026-01-06 00:31:28.002897 | orchestrator | TASK [osism.services.docker : Update package cache] ****************************
2026-01-06 00:31:28.002907 | orchestrator | Tuesday 06 January 2026 00:30:40 +0000 (0:00:08.425) 0:05:44.947 *******
2026-01-06 00:31:28.002916 | orchestrator | changed: [testbed-manager]
2026-01-06 00:31:28.002926 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:28.002936 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:28.002945 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:28.002955 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:28.002965 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:28.002974 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:28.002984 | orchestrator |
2026-01-06 00:31:28.002993 | orchestrator | TASK [osism.services.docker : Pin docker package version] **********************
2026-01-06 00:31:28.003003 | orchestrator | Tuesday 06 January 2026 00:30:44 +0000 (0:00:03.654) 0:05:48.601 *******
2026-01-06 00:31:28.003013 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:28.003022 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:28.003032 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:28.003041 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:28.003051 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:28.003061 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:28.003070 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:28.003080 | orchestrator |
2026-01-06 00:31:28.003089 | orchestrator | TASK [osism.services.docker : Pin docker-cli package version] ******************
2026-01-06 00:31:28.003099 | orchestrator | Tuesday 06 January 2026 00:30:45 +0000 (0:00:01.355) 0:05:49.957 *******
2026-01-06 00:31:28.003109 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:28.003118 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:28.003128 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:28.003155 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:28.003165 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:28.003175 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:28.003184 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:28.003194 | orchestrator |
2026-01-06 00:31:28.003203 | orchestrator | TASK [osism.services.docker : Unlock containerd package] ***********************
2026-01-06 00:31:28.003213 | orchestrator | Tuesday 06 January 2026 00:30:47 +0000 (0:00:01.544) 0:05:51.502 *******
2026-01-06 00:31:28.003223 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:31:28.003233 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:31:28.003256 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:31:28.003267 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:31:28.003277 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:31:28.003286 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:31:28.003296 | orchestrator | changed: [testbed-manager]
2026-01-06 00:31:28.003305 | orchestrator |
2026-01-06 00:31:28.003315 | orchestrator | TASK [osism.services.docker : Install containerd package] **********************
2026-01-06 00:31:28.003325 | orchestrator | Tuesday 06 January 2026 00:30:48 +0000 (0:00:00.706) 0:05:52.208 *******
2026-01-06 00:31:28.003335 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:28.003344 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:28.003354 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:28.003363 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:28.003373 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:28.003382 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:28.003392 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:28.003401 | orchestrator |
2026-01-06 00:31:28.003411 | orchestrator | TASK [osism.services.docker : Lock containerd package] *************************
2026-01-06 00:31:28.003452 | orchestrator | Tuesday 06 January 2026 00:30:58 +0000 (0:00:10.630) 0:06:02.839 *******
2026-01-06 00:31:28.003463 | orchestrator | changed: [testbed-manager]
2026-01-06 00:31:28.003473 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:28.003482 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:28.003492 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:28.003501 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:28.003511 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:28.003520 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:28.003530 | orchestrator |
2026-01-06 00:31:28.003539 | orchestrator | TASK [osism.services.docker : Install docker-cli package] **********************
2026-01-06 00:31:28.003549 | orchestrator | Tuesday 06 January 2026 00:30:59 +0000 (0:00:00.988) 0:06:03.828 *******
2026-01-06 00:31:28.003559 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:28.003568 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:28.003578 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:28.003587 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:28.003596 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:28.003606 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:28.003615 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:28.003625 | orchestrator |
2026-01-06 00:31:28.003634 | orchestrator | TASK [osism.services.docker : Install docker package] **************************
2026-01-06 00:31:28.003644 | orchestrator | Tuesday 06 January 2026 00:31:09 +0000 (0:00:09.527) 0:06:13.355 *******
2026-01-06 00:31:28.003654 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:28.003663 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:28.003673 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:28.003682 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:28.003692 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:28.003701 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:28.003710 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:28.003720 | orchestrator |
2026-01-06 00:31:28.003730 | orchestrator | TASK [osism.services.docker : Unblock installation of python docker packages] ***
2026-01-06 00:31:28.003739 | orchestrator | Tuesday 06 January 2026 00:31:20 +0000 (0:00:11.686) 0:06:25.042 *******
2026-01-06 00:31:28.003749 | orchestrator | ok: [testbed-manager] => (item=python3-docker)
2026-01-06 00:31:28.003759 | orchestrator | ok: [testbed-node-3] => (item=python3-docker)
2026-01-06 00:31:28.003768 | orchestrator | ok: [testbed-node-4] => (item=python3-docker)
2026-01-06 00:31:28.003778 | orchestrator | ok: [testbed-node-5] => (item=python3-docker)
2026-01-06 00:31:28.003787 | orchestrator | ok: [testbed-node-0] => (item=python3-docker)
2026-01-06 00:31:28.003797 | orchestrator | ok: [testbed-manager] => (item=python-docker)
2026-01-06 00:31:28.003807 | orchestrator | ok: [testbed-node-1] => (item=python3-docker)
2026-01-06 00:31:28.003816 | orchestrator | ok: [testbed-node-3] => (item=python-docker)
2026-01-06 00:31:28.003826 | orchestrator | ok: [testbed-node-2] => (item=python3-docker)
2026-01-06 00:31:28.003835 | orchestrator | ok: [testbed-node-4] => (item=python-docker)
2026-01-06 00:31:28.003845 | orchestrator | ok: [testbed-node-5] => (item=python-docker)
2026-01-06 00:31:28.003855 | orchestrator | ok: [testbed-node-0] => (item=python-docker)
2026-01-06 00:31:28.003864 | orchestrator | ok: [testbed-node-1] => (item=python-docker)
2026-01-06 00:31:28.003874 | orchestrator | ok: [testbed-node-2] => (item=python-docker)
2026-01-06 00:31:28.003883 | orchestrator |
2026-01-06 00:31:28.003893 | orchestrator | TASK [osism.services.docker : Install python3 docker package] ******************
2026-01-06 00:31:28.003903 | orchestrator | Tuesday 06 January 2026 00:31:22 +0000 (0:00:01.279) 0:06:26.321 *******
2026-01-06 00:31:28.003912 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:31:28.003922 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:31:28.003931 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:31:28.003940 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:31:28.003950 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:31:28.003959 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:31:28.003976 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:31:28.003985 | orchestrator |
2026-01-06 00:31:28.003995 | orchestrator | TASK [osism.services.docker : Install python3 docker package from Debian Sid] ***
2026-01-06 00:31:28.004004 | orchestrator | Tuesday 06 January 2026 00:31:22 +0000 (0:00:00.580) 0:06:26.901 *******
2026-01-06 00:31:28.004014 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:28.004024 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:28.004033 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:28.004043 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:28.004052 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:28.004061 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:28.004071 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:28.004080 | orchestrator |
2026-01-06 00:31:28.004090 | orchestrator | TASK [osism.services.docker : Remove python docker packages (install python bindings from pip)] ***
2026-01-06 00:31:28.004115 | orchestrator | Tuesday 06 January 2026 00:31:26 +0000 (0:00:04.268) 0:06:31.169 *******
2026-01-06 00:31:28.004125 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:31:28.004134 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:31:28.004161 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:31:28.004171 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:31:28.004181 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:31:28.004190 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:31:28.004200 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:31:28.004210 | orchestrator |
2026-01-06 00:31:28.004220 | orchestrator | TASK [osism.services.docker : Block installation of python docker packages (install python bindings from pip)] ***
2026-01-06 00:31:28.004231 | orchestrator | Tuesday 06 January 2026 00:31:27 +0000 (0:00:00.530) 0:06:31.700 *******
2026-01-06 00:31:28.004241 | orchestrator | skipping: [testbed-manager] => (item=python3-docker)
2026-01-06 00:31:28.004251 | orchestrator | skipping: [testbed-manager] => (item=python-docker)
2026-01-06 00:31:28.004261 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:31:28.004270 | orchestrator | skipping: [testbed-node-3] => (item=python3-docker)
2026-01-06 00:31:28.004280 | orchestrator | skipping: [testbed-node-3] => (item=python-docker)
2026-01-06 00:31:28.004290 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:31:28.004299 | orchestrator | skipping: [testbed-node-4] => (item=python3-docker)
2026-01-06 00:31:28.004367 | orchestrator | skipping: [testbed-node-4] => (item=python-docker)
2026-01-06 00:31:28.004379 | orchestrator | skipping: [testbed-node-5] => (item=python3-docker)
2026-01-06 00:31:28.004397 | orchestrator | skipping: [testbed-node-5] => (item=python-docker)
2026-01-06 00:31:48.122071 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:31:48.122232 | orchestrator | skipping: [testbed-node-0] => (item=python3-docker)
2026-01-06 00:31:48.122249 | orchestrator | skipping: [testbed-node-0] => (item=python-docker)
2026-01-06 00:31:48.122261 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:31:48.122287 | orchestrator | skipping: [testbed-node-1] => (item=python3-docker)
2026-01-06 00:31:48.122309 | orchestrator | skipping: [testbed-node-1] => (item=python-docker)
2026-01-06 00:31:48.122321 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:31:48.122332 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:31:48.122344 | orchestrator | skipping: [testbed-node-2] => (item=python3-docker)
2026-01-06 00:31:48.122355 | orchestrator | skipping: [testbed-node-2] => (item=python-docker)
2026-01-06 00:31:48.122366 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:31:48.122378 | orchestrator |
2026-01-06 00:31:48.122390 | orchestrator | TASK [osism.services.docker : Install python3-pip package (install python bindings from pip)] ***
2026-01-06 00:31:48.122403 | orchestrator | Tuesday 06 January 2026 00:31:28 +0000 (0:00:00.756) 0:06:32.456 *******
2026-01-06 00:31:48.122418 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:31:48.122437 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:31:48.122461 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:31:48.122488 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:31:48.122538 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:31:48.122559 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:31:48.122577 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:31:48.122595 | orchestrator |
2026-01-06 00:31:48.122613 | orchestrator | TASK [osism.services.docker : Install docker packages (install python bindings from pip)] ***
2026-01-06 00:31:48.122632 | orchestrator | Tuesday 06 January 2026 00:31:28 +0000 (0:00:00.533) 0:06:32.990 *******
2026-01-06 00:31:48.122652 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:31:48.122671 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:31:48.122689 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:31:48.122703 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:31:48.122715 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:31:48.122728 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:31:48.122740 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:31:48.122752 | orchestrator |
2026-01-06 00:31:48.122765 | orchestrator | TASK [osism.services.docker : Install packages required by docker login] *******
2026-01-06 00:31:48.122779 | orchestrator | Tuesday 06 January 2026 00:31:29 +0000 (0:00:00.521) 0:06:33.512 *******
2026-01-06 00:31:48.122792 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:31:48.122804 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:31:48.122817 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:31:48.122829 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:31:48.122842 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:31:48.122854 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:31:48.122866 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:31:48.122879 | orchestrator |
2026-01-06 00:31:48.122893 | orchestrator | TASK [osism.services.docker : Ensure that some packages are not installed] *****
2026-01-06 00:31:48.122907 | orchestrator | Tuesday 06 January 2026 00:31:29 +0000 (0:00:00.542) 0:06:34.055 *******
2026-01-06 00:31:48.122918 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:48.122929 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:31:48.122940 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:31:48.122951 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:31:48.122961 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:31:48.122972 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:31:48.122983 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:31:48.122993 | orchestrator |
2026-01-06 00:31:48.123004 | orchestrator | TASK [osism.services.docker : Include config tasks] ****************************
2026-01-06 00:31:48.123015 | orchestrator | Tuesday 06 January 2026 00:31:31 +0000 (0:00:02.097) 0:06:36.153 *******
2026-01-06 00:31:48.123027 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/config.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:31:48.123040 | orchestrator |
2026-01-06 00:31:48.123052 | orchestrator | TASK [osism.services.docker : Create plugins directory] ************************
2026-01-06 00:31:48.123063 | orchestrator | Tuesday 06 January 2026 00:31:32 +0000 (0:00:00.910) 0:06:37.063 *******
2026-01-06 00:31:48.123073 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:48.123085 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:48.123095 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:48.123106 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:48.123117 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:48.123127 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:48.123173 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:48.123184 | orchestrator |
2026-01-06 00:31:48.123195 | orchestrator | TASK [osism.services.docker : Create systemd overlay directory] ****************
2026-01-06 00:31:48.123206 | orchestrator | Tuesday 06 January 2026 00:31:33 +0000 (0:00:00.869) 0:06:37.933 *******
2026-01-06 00:31:48.123217 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:48.123241 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:48.123253 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:48.123263 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:48.123274 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:48.123294 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:48.123304 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:48.123315 | orchestrator |
2026-01-06 00:31:48.123326 | orchestrator | TASK [osism.services.docker : Copy systemd overlay file] ***********************
2026-01-06 00:31:48.123337 | orchestrator | Tuesday 06 January 2026 00:31:34 +0000 (0:00:00.850) 0:06:38.784 *******
2026-01-06 00:31:48.123348 | orchestrator | ok: [testbed-manager]
2026-01-06 00:31:48.123358 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:31:48.123369 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:31:48.123380 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:31:48.123390 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:31:48.123401 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:31:48.123411 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:31:48.123422 | orchestrator |
2026-01-06 00:31:48.123433 | orchestrator | TASK [osism.services.docker : Reload systemd daemon if systemd overlay
file is changed] *** 2026-01-06 00:31:48.123464 | orchestrator | Tuesday 06 January 2026 00:31:36 +0000 (0:00:01.686) 0:06:40.470 ******* 2026-01-06 00:31:48.123476 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:31:48.123486 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:31:48.123497 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:31:48.123508 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:31:48.123519 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:31:48.123529 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:31:48.123540 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:31:48.123551 | orchestrator | 2026-01-06 00:31:48.123562 | orchestrator | TASK [osism.services.docker : Copy limits configuration file] ****************** 2026-01-06 00:31:48.123572 | orchestrator | Tuesday 06 January 2026 00:31:37 +0000 (0:00:01.358) 0:06:41.828 ******* 2026-01-06 00:31:48.123583 | orchestrator | ok: [testbed-manager] 2026-01-06 00:31:48.123594 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:31:48.123604 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:31:48.123615 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:31:48.123626 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:31:48.123636 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:31:48.123647 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:31:48.123658 | orchestrator | 2026-01-06 00:31:48.123668 | orchestrator | TASK [osism.services.docker : Copy daemon.json configuration file] ************* 2026-01-06 00:31:48.123679 | orchestrator | Tuesday 06 January 2026 00:31:38 +0000 (0:00:01.345) 0:06:43.173 ******* 2026-01-06 00:31:48.123690 | orchestrator | changed: [testbed-manager] 2026-01-06 00:31:48.123701 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:31:48.123716 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:31:48.123734 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:31:48.123753 | orchestrator | changed: 
[testbed-node-0] 2026-01-06 00:31:48.123775 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:31:48.123792 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:31:48.123810 | orchestrator | 2026-01-06 00:31:48.123828 | orchestrator | TASK [osism.services.docker : Include service tasks] *************************** 2026-01-06 00:31:48.123848 | orchestrator | Tuesday 06 January 2026 00:31:40 +0000 (0:00:01.448) 0:06:44.622 ******* 2026-01-06 00:31:48.123868 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/service.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:31:48.123890 | orchestrator | 2026-01-06 00:31:48.123909 | orchestrator | TASK [osism.services.docker : Reload systemd daemon] *************************** 2026-01-06 00:31:48.123929 | orchestrator | Tuesday 06 January 2026 00:31:41 +0000 (0:00:01.198) 0:06:45.821 ******* 2026-01-06 00:31:48.123948 | orchestrator | ok: [testbed-manager] 2026-01-06 00:31:48.123969 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:31:48.123989 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:31:48.124010 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:31:48.124028 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:31:48.124048 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:31:48.124087 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:31:48.124099 | orchestrator | 2026-01-06 00:31:48.124110 | orchestrator | TASK [osism.services.docker : Manage service] ********************************** 2026-01-06 00:31:48.124121 | orchestrator | Tuesday 06 January 2026 00:31:43 +0000 (0:00:01.557) 0:06:47.379 ******* 2026-01-06 00:31:48.124161 | orchestrator | ok: [testbed-manager] 2026-01-06 00:31:48.124174 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:31:48.124189 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:31:48.124208 | orchestrator | ok: [testbed-node-5] 
2026-01-06 00:31:48.124229 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:31:48.124248 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:31:48.124267 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:31:48.124287 | orchestrator | 2026-01-06 00:31:48.124306 | orchestrator | TASK [osism.services.docker : Manage docker socket service] ******************** 2026-01-06 00:31:48.124327 | orchestrator | Tuesday 06 January 2026 00:31:44 +0000 (0:00:01.175) 0:06:48.555 ******* 2026-01-06 00:31:48.124347 | orchestrator | ok: [testbed-manager] 2026-01-06 00:31:48.124359 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:31:48.124370 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:31:48.124380 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:31:48.124391 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:31:48.124401 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:31:48.124412 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:31:48.124423 | orchestrator | 2026-01-06 00:31:48.124434 | orchestrator | TASK [osism.services.docker : Manage containerd service] *********************** 2026-01-06 00:31:48.124445 | orchestrator | Tuesday 06 January 2026 00:31:45 +0000 (0:00:01.126) 0:06:49.682 ******* 2026-01-06 00:31:48.124456 | orchestrator | ok: [testbed-manager] 2026-01-06 00:31:48.124467 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:31:48.124478 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:31:48.124489 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:31:48.124499 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:31:48.124510 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:31:48.124520 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:31:48.124531 | orchestrator | 2026-01-06 00:31:48.124542 | orchestrator | TASK [osism.services.docker : Include bootstrap tasks] ************************* 2026-01-06 00:31:48.124553 | orchestrator | Tuesday 06 January 2026 00:31:46 +0000 (0:00:01.385) 0:06:51.067 ******* 2026-01-06 00:31:48.124564 | 
orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/bootstrap.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:31:48.124575 | orchestrator | 2026-01-06 00:31:48.124587 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2026-01-06 00:31:48.124597 | orchestrator | Tuesday 06 January 2026 00:31:47 +0000 (0:00:00.925) 0:06:51.993 ******* 2026-01-06 00:31:48.124608 | orchestrator | 2026-01-06 00:31:48.124619 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2026-01-06 00:31:48.124630 | orchestrator | Tuesday 06 January 2026 00:31:47 +0000 (0:00:00.039) 0:06:52.033 ******* 2026-01-06 00:31:48.124641 | orchestrator | 2026-01-06 00:31:48.124652 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2026-01-06 00:31:48.124663 | orchestrator | Tuesday 06 January 2026 00:31:47 +0000 (0:00:00.039) 0:06:52.072 ******* 2026-01-06 00:31:48.124674 | orchestrator | 2026-01-06 00:31:48.124684 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2026-01-06 00:31:48.124695 | orchestrator | Tuesday 06 January 2026 00:31:47 +0000 (0:00:00.047) 0:06:52.120 ******* 2026-01-06 00:31:48.124706 | orchestrator | 2026-01-06 00:31:48.124727 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2026-01-06 00:32:14.545846 | orchestrator | Tuesday 06 January 2026 00:31:47 +0000 (0:00:00.040) 0:06:52.161 ******* 2026-01-06 00:32:14.545943 | orchestrator | 2026-01-06 00:32:14.545952 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2026-01-06 00:32:14.545959 | orchestrator | Tuesday 06 January 2026 00:31:48 +0000 (0:00:00.039) 0:06:52.200 ******* 2026-01-06 00:32:14.545991 | orchestrator | 
2026-01-06 00:32:14.545998 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2026-01-06 00:32:14.546005 | orchestrator | Tuesday 06 January 2026 00:31:48 +0000 (0:00:00.047) 0:06:52.248 ******* 2026-01-06 00:32:14.546011 | orchestrator | 2026-01-06 00:32:14.546053 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] ***** 2026-01-06 00:32:14.546060 | orchestrator | Tuesday 06 January 2026 00:31:48 +0000 (0:00:00.039) 0:06:52.288 ******* 2026-01-06 00:32:14.546067 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:32:14.546076 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:32:14.546082 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:32:14.546089 | orchestrator | 2026-01-06 00:32:14.546096 | orchestrator | RUNNING HANDLER [osism.services.rsyslog : Restart rsyslog service] ************* 2026-01-06 00:32:14.546103 | orchestrator | Tuesday 06 January 2026 00:31:49 +0000 (0:00:01.167) 0:06:53.455 ******* 2026-01-06 00:32:14.546110 | orchestrator | changed: [testbed-manager] 2026-01-06 00:32:14.546119 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:32:14.546148 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:32:14.546155 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:32:14.546162 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:32:14.546168 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:32:14.546175 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:32:14.546182 | orchestrator | 2026-01-06 00:32:14.546188 | orchestrator | RUNNING HANDLER [osism.services.rsyslog : Restart logrotate service] *********** 2026-01-06 00:32:14.546195 | orchestrator | Tuesday 06 January 2026 00:31:50 +0000 (0:00:01.333) 0:06:54.788 ******* 2026-01-06 00:32:14.546202 | orchestrator | changed: [testbed-manager] 2026-01-06 00:32:14.546208 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:32:14.546215 | orchestrator | changed: [testbed-node-4] 
2026-01-06 00:32:14.546228 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:32:14.546234 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:32:14.546240 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:32:14.546247 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:32:14.546253 | orchestrator |
2026-01-06 00:32:14.546259 | orchestrator | RUNNING HANDLER [osism.services.docker : Restart docker service] ***************
2026-01-06 00:32:14.546266 | orchestrator | Tuesday 06 January 2026 00:31:52 +0000 (0:00:01.447) 0:06:56.236 *******
2026-01-06 00:32:14.546272 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:32:14.546279 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:32:14.546285 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:32:14.546292 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:32:14.546298 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:32:14.546305 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:32:14.546311 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:32:14.546318 | orchestrator |
2026-01-06 00:32:14.546324 | orchestrator | RUNNING HANDLER [osism.services.docker : Wait after docker service restart] ****
2026-01-06 00:32:14.546330 | orchestrator | Tuesday 06 January 2026 00:31:54 +0000 (0:00:02.461) 0:06:58.697 *******
2026-01-06 00:32:14.546337 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:32:14.546343 | orchestrator |
2026-01-06 00:32:14.546350 | orchestrator | TASK [osism.services.docker : Add user to docker group] ************************
2026-01-06 00:32:14.546356 | orchestrator | Tuesday 06 January 2026 00:31:54 +0000 (0:00:00.118) 0:06:58.816 *******
2026-01-06 00:32:14.546363 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:14.546369 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:32:14.546376 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:32:14.546382 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:32:14.546389 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:32:14.546395 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:32:14.546402 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:32:14.546409 | orchestrator |
2026-01-06 00:32:14.546417 | orchestrator | TASK [osism.services.docker : Log into private registry and force re-authorization] ***
2026-01-06 00:32:14.546429 | orchestrator | Tuesday 06 January 2026 00:31:55 +0000 (0:00:01.112) 0:06:59.928 *******
2026-01-06 00:32:14.546435 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:32:14.546442 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:32:14.546448 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:32:14.546455 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:32:14.546461 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:32:14.546468 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:32:14.546474 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:32:14.546481 | orchestrator |
2026-01-06 00:32:14.546487 | orchestrator | TASK [osism.services.docker : Include facts tasks] *****************************
2026-01-06 00:32:14.546510 | orchestrator | Tuesday 06 January 2026 00:31:56 +0000 (0:00:00.546) 0:07:00.475 *******
2026-01-06 00:32:14.546518 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/facts.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:32:14.546525 | orchestrator |
2026-01-06 00:32:14.546531 | orchestrator | TASK [osism.services.docker : Create facts directory] **************************
2026-01-06 00:32:14.546538 | orchestrator | Tuesday 06 January 2026 00:31:57 +0000 (0:00:01.234) 0:07:01.710 *******
2026-01-06 00:32:14.546544 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:14.546551 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:32:14.546557 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:32:14.546564 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:32:14.546570 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:32:14.546576 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:32:14.546583 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:32:14.546589 | orchestrator |
2026-01-06 00:32:14.546596 | orchestrator | TASK [osism.services.docker : Copy docker fact files] **************************
2026-01-06 00:32:14.546603 | orchestrator | Tuesday 06 January 2026 00:31:58 +0000 (0:00:00.886) 0:07:02.596 *******
2026-01-06 00:32:14.546609 | orchestrator | ok: [testbed-manager] => (item=docker_containers)
2026-01-06 00:32:14.546631 | orchestrator | changed: [testbed-node-3] => (item=docker_containers)
2026-01-06 00:32:14.546638 | orchestrator | changed: [testbed-node-4] => (item=docker_containers)
2026-01-06 00:32:14.546645 | orchestrator | changed: [testbed-node-5] => (item=docker_containers)
2026-01-06 00:32:14.546651 | orchestrator | changed: [testbed-node-0] => (item=docker_containers)
2026-01-06 00:32:14.546658 | orchestrator | changed: [testbed-node-1] => (item=docker_containers)
2026-01-06 00:32:14.546664 | orchestrator | changed: [testbed-node-2] => (item=docker_containers)
2026-01-06 00:32:14.546671 | orchestrator | ok: [testbed-manager] => (item=docker_images)
2026-01-06 00:32:14.546678 | orchestrator | changed: [testbed-node-3] => (item=docker_images)
2026-01-06 00:32:14.546684 | orchestrator | changed: [testbed-node-4] => (item=docker_images)
2026-01-06 00:32:14.546691 | orchestrator | changed: [testbed-node-0] => (item=docker_images)
2026-01-06 00:32:14.546697 | orchestrator | changed: [testbed-node-5] => (item=docker_images)
2026-01-06 00:32:14.546703 | orchestrator | changed: [testbed-node-2] => (item=docker_images)
2026-01-06 00:32:14.546710 | orchestrator | changed: [testbed-node-1] => (item=docker_images)
2026-01-06 00:32:14.546717 | orchestrator |
2026-01-06 00:32:14.546723 | orchestrator | TASK [osism.commons.docker_compose : This install type is not supported] *******
2026-01-06 00:32:14.546730 | orchestrator | Tuesday 06 January 2026 00:32:00 +0000 (0:00:02.437) 0:07:05.033 *******
2026-01-06 00:32:14.546736 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:32:14.546742 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:32:14.546749 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:32:14.546755 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:32:14.546762 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:32:14.546768 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:32:14.546774 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:32:14.546781 | orchestrator |
2026-01-06 00:32:14.546793 | orchestrator | TASK [osism.commons.docker_compose : Include distribution specific install tasks] ***
2026-01-06 00:32:14.548201 | orchestrator | Tuesday 06 January 2026 00:32:01 +0000 (0:00:00.729) 0:07:05.763 *******
2026-01-06 00:32:14.548254 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/docker_compose/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:32:14.548261 | orchestrator |
2026-01-06 00:32:14.548269 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose apt preferences file] ***
2026-01-06 00:32:14.548277 | orchestrator | Tuesday 06 January 2026 00:32:02 +0000 (0:00:00.847) 0:07:06.611 *******
2026-01-06 00:32:14.548284 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:14.548291 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:32:14.548298 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:32:14.548304 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:32:14.548311 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:32:14.548317 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:32:14.548324 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:32:14.548330 | orchestrator |
2026-01-06 00:32:14.548336 | orchestrator | TASK [osism.commons.docker_compose : Get checksum of docker-compose file] ******
2026-01-06 00:32:14.548343 | orchestrator | Tuesday 06 January 2026 00:32:03 +0000 (0:00:00.789) 0:07:07.400 *******
2026-01-06 00:32:14.548349 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:14.548356 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:32:14.548362 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:32:14.548368 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:32:14.548375 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:32:14.548381 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:32:14.548387 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:32:14.548394 | orchestrator |
2026-01-06 00:32:14.548401 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose binary] *************
2026-01-06 00:32:14.548407 | orchestrator | Tuesday 06 January 2026 00:32:04 +0000 (0:00:00.891) 0:07:08.292 *******
2026-01-06 00:32:14.548414 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:32:14.548421 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:32:14.548427 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:32:14.548434 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:32:14.548441 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:32:14.548447 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:32:14.548453 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:32:14.548460 | orchestrator |
2026-01-06 00:32:14.548466 | orchestrator | TASK [osism.commons.docker_compose : Uninstall docker-compose package] *********
2026-01-06 00:32:14.548472 | orchestrator | Tuesday 06 January 2026 00:32:04 +0000 (0:00:00.451) 0:07:08.744 *******
2026-01-06 00:32:14.548479 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:14.548485 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:32:14.548491 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:32:14.548498 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:32:14.548504 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:32:14.548511 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:32:14.548517 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:32:14.548523 | orchestrator |
2026-01-06 00:32:14.548541 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose script] ***************
2026-01-06 00:32:14.548547 | orchestrator | Tuesday 06 January 2026 00:32:05 +0000 (0:00:01.412) 0:07:10.156 *******
2026-01-06 00:32:14.548553 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:32:14.548560 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:32:14.548566 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:32:14.548572 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:32:14.548578 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:32:14.548584 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:32:14.548589 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:32:14.548595 | orchestrator |
2026-01-06 00:32:14.548600 | orchestrator | TASK [osism.commons.docker_compose : Install docker-compose-plugin package] ****
2026-01-06 00:32:14.548617 | orchestrator | Tuesday 06 January 2026 00:32:06 +0000 (0:00:00.529) 0:07:10.685 *******
2026-01-06 00:32:14.548623 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:14.548630 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:32:14.548636 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:32:14.548642 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:32:14.548649 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:32:14.548655 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:32:14.548661 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:32:14.548680 | orchestrator |
2026-01-06 00:32:48.394852 | orchestrator | TASK [osism.commons.docker_compose : Copy osism.target systemd file] ***********
2026-01-06 00:32:48.394968 | orchestrator | Tuesday 06 January 2026 00:32:14 +0000 (0:00:08.029) 0:07:18.715 *******
2026-01-06 00:32:48.394984 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:48.394996 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:32:48.395007 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:32:48.395018 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:32:48.395028 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:32:48.395039 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:32:48.395051 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:32:48.395063 | orchestrator |
2026-01-06 00:32:48.395074 | orchestrator | TASK [osism.commons.docker_compose : Enable osism.target] **********************
2026-01-06 00:32:48.395085 | orchestrator | Tuesday 06 January 2026 00:32:16 +0000 (0:00:01.577) 0:07:20.292 *******
2026-01-06 00:32:48.395096 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:48.395144 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:32:48.395156 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:32:48.395167 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:32:48.395178 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:32:48.395189 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:32:48.395200 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:32:48.395211 | orchestrator |
2026-01-06 00:32:48.395223 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose systemd unit file] ****
2026-01-06 00:32:48.395234 | orchestrator | Tuesday 06 January 2026 00:32:17 +0000 (0:00:01.807) 0:07:22.100 *******
2026-01-06 00:32:48.395245 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:48.395256 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:32:48.395267 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:32:48.395278 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:32:48.395289 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:32:48.395300 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:32:48.395311 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:32:48.395322 | orchestrator |
2026-01-06 00:32:48.395333 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] *********************
2026-01-06 00:32:48.395344 | orchestrator | Tuesday 06 January 2026 00:32:19 +0000 (0:00:01.762) 0:07:23.863 *******
2026-01-06 00:32:48.395355 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:48.395366 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:32:48.395377 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:32:48.395391 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:32:48.395403 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:32:48.395416 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:32:48.395428 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:32:48.395441 | orchestrator |
2026-01-06 00:32:48.395454 | orchestrator | TASK [osism.commons.facts : Copy fact files] ***********************************
2026-01-06 00:32:48.395466 | orchestrator | Tuesday 06 January 2026 00:32:20 +0000 (0:00:00.876) 0:07:24.740 *******
2026-01-06 00:32:48.395479 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:32:48.395491 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:32:48.395504 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:32:48.395516 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:32:48.395529 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:32:48.395542 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:32:48.395555 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:32:48.395568 | orchestrator |
2026-01-06 00:32:48.395610 | orchestrator | TASK [osism.services.chrony : Check minimum and maximum number of servers] *****
2026-01-06 00:32:48.395623 | orchestrator | Tuesday 06 January 2026 00:32:21 +0000 (0:00:01.017) 0:07:25.757 *******
2026-01-06 00:32:48.395635 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:32:48.395647 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:32:48.395660 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:32:48.395672 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:32:48.395684 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:32:48.395697 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:32:48.395708 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:32:48.395721 | orchestrator |
2026-01-06 00:32:48.395734 | orchestrator | TASK [osism.services.chrony : Gather variables for each operating system] ******
2026-01-06 00:32:48.395745 | orchestrator | Tuesday 06 January 2026 00:32:22 +0000 (0:00:00.551) 0:07:26.308 *******
2026-01-06 00:32:48.395756 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:48.395767 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:32:48.395777 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:32:48.395788 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:32:48.395799 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:32:48.395809 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:32:48.395820 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:32:48.395832 | orchestrator |
2026-01-06 00:32:48.395843 | orchestrator | TASK [osism.services.chrony : Set chrony_conf_file variable to default value] ***
2026-01-06 00:32:48.395853 | orchestrator | Tuesday 06 January 2026 00:32:22 +0000 (0:00:00.555) 0:07:26.864 *******
2026-01-06 00:32:48.395864 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:48.395875 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:32:48.395886 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:32:48.395897 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:32:48.395907 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:32:48.395918 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:32:48.395929 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:32:48.395940 | orchestrator |
2026-01-06 00:32:48.395951 | orchestrator | TASK [osism.services.chrony : Set chrony_key_file variable to default value] ***
2026-01-06 00:32:48.395962 | orchestrator | Tuesday 06 January 2026 00:32:23 +0000 (0:00:00.573) 0:07:27.437 *******
2026-01-06 00:32:48.395973 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:48.395984 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:32:48.395994 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:32:48.396005 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:32:48.396016 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:32:48.396027 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:32:48.396037 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:32:48.396048 | orchestrator |
2026-01-06 00:32:48.396059 | orchestrator | TASK [osism.services.chrony : Populate service facts] **************************
2026-01-06 00:32:48.396070 | orchestrator | Tuesday 06 January 2026 00:32:24 +0000 (0:00:00.762) 0:07:28.200 *******
2026-01-06 00:32:48.396081 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:48.396091 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:32:48.396120 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:32:48.396131 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:32:48.396142 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:32:48.396152 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:32:48.396163 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:32:48.396174 | orchestrator |
2026-01-06 00:32:48.396201 | orchestrator | TASK [osism.services.chrony : Manage timesyncd service] ************************
2026-01-06 00:32:48.396213 | orchestrator | Tuesday 06 January 2026 00:32:29 +0000 (0:00:05.665) 0:07:33.866 *******
2026-01-06 00:32:48.396224 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:32:48.396235 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:32:48.396246 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:32:48.396256 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:32:48.396267 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:32:48.396278 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:32:48.396297 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:32:48.396311 | orchestrator |
2026-01-06 00:32:48.396355 | orchestrator | TASK [osism.services.chrony : Include distribution specific install tasks] *****
2026-01-06 00:32:48.396378 | orchestrator | Tuesday 06 January 2026 00:32:30 +0000 (0:00:00.594) 0:07:34.460 *******
2026-01-06 00:32:48.396399 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:32:48.396417 | orchestrator |
2026-01-06 00:32:48.396437 | orchestrator | TASK [osism.services.chrony : Install package] *********************************
2026-01-06 00:32:48.396456 | orchestrator | Tuesday 06 January 2026 00:32:31 +0000 (0:00:01.142) 0:07:35.603 *******
2026-01-06 00:32:48.396474 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:32:48.396490 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:48.396501 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:32:48.396512 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:32:48.396522 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:32:48.396533 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:32:48.396543 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:32:48.396554 | orchestrator |
2026-01-06 00:32:48.396565 | orchestrator | TASK [osism.services.chrony : Manage chrony service] ***************************
2026-01-06 00:32:48.396576 | orchestrator | Tuesday 06 January 2026 00:32:33 +0000 (0:00:02.041) 0:07:37.645 *******
2026-01-06 00:32:48.396587 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:48.396597 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:32:48.396608 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:32:48.396618 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:32:48.396629 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:32:48.396639 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:32:48.396650 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:32:48.396661 | orchestrator |
2026-01-06 00:32:48.396672 | orchestrator | TASK [osism.services.chrony : Check if configuration file exists] **************
2026-01-06 00:32:48.396683 | orchestrator | Tuesday 06 January 2026 00:32:34 +0000 (0:00:01.223) 0:07:38.869 *******
2026-01-06 00:32:48.396693 | orchestrator | ok: [testbed-manager]
2026-01-06 00:32:48.396704 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:32:48.396714 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:32:48.396726 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:32:48.396745 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:32:48.396762 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:32:48.396780 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:32:48.396796 | orchestrator |
2026-01-06 00:32:48.396813 | orchestrator | TASK [osism.services.chrony : Copy configuration file] *************************
2026-01-06 00:32:48.396832 | orchestrator | Tuesday 06 January 2026 00:32:35 +0000 (0:00:00.883) 0:07:39.753 *******
2026-01-06 00:32:48.396916 | orchestrator | changed: [testbed-manager] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2026-01-06 00:32:48.396939 | orchestrator | changed: [testbed-node-3] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2026-01-06 00:32:48.396957 | orchestrator | changed: [testbed-node-4] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2026-01-06 00:32:48.396975 | orchestrator | changed: [testbed-node-0] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2026-01-06 00:32:48.396993 | orchestrator | changed: [testbed-node-5] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2026-01-06 00:32:48.397011 | orchestrator | changed: [testbed-node-1] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2026-01-06 00:32:48.397029 | orchestrator | changed: [testbed-node-2] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2026-01-06 00:32:48.397059 | orchestrator |
2026-01-06 00:32:48.397086 | orchestrator | TASK [osism.services.lldpd : Include distribution specific install tasks] ******
2026-01-06 00:32:48.397136 | orchestrator | Tuesday 06 January 2026 00:32:37 +0000 (0:00:02.009) 0:07:41.763 *******
2026-01-06 00:32:48.397157 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/lldpd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:32:48.397176 | orchestrator |
2026-01-06 00:32:48.397194 | orchestrator | TASK [osism.services.lldpd : Install lldpd package] ****************************
2026-01-06 00:32:48.397212 | orchestrator | Tuesday 06 January 2026 00:32:38 +0000 (0:00:00.856) 0:07:42.620 *******
2026-01-06 00:32:48.397229 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:32:48.397248 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:32:48.397267 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:32:48.397284 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:32:48.397302 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:32:48.397319 | orchestrator | changed: [testbed-manager]
2026-01-06 00:32:48.397319 | orchestrator | changed:
[testbed-node-5] 2026-01-06 00:32:48.397337 | orchestrator | 2026-01-06 00:32:48.397355 | orchestrator | TASK [osism.services.lldpd : Manage lldpd service] ***************************** 2026-01-06 00:32:48.397391 | orchestrator | Tuesday 06 January 2026 00:32:48 +0000 (0:00:09.943) 0:07:52.564 ******* 2026-01-06 00:33:20.626295 | orchestrator | ok: [testbed-manager] 2026-01-06 00:33:20.626423 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:33:20.626439 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:33:20.626451 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:33:20.626462 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:33:20.626473 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:33:20.626484 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:33:20.626495 | orchestrator | 2026-01-06 00:33:20.626507 | orchestrator | RUNNING HANDLER [osism.commons.docker_compose : Reload systemd daemon] ********* 2026-01-06 00:33:20.626520 | orchestrator | Tuesday 06 January 2026 00:32:50 +0000 (0:00:02.092) 0:07:54.657 ******* 2026-01-06 00:33:20.626531 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:33:20.626542 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:33:20.626552 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:33:20.626563 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:33:20.626574 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:33:20.626585 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:33:20.626596 | orchestrator | 2026-01-06 00:33:20.626607 | orchestrator | RUNNING HANDLER [osism.services.chrony : Restart chrony service] *************** 2026-01-06 00:33:20.626618 | orchestrator | Tuesday 06 January 2026 00:32:51 +0000 (0:00:01.390) 0:07:56.047 ******* 2026-01-06 00:33:20.626630 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:20.626641 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:20.626652 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:33:20.626663 | orchestrator | changed: 
[testbed-node-5] 2026-01-06 00:33:20.626674 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:33:20.626684 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:20.626695 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:20.626706 | orchestrator | 2026-01-06 00:33:20.626716 | orchestrator | PLAY [Apply bootstrap role part 2] ********************************************* 2026-01-06 00:33:20.626727 | orchestrator | 2026-01-06 00:33:20.626738 | orchestrator | TASK [Include hardening role] ************************************************** 2026-01-06 00:33:20.626749 | orchestrator | Tuesday 06 January 2026 00:32:53 +0000 (0:00:01.282) 0:07:57.329 ******* 2026-01-06 00:33:20.626760 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:33:20.626771 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:33:20.626782 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:33:20.626792 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:33:20.626803 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:33:20.626814 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:33:20.626824 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:33:20.626835 | orchestrator | 2026-01-06 00:33:20.626880 | orchestrator | PLAY [Apply bootstrap roles part 3] ******************************************** 2026-01-06 00:33:20.626891 | orchestrator | 2026-01-06 00:33:20.626902 | orchestrator | TASK [osism.services.journald : Copy configuration file] *********************** 2026-01-06 00:33:20.626913 | orchestrator | Tuesday 06 January 2026 00:32:53 +0000 (0:00:00.764) 0:07:58.094 ******* 2026-01-06 00:33:20.626924 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:20.626935 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:20.626945 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:33:20.626956 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:33:20.626967 | orchestrator | changed: [testbed-node-0] 2026-01-06 
00:33:20.626978 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:20.626989 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:20.627000 | orchestrator | 2026-01-06 00:33:20.627010 | orchestrator | TASK [osism.services.journald : Manage journald service] *********************** 2026-01-06 00:33:20.627021 | orchestrator | Tuesday 06 January 2026 00:32:55 +0000 (0:00:01.395) 0:07:59.490 ******* 2026-01-06 00:33:20.627032 | orchestrator | ok: [testbed-manager] 2026-01-06 00:33:20.627043 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:33:20.627054 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:33:20.627064 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:33:20.627106 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:33:20.627119 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:33:20.627129 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:33:20.627140 | orchestrator | 2026-01-06 00:33:20.627151 | orchestrator | TASK [Include auditd role] ***************************************************** 2026-01-06 00:33:20.627161 | orchestrator | Tuesday 06 January 2026 00:32:56 +0000 (0:00:01.481) 0:08:00.971 ******* 2026-01-06 00:33:20.627172 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:33:20.627183 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:33:20.627194 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:33:20.627204 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:33:20.627215 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:33:20.627226 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:33:20.627236 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:33:20.627247 | orchestrator | 2026-01-06 00:33:20.627257 | orchestrator | TASK [Include smartd role] ***************************************************** 2026-01-06 00:33:20.627268 | orchestrator | Tuesday 06 January 2026 00:32:57 +0000 (0:00:00.521) 0:08:01.493 ******* 2026-01-06 00:33:20.627279 | orchestrator | included: 
osism.services.smartd for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:33:20.627292 | orchestrator | 2026-01-06 00:33:20.627303 | orchestrator | TASK [osism.services.smartd : Include distribution specific install tasks] ***** 2026-01-06 00:33:20.627331 | orchestrator | Tuesday 06 January 2026 00:32:58 +0000 (0:00:01.101) 0:08:02.595 ******* 2026-01-06 00:33:20.627345 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/smartd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:33:20.627358 | orchestrator | 2026-01-06 00:33:20.627369 | orchestrator | TASK [osism.services.smartd : Install smartmontools package] ******************* 2026-01-06 00:33:20.627380 | orchestrator | Tuesday 06 January 2026 00:32:59 +0000 (0:00:00.874) 0:08:03.469 ******* 2026-01-06 00:33:20.627390 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:33:20.627401 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:20.627412 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:20.627423 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:33:20.627433 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:20.627444 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:20.627454 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:33:20.627465 | orchestrator | 2026-01-06 00:33:20.627476 | orchestrator | TASK [osism.services.smartd : Create /var/log/smartd directory] **************** 2026-01-06 00:33:20.627506 | orchestrator | Tuesday 06 January 2026 00:33:08 +0000 (0:00:09.090) 0:08:12.559 ******* 2026-01-06 00:33:20.627526 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:20.627537 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:20.627548 | orchestrator | changed: [testbed-node-4] 2026-01-06 
00:33:20.627559 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:33:20.627569 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:33:20.627580 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:20.627590 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:20.627601 | orchestrator | 2026-01-06 00:33:20.627612 | orchestrator | TASK [osism.services.smartd : Copy smartmontools configuration file] *********** 2026-01-06 00:33:20.627623 | orchestrator | Tuesday 06 January 2026 00:33:09 +0000 (0:00:01.165) 0:08:13.725 ******* 2026-01-06 00:33:20.627633 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:20.627644 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:20.627654 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:33:20.627665 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:33:20.627676 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:33:20.627686 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:20.627697 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:20.627707 | orchestrator | 2026-01-06 00:33:20.627718 | orchestrator | TASK [osism.services.smartd : Manage smartd service] *************************** 2026-01-06 00:33:20.627729 | orchestrator | Tuesday 06 January 2026 00:33:10 +0000 (0:00:01.388) 0:08:15.114 ******* 2026-01-06 00:33:20.627739 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:20.627750 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:20.627761 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:33:20.627771 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:33:20.627782 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:33:20.627792 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:20.627803 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:20.627814 | orchestrator | 2026-01-06 00:33:20.627825 | orchestrator | RUNNING HANDLER [osism.services.journald : Restart journald service] *********** 
2026-01-06 00:33:20.627835 | orchestrator | Tuesday 06 January 2026 00:33:13 +0000 (0:00:02.094) 0:08:17.208 ******* 2026-01-06 00:33:20.627846 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:20.627857 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:20.627867 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:33:20.627878 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:33:20.627888 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:33:20.627899 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:20.627909 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:20.627920 | orchestrator | 2026-01-06 00:33:20.627931 | orchestrator | RUNNING HANDLER [osism.services.smartd : Restart smartd service] *************** 2026-01-06 00:33:20.627942 | orchestrator | Tuesday 06 January 2026 00:33:14 +0000 (0:00:01.316) 0:08:18.525 ******* 2026-01-06 00:33:20.627952 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:20.627963 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:20.627974 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:33:20.627984 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:33:20.627998 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:33:20.628016 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:20.628044 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:20.628063 | orchestrator | 2026-01-06 00:33:20.628133 | orchestrator | PLAY [Set state bootstrap] ***************************************************** 2026-01-06 00:33:20.628150 | orchestrator | 2026-01-06 00:33:20.628165 | orchestrator | TASK [Set osism.bootstrap.status fact] ***************************************** 2026-01-06 00:33:20.628182 | orchestrator | Tuesday 06 January 2026 00:33:15 +0000 (0:00:01.156) 0:08:19.681 ******* 2026-01-06 00:33:20.628202 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, 
testbed-node-1, testbed-node-2 2026-01-06 00:33:20.628222 | orchestrator | 2026-01-06 00:33:20.628239 | orchestrator | TASK [osism.commons.state : Create custom facts directory] ********************* 2026-01-06 00:33:20.628270 | orchestrator | Tuesday 06 January 2026 00:33:16 +0000 (0:00:00.903) 0:08:20.584 ******* 2026-01-06 00:33:20.628288 | orchestrator | ok: [testbed-manager] 2026-01-06 00:33:20.628306 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:33:20.628325 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:33:20.628344 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:33:20.628364 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:33:20.628382 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:33:20.628401 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:33:20.628418 | orchestrator | 2026-01-06 00:33:20.628429 | orchestrator | TASK [osism.commons.state : Write state into file] ***************************** 2026-01-06 00:33:20.628440 | orchestrator | Tuesday 06 January 2026 00:33:17 +0000 (0:00:01.114) 0:08:21.699 ******* 2026-01-06 00:33:20.628451 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:20.628462 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:33:20.628473 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:20.628483 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:33:20.628494 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:33:20.628504 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:20.628515 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:20.628525 | orchestrator | 2026-01-06 00:33:20.628544 | orchestrator | TASK [Set osism.bootstrap.timestamp fact] ************************************** 2026-01-06 00:33:20.628555 | orchestrator | Tuesday 06 January 2026 00:33:18 +0000 (0:00:01.168) 0:08:22.868 ******* 2026-01-06 00:33:20.628566 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, 
testbed-node-1, testbed-node-2 2026-01-06 00:33:20.628577 | orchestrator | 2026-01-06 00:33:20.628588 | orchestrator | TASK [osism.commons.state : Create custom facts directory] ********************* 2026-01-06 00:33:20.628599 | orchestrator | Tuesday 06 January 2026 00:33:19 +0000 (0:00:01.053) 0:08:23.921 ******* 2026-01-06 00:33:20.628609 | orchestrator | ok: [testbed-manager] 2026-01-06 00:33:20.628620 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:33:20.628631 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:33:20.628641 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:33:20.628652 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:33:20.628663 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:33:20.628673 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:33:20.628684 | orchestrator | 2026-01-06 00:33:20.628694 | orchestrator | TASK [osism.commons.state : Write state into file] ***************************** 2026-01-06 00:33:20.628717 | orchestrator | Tuesday 06 January 2026 00:33:20 +0000 (0:00:00.869) 0:08:24.791 ******* 2026-01-06 00:33:22.233699 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:22.234541 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:22.234580 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:33:22.234595 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:33:22.234609 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:33:22.234621 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:22.234633 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:22.234644 | orchestrator | 2026-01-06 00:33:22.234657 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:33:22.234671 | orchestrator | testbed-manager : ok=168  changed=40  unreachable=0 failed=0 skipped=42  rescued=0 ignored=0 2026-01-06 00:33:22.234684 | orchestrator | testbed-node-0 : ok=177  changed=69  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 
2026-01-06 00:33:22.234695 | orchestrator | testbed-node-1 : ok=177  changed=69  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2026-01-06 00:33:22.234706 | orchestrator | testbed-node-2 : ok=177  changed=69  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2026-01-06 00:33:22.234716 | orchestrator | testbed-node-3 : ok=175  changed=65  unreachable=0 failed=0 skipped=38  rescued=0 ignored=0 2026-01-06 00:33:22.234768 | orchestrator | testbed-node-4 : ok=175  changed=65  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2026-01-06 00:33:22.234790 | orchestrator | testbed-node-5 : ok=175  changed=65  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2026-01-06 00:33:22.234808 | orchestrator | 2026-01-06 00:33:22.234827 | orchestrator | 2026-01-06 00:33:22.234844 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:33:22.234856 | orchestrator | Tuesday 06 January 2026 00:33:21 +0000 (0:00:01.122) 0:08:25.914 ******* 2026-01-06 00:33:22.234866 | orchestrator | =============================================================================== 2026-01-06 00:33:22.234877 | orchestrator | osism.commons.packages : Install required packages --------------------- 81.47s 2026-01-06 00:33:22.234888 | orchestrator | osism.commons.packages : Download required packages -------------------- 41.03s 2026-01-06 00:33:22.234899 | orchestrator | osism.commons.cleanup : Cleanup installed packages --------------------- 35.80s 2026-01-06 00:33:22.234910 | orchestrator | osism.commons.repository : Update package cache ------------------------ 15.86s 2026-01-06 00:33:22.234921 | orchestrator | osism.commons.systohc : Install util-linux-extra package --------------- 12.64s 2026-01-06 00:33:22.234932 | orchestrator | osism.commons.packages : Remove dependencies that are no longer required -- 12.56s 2026-01-06 00:33:22.234944 | orchestrator | osism.services.docker : Install docker package ------------------------- 
11.69s 2026-01-06 00:33:22.234955 | orchestrator | osism.services.docker : Install containerd package --------------------- 10.63s 2026-01-06 00:33:22.234966 | orchestrator | osism.services.lldpd : Install lldpd package ---------------------------- 9.94s 2026-01-06 00:33:22.234977 | orchestrator | osism.services.docker : Install docker-cli package ---------------------- 9.53s 2026-01-06 00:33:22.234988 | orchestrator | osism.services.smartd : Install smartmontools package ------------------- 9.09s 2026-01-06 00:33:22.234998 | orchestrator | osism.services.rng : Install rng package -------------------------------- 8.69s 2026-01-06 00:33:22.235009 | orchestrator | osism.commons.cleanup : Remove cloudinit package ------------------------ 8.56s 2026-01-06 00:33:22.235020 | orchestrator | osism.services.docker : Add repository ---------------------------------- 8.43s 2026-01-06 00:33:22.235031 | orchestrator | osism.commons.cleanup : Uninstall unattended-upgrades package ----------- 8.32s 2026-01-06 00:33:22.235042 | orchestrator | osism.commons.docker_compose : Install docker-compose-plugin package ---- 8.03s 2026-01-06 00:33:22.235053 | orchestrator | osism.services.docker : Install apt-transport-https package ------------- 7.44s 2026-01-06 00:33:22.235064 | orchestrator | osism.commons.services : Populate service facts ------------------------- 5.97s 2026-01-06 00:33:22.235103 | orchestrator | osism.commons.cleanup : Populate service facts -------------------------- 5.95s 2026-01-06 00:33:22.235142 | orchestrator | osism.commons.cleanup : Remove dependencies that are no longer required --- 5.73s 2026-01-06 00:33:22.602945 | orchestrator | + osism apply fail2ban 2026-01-06 00:33:35.439956 | orchestrator | 2026-01-06 00:33:35 | INFO  | Task b66fa265-9ca9-4258-9572-3d61c1dd782e (fail2ban) was prepared for execution. 
2026-01-06 00:33:35.440132 | orchestrator | 2026-01-06 00:33:35 | INFO  | It takes a moment until task b66fa265-9ca9-4258-9572-3d61c1dd782e (fail2ban) has been started and output is visible here. 2026-01-06 00:33:58.140373 | orchestrator | 2026-01-06 00:33:58.140494 | orchestrator | PLAY [Apply role fail2ban] ***************************************************** 2026-01-06 00:33:58.140510 | orchestrator | 2026-01-06 00:33:58.140522 | orchestrator | TASK [osism.services.fail2ban : Include distribution specific install tasks] *** 2026-01-06 00:33:58.140534 | orchestrator | Tuesday 06 January 2026 00:33:40 +0000 (0:00:00.264) 0:00:00.264 ******* 2026-01-06 00:33:58.140547 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/fail2ban/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:33:58.140594 | orchestrator | 2026-01-06 00:33:58.140606 | orchestrator | TASK [osism.services.fail2ban : Install fail2ban package] ********************** 2026-01-06 00:33:58.140616 | orchestrator | Tuesday 06 January 2026 00:33:41 +0000 (0:00:01.179) 0:00:01.443 ******* 2026-01-06 00:33:58.140627 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:58.140639 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:33:58.140649 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:58.140660 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:58.140671 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:33:58.140682 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:58.140692 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:33:58.140703 | orchestrator | 2026-01-06 00:33:58.140714 | orchestrator | TASK [osism.services.fail2ban : Copy configuration files] ********************** 2026-01-06 00:33:58.140725 | orchestrator | Tuesday 06 January 2026 00:33:53 +0000 (0:00:11.718) 0:00:13.162 ******* 
2026-01-06 00:33:58.140736 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:58.140746 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:58.140757 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:33:58.140768 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:58.140778 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:58.140789 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:33:58.140799 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:33:58.140810 | orchestrator | 2026-01-06 00:33:58.140821 | orchestrator | TASK [osism.services.fail2ban : Manage fail2ban service] *********************** 2026-01-06 00:33:58.140832 | orchestrator | Tuesday 06 January 2026 00:33:54 +0000 (0:00:01.514) 0:00:14.676 ******* 2026-01-06 00:33:58.140842 | orchestrator | ok: [testbed-manager] 2026-01-06 00:33:58.140854 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:33:58.140865 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:33:58.140876 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:33:58.140887 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:33:58.140897 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:33:58.140908 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:33:58.140951 | orchestrator | 2026-01-06 00:33:58.140965 | orchestrator | TASK [osism.services.fail2ban : Reload fail2ban configuration] ***************** 2026-01-06 00:33:58.140979 | orchestrator | Tuesday 06 January 2026 00:33:56 +0000 (0:00:01.514) 0:00:16.191 ******* 2026-01-06 00:33:58.140993 | orchestrator | changed: [testbed-manager] 2026-01-06 00:33:58.141007 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:33:58.141020 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:33:58.141033 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:33:58.141068 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:33:58.141080 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:33:58.141091 | orchestrator | changed: 
[testbed-node-5] 2026-01-06 00:33:58.141102 | orchestrator | 2026-01-06 00:33:58.141113 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:33:58.141124 | orchestrator | testbed-manager : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:33:58.141136 | orchestrator | testbed-node-0 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:33:58.141147 | orchestrator | testbed-node-1 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:33:58.141158 | orchestrator | testbed-node-2 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:33:58.141169 | orchestrator | testbed-node-3 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:33:58.141180 | orchestrator | testbed-node-4 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:33:58.141190 | orchestrator | testbed-node-5 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:33:58.141210 | orchestrator | 2026-01-06 00:33:58.141221 | orchestrator | 2026-01-06 00:33:58.141232 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:33:58.141242 | orchestrator | Tuesday 06 January 2026 00:33:57 +0000 (0:00:01.678) 0:00:17.870 ******* 2026-01-06 00:33:58.141253 | orchestrator | =============================================================================== 2026-01-06 00:33:58.141264 | orchestrator | osism.services.fail2ban : Install fail2ban package --------------------- 11.72s 2026-01-06 00:33:58.141292 | orchestrator | osism.services.fail2ban : Reload fail2ban configuration ----------------- 1.68s 2026-01-06 00:33:58.141304 | orchestrator | osism.services.fail2ban : Manage fail2ban service ----------------------- 1.51s 2026-01-06 00:33:58.141315 | orchestrator | osism.services.fail2ban : 
Copy configuration files ---------------------- 1.51s 2026-01-06 00:33:58.141325 | orchestrator | osism.services.fail2ban : Include distribution specific install tasks --- 1.18s 2026-01-06 00:33:58.449160 | orchestrator | + [[ -e /etc/redhat-release ]] 2026-01-06 00:33:58.449268 | orchestrator | + osism apply network 2026-01-06 00:34:10.525843 | orchestrator | 2026-01-06 00:34:10 | INFO  | Task e7ccb486-adcd-4510-8f63-5756bf5adfc2 (network) was prepared for execution. 2026-01-06 00:34:10.525966 | orchestrator | 2026-01-06 00:34:10 | INFO  | It takes a moment until task e7ccb486-adcd-4510-8f63-5756bf5adfc2 (network) has been started and output is visible here. 2026-01-06 00:34:40.568208 | orchestrator | 2026-01-06 00:34:40.568301 | orchestrator | PLAY [Apply role network] ****************************************************** 2026-01-06 00:34:40.568310 | orchestrator | 2026-01-06 00:34:40.568316 | orchestrator | TASK [osism.commons.network : Gather variables for each operating system] ****** 2026-01-06 00:34:40.568322 | orchestrator | Tuesday 06 January 2026 00:34:15 +0000 (0:00:00.274) 0:00:00.274 ******* 2026-01-06 00:34:40.568327 | orchestrator | ok: [testbed-manager] 2026-01-06 00:34:40.568334 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:34:40.568339 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:34:40.568344 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:34:40.568349 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:34:40.568353 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:34:40.568358 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:34:40.568362 | orchestrator | 2026-01-06 00:34:40.568367 | orchestrator | TASK [osism.commons.network : Include type specific tasks] ********************* 2026-01-06 00:34:40.568372 | orchestrator | Tuesday 06 January 2026 00:34:16 +0000 (0:00:00.752) 0:00:01.026 ******* 2026-01-06 00:34:40.568379 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/netplan-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:34:40.568385 | orchestrator | 2026-01-06 00:34:40.568390 | orchestrator | TASK [osism.commons.network : Install required packages] *********************** 2026-01-06 00:34:40.568394 | orchestrator | Tuesday 06 January 2026 00:34:17 +0000 (0:00:01.246) 0:00:02.273 ******* 2026-01-06 00:34:40.568399 | orchestrator | ok: [testbed-manager] 2026-01-06 00:34:40.568403 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:34:40.568408 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:34:40.568413 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:34:40.568417 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:34:40.568422 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:34:40.568426 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:34:40.568431 | orchestrator | 2026-01-06 00:34:40.568435 | orchestrator | TASK [osism.commons.network : Remove ifupdown package] ************************* 2026-01-06 00:34:40.568440 | orchestrator | Tuesday 06 January 2026 00:34:19 +0000 (0:00:02.241) 0:00:04.514 ******* 2026-01-06 00:34:40.568445 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:34:40.568449 | orchestrator | ok: [testbed-manager] 2026-01-06 00:34:40.568454 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:34:40.568459 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:34:40.568482 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:34:40.568487 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:34:40.568491 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:34:40.568496 | orchestrator | 2026-01-06 00:34:40.568500 | orchestrator | TASK [osism.commons.network : Create required directories] ********************* 2026-01-06 00:34:40.568505 | orchestrator | Tuesday 06 January 2026 00:34:21 +0000 (0:00:01.982) 0:00:06.497 ******* 
2026-01-06 00:34:40.568510 | orchestrator | ok: [testbed-node-0] => (item=/etc/netplan) 2026-01-06 00:34:40.568515 | orchestrator | ok: [testbed-manager] => (item=/etc/netplan) 2026-01-06 00:34:40.568519 | orchestrator | ok: [testbed-node-1] => (item=/etc/netplan) 2026-01-06 00:34:40.568524 | orchestrator | ok: [testbed-node-2] => (item=/etc/netplan) 2026-01-06 00:34:40.568528 | orchestrator | ok: [testbed-node-3] => (item=/etc/netplan) 2026-01-06 00:34:40.568533 | orchestrator | ok: [testbed-node-4] => (item=/etc/netplan) 2026-01-06 00:34:40.568537 | orchestrator | ok: [testbed-node-5] => (item=/etc/netplan) 2026-01-06 00:34:40.568542 | orchestrator | 2026-01-06 00:34:40.568547 | orchestrator | TASK [osism.commons.network : Prepare netplan configuration template] ********** 2026-01-06 00:34:40.568551 | orchestrator | Tuesday 06 January 2026 00:34:22 +0000 (0:00:01.114) 0:00:07.612 ******* 2026-01-06 00:34:40.568556 | orchestrator | ok: [testbed-node-2 -> localhost] 2026-01-06 00:34:40.568561 | orchestrator | ok: [testbed-node-1 -> localhost] 2026-01-06 00:34:40.568565 | orchestrator | ok: [testbed-manager -> localhost] 2026-01-06 00:34:40.568570 | orchestrator | ok: [testbed-node-3 -> localhost] 2026-01-06 00:34:40.568574 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-01-06 00:34:40.568579 | orchestrator | ok: [testbed-node-4 -> localhost] 2026-01-06 00:34:40.568583 | orchestrator | ok: [testbed-node-5 -> localhost] 2026-01-06 00:34:40.568588 | orchestrator | 2026-01-06 00:34:40.568592 | orchestrator | TASK [osism.commons.network : Copy netplan configuration] ********************** 2026-01-06 00:34:40.568597 | orchestrator | Tuesday 06 January 2026 00:34:26 +0000 (0:00:03.469) 0:00:11.081 ******* 2026-01-06 00:34:40.568602 | orchestrator | changed: [testbed-manager] 2026-01-06 00:34:40.568606 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:34:40.568611 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:34:40.568615 | orchestrator | changed: 
[testbed-node-2] 2026-01-06 00:34:40.568620 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:34:40.568624 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:34:40.568628 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:34:40.568633 | orchestrator | 2026-01-06 00:34:40.568637 | orchestrator | TASK [osism.commons.network : Remove netplan configuration template] *********** 2026-01-06 00:34:40.568642 | orchestrator | Tuesday 06 January 2026 00:34:28 +0000 (0:00:01.799) 0:00:12.881 ******* 2026-01-06 00:34:40.568647 | orchestrator | ok: [testbed-manager -> localhost] 2026-01-06 00:34:40.568651 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-01-06 00:34:40.568656 | orchestrator | ok: [testbed-node-2 -> localhost] 2026-01-06 00:34:40.568660 | orchestrator | ok: [testbed-node-1 -> localhost] 2026-01-06 00:34:40.568665 | orchestrator | ok: [testbed-node-4 -> localhost] 2026-01-06 00:34:40.568670 | orchestrator | ok: [testbed-node-3 -> localhost] 2026-01-06 00:34:40.568675 | orchestrator | ok: [testbed-node-5 -> localhost] 2026-01-06 00:34:40.568679 | orchestrator | 2026-01-06 00:34:40.568684 | orchestrator | TASK [osism.commons.network : Check if path for interface file exists] ********* 2026-01-06 00:34:40.568689 | orchestrator | Tuesday 06 January 2026 00:34:29 +0000 (0:00:01.845) 0:00:14.726 ******* 2026-01-06 00:34:40.568693 | orchestrator | ok: [testbed-manager] 2026-01-06 00:34:40.568698 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:34:40.568702 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:34:40.568707 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:34:40.568711 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:34:40.568716 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:34:40.568736 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:34:40.568741 | orchestrator | 2026-01-06 00:34:40.568745 | orchestrator | TASK [osism.commons.network : Copy interfaces file] **************************** 2026-01-06 00:34:40.568765 | 
orchestrator | Tuesday 06 January 2026 00:34:31 +0000 (0:00:01.210) 0:00:15.936 ******* 2026-01-06 00:34:40.568771 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:34:40.568776 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:34:40.568782 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:34:40.568787 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:34:40.568793 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:34:40.568798 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:34:40.568803 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:34:40.568809 | orchestrator | 2026-01-06 00:34:40.568814 | orchestrator | TASK [osism.commons.network : Install package networkd-dispatcher] ************* 2026-01-06 00:34:40.568819 | orchestrator | Tuesday 06 January 2026 00:34:31 +0000 (0:00:00.670) 0:00:16.607 ******* 2026-01-06 00:34:40.568825 | orchestrator | ok: [testbed-manager] 2026-01-06 00:34:40.568830 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:34:40.568835 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:34:40.568840 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:34:40.568845 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:34:40.568851 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:34:40.568856 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:34:40.568861 | orchestrator | 2026-01-06 00:34:40.568867 | orchestrator | TASK [osism.commons.network : Copy dispatcher scripts] ************************* 2026-01-06 00:34:40.568872 | orchestrator | Tuesday 06 January 2026 00:34:34 +0000 (0:00:02.375) 0:00:18.982 ******* 2026-01-06 00:34:40.568878 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:34:40.568883 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:34:40.568888 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:34:40.568894 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:34:40.568899 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:34:40.568904 | 
orchestrator | skipping: [testbed-node-5] 2026-01-06 00:34:40.568910 | orchestrator | changed: [testbed-manager] => (item={'dest': 'routable.d/iptables.sh', 'src': '/opt/configuration/network/iptables.sh'}) 2026-01-06 00:34:40.568916 | orchestrator | 2026-01-06 00:34:40.568922 | orchestrator | TASK [osism.commons.network : Manage service networkd-dispatcher] ************** 2026-01-06 00:34:40.568927 | orchestrator | Tuesday 06 January 2026 00:34:34 +0000 (0:00:00.805) 0:00:19.788 ******* 2026-01-06 00:34:40.568932 | orchestrator | ok: [testbed-manager] 2026-01-06 00:34:40.568937 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:34:40.568943 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:34:40.568948 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:34:40.568954 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:34:40.568959 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:34:40.568964 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:34:40.568970 | orchestrator | 2026-01-06 00:34:40.568975 | orchestrator | TASK [osism.commons.network : Include cleanup tasks] *************************** 2026-01-06 00:34:40.568980 | orchestrator | Tuesday 06 January 2026 00:34:36 +0000 (0:00:01.608) 0:00:21.397 ******* 2026-01-06 00:34:40.568986 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/cleanup-netplan.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:34:40.568993 | orchestrator | 2026-01-06 00:34:40.568999 | orchestrator | TASK [osism.commons.network : List existing configuration files] *************** 2026-01-06 00:34:40.569004 | orchestrator | Tuesday 06 January 2026 00:34:37 +0000 (0:00:01.097) 0:00:22.495 ******* 2026-01-06 00:34:40.569009 | orchestrator | ok: [testbed-manager] 2026-01-06 00:34:40.569015 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:34:40.569039 | orchestrator 
| ok: [testbed-node-1] 2026-01-06 00:34:40.569044 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:34:40.569050 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:34:40.569055 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:34:40.569060 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:34:40.569066 | orchestrator | 2026-01-06 00:34:40.569071 | orchestrator | TASK [osism.commons.network : Set network_configured_files fact] *************** 2026-01-06 00:34:40.569082 | orchestrator | Tuesday 06 January 2026 00:34:38 +0000 (0:00:01.146) 0:00:23.641 ******* 2026-01-06 00:34:40.569087 | orchestrator | ok: [testbed-manager] 2026-01-06 00:34:40.569091 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:34:40.569096 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:34:40.569100 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:34:40.569105 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:34:40.569109 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:34:40.569114 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:34:40.569118 | orchestrator | 2026-01-06 00:34:40.569123 | orchestrator | TASK [osism.commons.network : Remove unused configuration files] *************** 2026-01-06 00:34:40.569127 | orchestrator | Tuesday 06 January 2026 00:34:39 +0000 (0:00:00.617) 0:00:24.258 ******* 2026-01-06 00:34:40.569132 | orchestrator | skipping: [testbed-manager] => (item=/etc/netplan/01-osism.yaml)  2026-01-06 00:34:40.569136 | orchestrator | skipping: [testbed-node-0] => (item=/etc/netplan/01-osism.yaml)  2026-01-06 00:34:40.569141 | orchestrator | skipping: [testbed-node-1] => (item=/etc/netplan/01-osism.yaml)  2026-01-06 00:34:40.569145 | orchestrator | skipping: [testbed-node-2] => (item=/etc/netplan/01-osism.yaml)  2026-01-06 00:34:40.569150 | orchestrator | changed: [testbed-manager] => (item=/etc/netplan/50-cloud-init.yaml) 2026-01-06 00:34:40.569154 | orchestrator | skipping: [testbed-node-3] => (item=/etc/netplan/01-osism.yaml)  2026-01-06 00:34:40.569162 | 
orchestrator | changed: [testbed-node-0] => (item=/etc/netplan/50-cloud-init.yaml) 2026-01-06 00:34:40.569167 | orchestrator | skipping: [testbed-node-4] => (item=/etc/netplan/01-osism.yaml)  2026-01-06 00:34:40.569172 | orchestrator | changed: [testbed-node-1] => (item=/etc/netplan/50-cloud-init.yaml) 2026-01-06 00:34:40.569176 | orchestrator | changed: [testbed-node-2] => (item=/etc/netplan/50-cloud-init.yaml) 2026-01-06 00:34:40.569181 | orchestrator | changed: [testbed-node-3] => (item=/etc/netplan/50-cloud-init.yaml) 2026-01-06 00:34:40.569185 | orchestrator | skipping: [testbed-node-5] => (item=/etc/netplan/01-osism.yaml)  2026-01-06 00:34:40.569190 | orchestrator | changed: [testbed-node-4] => (item=/etc/netplan/50-cloud-init.yaml) 2026-01-06 00:34:40.569194 | orchestrator | changed: [testbed-node-5] => (item=/etc/netplan/50-cloud-init.yaml) 2026-01-06 00:34:40.569199 | orchestrator | 2026-01-06 00:34:40.569206 | orchestrator | TASK [osism.commons.network : Include dummy interfaces] ************************ 2026-01-06 00:34:58.018293 | orchestrator | Tuesday 06 January 2026 00:34:40 +0000 (0:00:01.122) 0:00:25.381 ******* 2026-01-06 00:34:58.018428 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:34:58.018450 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:34:58.018464 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:34:58.018477 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:34:58.018489 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:34:58.018497 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:34:58.018505 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:34:58.018512 | orchestrator | 2026-01-06 00:34:58.018521 | orchestrator | TASK [osism.commons.network : Include vxlan interfaces] ************************ 2026-01-06 00:34:58.018528 | orchestrator | Tuesday 06 January 2026 00:34:41 +0000 (0:00:00.595) 0:00:25.976 ******* 2026-01-06 00:34:58.018538 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/vxlan-interfaces.yml for testbed-node-0, testbed-manager, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-5, testbed-node-4 2026-01-06 00:34:58.018548 | orchestrator | 2026-01-06 00:34:58.018555 | orchestrator | TASK [osism.commons.network : Create systemd networkd netdev files] ************ 2026-01-06 00:34:58.018563 | orchestrator | Tuesday 06 January 2026 00:34:46 +0000 (0:00:04.978) 0:00:30.954 ******* 2026-01-06 00:34:58.018572 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.10', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018580 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan0', 'value': {'addresses': ['192.168.112.5/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'local_ip': '192.168.16.5', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018615 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.11', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018623 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.10/20'], 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.10', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:34:58.018630 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.14', 'mtu': 1350, 'vni': 
42}}) 2026-01-06 00:34:58.018638 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.12', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018646 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.5/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'local_ip': '192.168.16.5', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:34:58.018653 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.13', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018660 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'local_ip': '192.168.16.15', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018683 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.11/20'], 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.11', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:34:58.018697 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.12/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.12', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:34:58.018721 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.14/20'], 'dests': ['192.168.16.10', '192.168.16.11', 
'192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.14', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:34:58.018729 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.13/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.13', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:34:58.018736 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.15/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'local_ip': '192.168.16.15', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:34:58.018743 | orchestrator | 2026-01-06 00:34:58.018750 | orchestrator | TASK [osism.commons.network : Create systemd networkd network files] *********** 2026-01-06 00:34:58.018758 | orchestrator | Tuesday 06 January 2026 00:34:52 +0000 (0:00:06.085) 0:00:37.040 ******* 2026-01-06 00:34:58.018772 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan0', 'value': {'addresses': ['192.168.112.5/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'local_ip': '192.168.16.5', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018780 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.11', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018787 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.10', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018795 | orchestrator | changed: [testbed-node-4] => 
(item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.14', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018802 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.13', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018811 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.5/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'local_ip': '192.168.16.5', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:34:58.018820 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'local_ip': '192.168.16.15', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018829 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.12', 'mtu': 1350, 'vni': 42}}) 2026-01-06 00:34:58.018838 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.11/20'], 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.11', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:34:58.018847 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.10/20'], 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.10', 
'mtu': 1350, 'vni': 23}}) 2026-01-06 00:34:58.018873 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.14/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.14', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:34:58.018881 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.13/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.13', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:34:58.018894 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.15/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'local_ip': '192.168.16.15', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:35:11.959112 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.12/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.12', 'mtu': 1350, 'vni': 23}}) 2026-01-06 00:35:11.959242 | orchestrator | 2026-01-06 00:35:11.959265 | orchestrator | TASK [osism.commons.network : Include networkd cleanup tasks] ****************** 2026-01-06 00:35:11.959281 | orchestrator | Tuesday 06 January 2026 00:34:57 +0000 (0:00:05.793) 0:00:42.833 ******* 2026-01-06 00:35:11.959296 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/cleanup-networkd.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:35:11.959310 | orchestrator | 2026-01-06 00:35:11.959322 | orchestrator | TASK [osism.commons.network : List existing configuration files] *************** 
2026-01-06 00:35:11.959335 | orchestrator | Tuesday 06 January 2026 00:34:59 +0000 (0:00:01.113) 0:00:43.946 ******* 2026-01-06 00:35:11.959348 | orchestrator | ok: [testbed-manager] 2026-01-06 00:35:11.959362 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:35:11.959375 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:35:11.959389 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:35:11.959403 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:35:11.959415 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:35:11.959428 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:35:11.959442 | orchestrator | 2026-01-06 00:35:11.959456 | orchestrator | TASK [osism.commons.network : Remove unused configuration files] *************** 2026-01-06 00:35:11.959469 | orchestrator | Tuesday 06 January 2026 00:35:00 +0000 (0:00:01.032) 0:00:44.979 ******* 2026-01-06 00:35:11.959483 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan1.network)  2026-01-06 00:35:11.959498 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan0.network)  2026-01-06 00:35:11.959511 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan1.netdev)  2026-01-06 00:35:11.959524 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan0.netdev)  2026-01-06 00:35:11.959537 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan1.network)  2026-01-06 00:35:11.959550 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan0.network)  2026-01-06 00:35:11.959563 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan1.netdev)  2026-01-06 00:35:11.959575 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan0.netdev)  2026-01-06 00:35:11.959589 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:35:11.959603 | orchestrator | skipping: [testbed-node-1] => 
(item=/etc/systemd/network/30-vxlan1.network)  2026-01-06 00:35:11.959616 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan0.network)  2026-01-06 00:35:11.959630 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan1.netdev)  2026-01-06 00:35:11.959643 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan0.netdev)  2026-01-06 00:35:11.959656 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:35:11.959669 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan1.network)  2026-01-06 00:35:11.959683 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan0.network)  2026-01-06 00:35:11.959697 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan1.netdev)  2026-01-06 00:35:11.959710 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan0.netdev)  2026-01-06 00:35:11.959724 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:35:11.959740 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan1.network)  2026-01-06 00:35:11.959754 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan0.network)  2026-01-06 00:35:11.959768 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan1.netdev)  2026-01-06 00:35:11.959781 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan0.netdev)  2026-01-06 00:35:11.959795 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:35:11.959823 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan1.network)  2026-01-06 00:35:11.959851 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan0.network)  2026-01-06 00:35:11.959865 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan1.netdev)  2026-01-06 00:35:11.959878 | orchestrator | 
skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan0.netdev)  2026-01-06 00:35:11.959890 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:35:11.959903 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:35:11.959916 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan1.network)  2026-01-06 00:35:11.959930 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan0.network)  2026-01-06 00:35:11.959943 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan1.netdev)  2026-01-06 00:35:11.959956 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan0.netdev)  2026-01-06 00:35:11.959969 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:35:11.959982 | orchestrator | 2026-01-06 00:35:11.959995 | orchestrator | TASK [osism.commons.network : Include network extra init] ********************** 2026-01-06 00:35:11.960092 | orchestrator | Tuesday 06 January 2026 00:35:01 +0000 (0:00:00.990) 0:00:45.969 ******* 2026-01-06 00:35:11.960110 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/network-extra-init.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:35:11.960124 | orchestrator | 2026-01-06 00:35:11.960137 | orchestrator | TASK [osism.commons.network : Install required packages for network-extra-init] *** 2026-01-06 00:35:11.960149 | orchestrator | Tuesday 06 January 2026 00:35:02 +0000 (0:00:01.361) 0:00:47.331 ******* 2026-01-06 00:35:11.960161 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:35:11.960174 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:35:11.960187 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:35:11.960201 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:35:11.960214 | orchestrator | skipping: [testbed-node-3] 2026-01-06 
00:35:11.960227 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:35:11.960240 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:35:11.960254 | orchestrator | 2026-01-06 00:35:11.960268 | orchestrator | TASK [osism.commons.network : Deploy network-extra-init script] **************** 2026-01-06 00:35:11.960281 | orchestrator | Tuesday 06 January 2026 00:35:03 +0000 (0:00:00.648) 0:00:47.980 ******* 2026-01-06 00:35:11.960294 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:35:11.960306 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:35:11.960319 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:35:11.960331 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:35:11.960344 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:35:11.960357 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:35:11.960371 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:35:11.960384 | orchestrator | 2026-01-06 00:35:11.960397 | orchestrator | TASK [osism.commons.network : Deploy network-extra-init systemd service] ******* 2026-01-06 00:35:11.960409 | orchestrator | Tuesday 06 January 2026 00:35:03 +0000 (0:00:00.824) 0:00:48.804 ******* 2026-01-06 00:35:11.960422 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:35:11.960436 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:35:11.960450 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:35:11.960464 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:35:11.960477 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:35:11.960489 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:35:11.960498 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:35:11.960509 | orchestrator | 2026-01-06 00:35:11.960521 | orchestrator | TASK [osism.commons.network : Enable and start network-extra-init service] ***** 2026-01-06 00:35:11.960532 | orchestrator | Tuesday 06 January 2026 00:35:04 +0000 (0:00:00.617) 0:00:49.422 ******* 2026-01-06 
00:35:11.960555 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:35:11.960566 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:35:11.960577 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:35:11.960587 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:35:11.960599 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:35:11.960611 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:35:11.960622 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:35:11.960633 | orchestrator | 2026-01-06 00:35:11.960645 | orchestrator | TASK [osism.commons.network : Disable and stop network-extra-init service] ***** 2026-01-06 00:35:11.960657 | orchestrator | Tuesday 06 January 2026 00:35:05 +0000 (0:00:00.809) 0:00:50.231 ******* 2026-01-06 00:35:11.960667 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:35:11.960678 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:35:11.960688 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:35:11.960697 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:35:11.960707 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:35:11.960717 | orchestrator | ok: [testbed-manager] 2026-01-06 00:35:11.960726 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:35:11.960736 | orchestrator | 2026-01-06 00:35:11.960745 | orchestrator | TASK [osism.commons.network : Remove network-extra-init systemd service] ******* 2026-01-06 00:35:11.960755 | orchestrator | Tuesday 06 January 2026 00:35:06 +0000 (0:00:01.447) 0:00:51.678 ******* 2026-01-06 00:35:11.960765 | orchestrator | ok: [testbed-manager] 2026-01-06 00:35:11.960774 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:35:11.960784 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:35:11.960793 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:35:11.960803 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:35:11.960812 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:35:11.960823 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:35:11.960834 | 
orchestrator | 2026-01-06 00:35:11.960845 | orchestrator | TASK [osism.commons.network : Remove network-extra-init script] **************** 2026-01-06 00:35:11.960856 | orchestrator | Tuesday 06 January 2026 00:35:08 +0000 (0:00:01.387) 0:00:53.066 ******* 2026-01-06 00:35:11.960868 | orchestrator | ok: [testbed-manager] 2026-01-06 00:35:11.960880 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:35:11.960889 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:35:11.960901 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:35:11.960913 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:35:11.960924 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:35:11.960933 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:35:11.960943 | orchestrator | 2026-01-06 00:35:11.960953 | orchestrator | RUNNING HANDLER [osism.commons.network : Reload systemd-networkd] ************** 2026-01-06 00:35:11.960971 | orchestrator | Tuesday 06 January 2026 00:35:10 +0000 (0:00:02.260) 0:00:55.326 ******* 2026-01-06 00:35:11.960981 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:35:11.960991 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:35:11.961018 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:35:11.961030 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:35:11.961041 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:35:11.961051 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:35:11.961061 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:35:11.961071 | orchestrator | 2026-01-06 00:35:11.961081 | orchestrator | RUNNING HANDLER [osism.commons.network : Netplan configuration changed] ******** 2026-01-06 00:35:11.961091 | orchestrator | Tuesday 06 January 2026 00:35:11 +0000 (0:00:00.651) 0:00:55.978 ******* 2026-01-06 00:35:11.961100 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:35:11.961110 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:35:11.961120 | orchestrator | skipping: [testbed-node-1] 
2026-01-06 00:35:11.961130 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:35:11.961140 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:35:11.961152 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:35:11.961162 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:35:11.961173 | orchestrator |
2026-01-06 00:35:11.961183 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:35:12.442466 | orchestrator | testbed-manager : ok=25  changed=5  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0
2026-01-06 00:35:12.442577 | orchestrator | testbed-node-0 : ok=24  changed=5  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0
2026-01-06 00:35:12.442593 | orchestrator | testbed-node-1 : ok=24  changed=5  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0
2026-01-06 00:35:12.442606 | orchestrator | testbed-node-2 : ok=24  changed=5  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0
2026-01-06 00:35:12.442617 | orchestrator | testbed-node-3 : ok=24  changed=5  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0
2026-01-06 00:35:12.442628 | orchestrator | testbed-node-4 : ok=24  changed=5  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0
2026-01-06 00:35:12.442639 | orchestrator | testbed-node-5 : ok=24  changed=5  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0
2026-01-06 00:35:12.442650 | orchestrator |
2026-01-06 00:35:12.442662 | orchestrator |
2026-01-06 00:35:12.442674 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:35:12.442686 | orchestrator | Tuesday 06 January 2026 00:35:11 +0000 (0:00:00.801) 0:00:56.779 *******
2026-01-06 00:35:12.442697 | orchestrator | ===============================================================================
2026-01-06 00:35:12.442708 | orchestrator | osism.commons.network : Create systemd networkd netdev files ------------ 6.09s
2026-01-06 00:35:12.442719 | orchestrator | osism.commons.network : Create systemd networkd network files ----------- 5.79s
2026-01-06 00:35:12.442730 | orchestrator | osism.commons.network : Include vxlan interfaces ------------------------ 4.98s
2026-01-06 00:35:12.442741 | orchestrator | osism.commons.network : Prepare netplan configuration template ---------- 3.47s
2026-01-06 00:35:12.442751 | orchestrator | osism.commons.network : Install package networkd-dispatcher ------------- 2.38s
2026-01-06 00:35:12.442762 | orchestrator | osism.commons.network : Remove network-extra-init script ---------------- 2.26s
2026-01-06 00:35:12.442773 | orchestrator | osism.commons.network : Install required packages ----------------------- 2.24s
2026-01-06 00:35:12.442784 | orchestrator | osism.commons.network : Remove ifupdown package ------------------------- 1.98s
2026-01-06 00:35:12.442795 | orchestrator | osism.commons.network : Remove netplan configuration template ----------- 1.85s
2026-01-06 00:35:12.442806 | orchestrator | osism.commons.network : Copy netplan configuration ---------------------- 1.80s
2026-01-06 00:35:12.442817 | orchestrator | osism.commons.network : Manage service networkd-dispatcher -------------- 1.61s
2026-01-06 00:35:12.442828 | orchestrator | osism.commons.network : Disable and stop network-extra-init service ----- 1.45s
2026-01-06 00:35:12.442839 | orchestrator | osism.commons.network : Remove network-extra-init systemd service ------- 1.39s
2026-01-06 00:35:12.442850 | orchestrator | osism.commons.network : Include network extra init ---------------------- 1.36s
2026-01-06 00:35:12.442860 | orchestrator | osism.commons.network : Include type specific tasks --------------------- 1.25s
2026-01-06 00:35:12.442871 | orchestrator | osism.commons.network : Check if path for interface file exists --------- 1.21s
2026-01-06 00:35:12.442882 | orchestrator | osism.commons.network : List existing configuration files --------------- 1.15s
2026-01-06 00:35:12.442893 | orchestrator | osism.commons.network : Remove unused configuration files --------------- 1.12s
2026-01-06 00:35:12.442908 | orchestrator | osism.commons.network : Create required directories --------------------- 1.11s
2026-01-06 00:35:12.442919 | orchestrator | osism.commons.network : Include networkd cleanup tasks ------------------ 1.11s
2026-01-06 00:35:12.838152 | orchestrator | + osism apply wireguard
2026-01-06 00:35:24.922124 | orchestrator | 2026-01-06 00:35:24 | INFO  | Task c11ad5bc-37d0-4987-bc4f-7c3d15dd44c3 (wireguard) was prepared for execution.
2026-01-06 00:35:24.922295 | orchestrator | 2026-01-06 00:35:24 | INFO  | It takes a moment until task c11ad5bc-37d0-4987-bc4f-7c3d15dd44c3 (wireguard) has been started and output is visible here.
2026-01-06 00:35:45.158529 | orchestrator |
2026-01-06 00:35:45.158661 | orchestrator | PLAY [Apply role wireguard] ****************************************************
2026-01-06 00:35:45.158678 | orchestrator |
2026-01-06 00:35:45.158690 | orchestrator | TASK [osism.services.wireguard : Install iptables package] *********************
2026-01-06 00:35:45.158702 | orchestrator | Tuesday 06 January 2026 00:35:29 +0000 (0:00:00.199) 0:00:00.199 *******
2026-01-06 00:35:45.158714 | orchestrator | ok: [testbed-manager]
2026-01-06 00:35:45.158726 | orchestrator |
2026-01-06 00:35:45.158737 | orchestrator | TASK [osism.services.wireguard : Install wireguard package] ********************
2026-01-06 00:35:45.158811 | orchestrator | Tuesday 06 January 2026 00:35:30 +0000 (0:00:01.360) 0:00:01.560 *******
2026-01-06 00:35:45.158824 | orchestrator | changed: [testbed-manager]
2026-01-06 00:35:45.158835 | orchestrator |
2026-01-06 00:35:45.158846 | orchestrator | TASK [osism.services.wireguard : Create public and private key - server] *******
2026-01-06 00:35:45.158858 | orchestrator | Tuesday 06 January 2026 00:35:37 +0000 (0:00:06.525) 0:00:08.086 *******
2026-01-06 00:35:45.158869 | orchestrator | changed: [testbed-manager]
2026-01-06 00:35:45.158880 | orchestrator |
2026-01-06 00:35:45.158891 | orchestrator | TASK [osism.services.wireguard : Create preshared key] *************************
2026-01-06 00:35:45.158902 | orchestrator | Tuesday 06 January 2026 00:35:37 +0000 (0:00:00.611) 0:00:08.697 *******
2026-01-06 00:35:45.158913 | orchestrator | changed: [testbed-manager]
2026-01-06 00:35:45.158923 | orchestrator |
2026-01-06 00:35:45.158934 | orchestrator | TASK [osism.services.wireguard : Get preshared key] ****************************
2026-01-06 00:35:45.158945 | orchestrator | Tuesday 06 January 2026 00:35:38 +0000 (0:00:00.460) 0:00:09.158 *******
2026-01-06 00:35:45.158956 | orchestrator | ok: [testbed-manager]
2026-01-06 00:35:45.158967 | orchestrator |
2026-01-06 00:35:45.159005 | orchestrator | TASK [osism.services.wireguard : Get public key - server] **********************
2026-01-06 00:35:45.159018 | orchestrator | Tuesday 06 January 2026 00:35:38 +0000 (0:00:00.703) 0:00:09.861 *******
2026-01-06 00:35:45.159028 | orchestrator | ok: [testbed-manager]
2026-01-06 00:35:45.159039 | orchestrator |
2026-01-06 00:35:45.159052 | orchestrator | TASK [osism.services.wireguard : Get private key - server] *********************
2026-01-06 00:35:45.159066 | orchestrator | Tuesday 06 January 2026 00:35:39 +0000 (0:00:00.456) 0:00:10.317 *******
2026-01-06 00:35:45.159079 | orchestrator | ok: [testbed-manager]
2026-01-06 00:35:45.159091 | orchestrator |
2026-01-06 00:35:45.159104 | orchestrator | TASK [osism.services.wireguard : Copy wg0.conf configuration file] *************
2026-01-06 00:35:45.159116 | orchestrator | Tuesday 06 January 2026 00:35:39 +0000 (0:00:00.437) 0:00:10.755 *******
2026-01-06 00:35:45.159129 | orchestrator | changed: [testbed-manager]
2026-01-06 00:35:45.159142 | orchestrator |
2026-01-06 00:35:45.159154 | orchestrator | TASK [osism.services.wireguard : Copy client configuration files] **************
2026-01-06 00:35:45.159168 | orchestrator | Tuesday 06 January 2026 00:35:40 +0000 (0:00:01.291) 0:00:12.047 *******
2026-01-06 00:35:45.159181 | orchestrator | changed: [testbed-manager] => (item=None)
2026-01-06 00:35:45.159193 | orchestrator | changed: [testbed-manager]
2026-01-06 00:35:45.159206 | orchestrator |
2026-01-06 00:35:45.159219 | orchestrator | TASK [osism.services.wireguard : Manage wg-quick@wg0.service service] **********
2026-01-06 00:35:45.159232 | orchestrator | Tuesday 06 January 2026 00:35:41 +0000 (0:00:01.004) 0:00:13.051 *******
2026-01-06 00:35:45.159244 | orchestrator | changed: [testbed-manager]
2026-01-06 00:35:45.159257 | orchestrator |
2026-01-06 00:35:45.159269 | orchestrator | RUNNING HANDLER [osism.services.wireguard : Restart wg0 service] ***************
2026-01-06 00:35:45.159282 | orchestrator | Tuesday 06 January 2026 00:35:43 +0000 (0:00:01.779) 0:00:14.831 *******
2026-01-06 00:35:45.159295 | orchestrator | changed: [testbed-manager]
2026-01-06 00:35:45.159308 | orchestrator |
2026-01-06 00:35:45.159321 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:35:45.159335 | orchestrator | testbed-manager : ok=11  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:35:45.159381 | orchestrator |
2026-01-06 00:35:45.159394 | orchestrator |
2026-01-06 00:35:45.159434 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:35:45.159446 | orchestrator | Tuesday 06 January 2026 00:35:44 +0000 (0:00:00.951) 0:00:15.782 *******
2026-01-06 00:35:45.159457 | orchestrator | ===============================================================================
2026-01-06 00:35:45.159468 | orchestrator | osism.services.wireguard : Install wireguard package -------------------- 6.53s
2026-01-06 00:35:45.159479 | orchestrator | osism.services.wireguard : Manage wg-quick@wg0.service service ---------- 1.78s
2026-01-06 00:35:45.159489 | orchestrator | osism.services.wireguard : Install iptables package --------------------- 1.36s
2026-01-06 00:35:45.159500 | orchestrator | osism.services.wireguard : Copy wg0.conf configuration file ------------- 1.29s
2026-01-06 00:35:45.159511 | orchestrator | osism.services.wireguard : Copy client configuration files -------------- 1.00s
2026-01-06 00:35:45.159522 | orchestrator | osism.services.wireguard : Restart wg0 service -------------------------- 0.95s
2026-01-06 00:35:45.159532 | orchestrator | osism.services.wireguard : Get preshared key ---------------------------- 0.70s
2026-01-06 00:35:45.159543 | orchestrator | osism.services.wireguard : Create public and private key - server ------- 0.61s
2026-01-06 00:35:45.159554 | orchestrator | osism.services.wireguard : Create preshared key ------------------------- 0.46s
2026-01-06 00:35:45.159564 | orchestrator | osism.services.wireguard : Get public key - server ---------------------- 0.46s
2026-01-06 00:35:45.159575 | orchestrator | osism.services.wireguard : Get private key - server --------------------- 0.44s
2026-01-06 00:35:45.494884 | orchestrator | + sh -c /opt/configuration/scripts/prepare-wireguard-configuration.sh
2026-01-06 00:35:45.538290 | orchestrator | % Total % Received % Xferd Average Speed Time Time Time Current
2026-01-06 00:35:45.538388 | orchestrator | Dload Upload Total Spent Left Speed
2026-01-06 00:35:45.613492 | orchestrator | 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 15 100 15 0 0 198 0 --:--:-- --:--:-- --:--:-- 200
2026-01-06 00:35:45.630470 | orchestrator | + osism apply --environment custom workarounds
2026-01-06 00:35:47.802470 | orchestrator | 2026-01-06 00:35:47 | INFO  | Trying to run play workarounds in environment custom
2026-01-06 00:35:58.116668 | orchestrator | 2026-01-06 00:35:58 | INFO  | Task 19b9be03-1cd7-4621-bc75-af5a7c1eea63 (workarounds) was prepared for execution.
2026-01-06 00:35:58.116795 | orchestrator | 2026-01-06 00:35:58 | INFO  | It takes a moment until task 19b9be03-1cd7-4621-bc75-af5a7c1eea63 (workarounds) has been started and output is visible here.
2026-01-06 00:36:25.009001 | orchestrator |
2026-01-06 00:36:25.009088 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-01-06 00:36:25.009098 | orchestrator |
2026-01-06 00:36:25.009105 | orchestrator | TASK [Group hosts based on virtualization_role] ********************************
2026-01-06 00:36:25.009112 | orchestrator | Tuesday 06 January 2026 00:36:02 +0000 (0:00:00.148) 0:00:00.148 *******
2026-01-06 00:36:25.009119 | orchestrator | changed: [testbed-node-0] => (item=virtualization_role_guest)
2026-01-06 00:36:25.009126 | orchestrator | changed: [testbed-node-1] => (item=virtualization_role_guest)
2026-01-06 00:36:25.009133 | orchestrator | changed: [testbed-node-2] => (item=virtualization_role_guest)
2026-01-06 00:36:25.009140 | orchestrator | changed: [testbed-node-3] => (item=virtualization_role_guest)
2026-01-06 00:36:25.009147 | orchestrator | changed: [testbed-node-4] => (item=virtualization_role_guest)
2026-01-06 00:36:25.009154 | orchestrator | changed: [testbed-node-5] => (item=virtualization_role_guest)
2026-01-06 00:36:25.009161 | orchestrator | changed: [testbed-manager] => (item=virtualization_role_guest)
2026-01-06 00:36:25.009167 | orchestrator |
2026-01-06 00:36:25.009174 | orchestrator | PLAY [Apply netplan configuration on the manager node] *************************
2026-01-06 00:36:25.009199 | orchestrator |
2026-01-06 00:36:25.009206 | orchestrator | TASK [Apply netplan configuration] *********************************************
2026-01-06 00:36:25.009213 | orchestrator | Tuesday 06 January 2026 00:36:03 +0000 (0:00:00.882) 0:00:01.030 *******
2026-01-06 00:36:25.009220 | orchestrator | ok: [testbed-manager]
2026-01-06 00:36:25.009227 | orchestrator |
2026-01-06 00:36:25.009234 | orchestrator | PLAY [Apply netplan configuration on all other nodes] **************************
2026-01-06 00:36:25.009241 | orchestrator |
2026-01-06 00:36:25.009248 | orchestrator | TASK [Apply netplan configuration] *********************************************
2026-01-06 00:36:25.009254 | orchestrator | Tuesday 06 January 2026 00:36:06 +0000 (0:00:02.651) 0:00:03.681 *******
2026-01-06 00:36:25.009261 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:36:25.009268 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:36:25.009274 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:36:25.009281 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:36:25.009287 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:36:25.009293 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:36:25.009299 | orchestrator |
2026-01-06 00:36:25.009305 | orchestrator | PLAY [Add custom CA certificates to non-manager nodes] *************************
2026-01-06 00:36:25.009312 | orchestrator |
2026-01-06 00:36:25.009319 | orchestrator | TASK [Copy custom CA certificates] *********************************************
2026-01-06 00:36:25.009326 | orchestrator | Tuesday 06 January 2026 00:36:08 +0000 (0:00:01.924) 0:00:05.606 *******
2026-01-06 00:36:25.009333 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2026-01-06 00:36:25.009340 | orchestrator | changed: [testbed-node-3] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2026-01-06 00:36:25.009346 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2026-01-06 00:36:25.009352 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2026-01-06 00:36:25.009358 | orchestrator | changed: [testbed-node-4] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2026-01-06 00:36:25.009364 | orchestrator | changed: [testbed-node-5] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2026-01-06 00:36:25.009370 | orchestrator |
2026-01-06 00:36:25.009376 | orchestrator | TASK [Run update-ca-certificates] **********************************************
2026-01-06 00:36:25.009382 | orchestrator | Tuesday 06 January 2026 00:36:09 +0000 (0:00:01.583) 0:00:07.190 *******
2026-01-06 00:36:25.009389 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:36:25.009395 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:36:25.009402 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:36:25.009408 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:36:25.009414 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:36:25.009420 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:36:25.009426 | orchestrator |
2026-01-06 00:36:25.009432 | orchestrator | TASK [Run update-ca-trust] *****************************************************
2026-01-06 00:36:25.009438 | orchestrator | Tuesday 06 January 2026 00:36:13 +0000 (0:00:04.066) 0:00:11.256 *******
2026-01-06 00:36:25.009444 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:36:25.009450 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:36:25.009456 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:36:25.009462 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:36:25.009468 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:36:25.009475 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:36:25.009481 | orchestrator |
2026-01-06 00:36:25.009488 | orchestrator | PLAY [Add a workaround service] ************************************************
2026-01-06 00:36:25.009493 | orchestrator |
2026-01-06 00:36:25.009499 | orchestrator | TASK [Copy workarounds.sh scripts] *********************************************
2026-01-06 00:36:25.009505 | orchestrator | Tuesday 06 January 2026 00:36:14 +0000 (0:00:00.773) 0:00:12.029 *******
2026-01-06 00:36:25.009511 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:36:25.009518 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:36:25.009529 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:36:25.009535 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:36:25.009561 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:36:25.009568 | orchestrator | changed: [testbed-manager]
2026-01-06 00:36:25.009574 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:36:25.009580 | orchestrator |
2026-01-06 00:36:25.009586 | orchestrator | TASK [Copy workarounds systemd unit file] **************************************
2026-01-06 00:36:25.009592 | orchestrator | Tuesday 06 January 2026 00:36:16 +0000 (0:00:01.642) 0:00:13.672 *******
2026-01-06 00:36:25.009598 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:36:25.009604 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:36:25.009610 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:36:25.009615 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:36:25.009621 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:36:25.009628 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:36:25.009648 | orchestrator | changed: [testbed-manager]
2026-01-06 00:36:25.009655 | orchestrator |
2026-01-06 00:36:25.009661 | orchestrator | TASK [Reload systemd daemon] ***************************************************
2026-01-06 00:36:25.009667 | orchestrator | Tuesday 06 January 2026 00:36:17 +0000 (0:00:01.590) 0:00:15.363 *******
2026-01-06 00:36:25.009674 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:36:25.009680 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:36:25.009686 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:36:25.009692 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:36:25.009698 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:36:25.009704 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:36:25.009710 | orchestrator | ok: [testbed-manager]
2026-01-06 00:36:25.009717 | orchestrator |
2026-01-06 00:36:25.009724 | orchestrator | TASK [Enable workarounds.service (Debian)] *************************************
2026-01-06 00:36:25.009730 | orchestrator | Tuesday 06 January 2026 00:36:19 +0000 (0:00:01.590) 0:00:16.954 *******
2026-01-06 00:36:25.009736 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:36:25.009742 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:36:25.009748 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:36:25.009754 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:36:25.009760 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:36:25.009766 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:36:25.009772 | orchestrator | changed: [testbed-manager]
2026-01-06 00:36:25.009778 | orchestrator |
2026-01-06 00:36:25.009784 | orchestrator | TASK [Enable and start workarounds.service (RedHat)] ***************************
2026-01-06 00:36:25.009790 | orchestrator | Tuesday 06 January 2026 00:36:21 +0000 (0:00:01.926) 0:00:18.880 *******
2026-01-06 00:36:25.009797 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:36:25.009804 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:36:25.009810 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:36:25.009816 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:36:25.009822 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:36:25.009828 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:36:25.009834 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:36:25.009847 | orchestrator |
2026-01-06 00:36:25.009853 | orchestrator | PLAY [On Ubuntu 24.04 install python3-docker from Debian Sid] ******************
2026-01-06 00:36:25.009858 | orchestrator |
2026-01-06 00:36:25.009864 | orchestrator | TASK [Install python3-docker] **************************************************
2026-01-06 00:36:25.009870 | orchestrator | Tuesday 06 January 2026 00:36:21 +0000 (0:00:00.618) 0:00:19.499 *******
2026-01-06 00:36:25.009876 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:36:25.009882 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:36:25.009888 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:36:25.009893 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:36:25.009899 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:36:25.009904 | orchestrator | ok: [testbed-manager]
2026-01-06 00:36:25.009910 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:36:25.009915 | orchestrator |
2026-01-06 00:36:25.009921 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:36:25.009933 | orchestrator | testbed-manager : ok=7  changed=4  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-01-06 00:36:25.009940 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-01-06 00:36:25.009945 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-01-06 00:36:25.009964 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-01-06 00:36:25.009970 | orchestrator | testbed-node-3 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-01-06 00:36:25.009976 | orchestrator | testbed-node-4 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-01-06 00:36:25.009982 | orchestrator | testbed-node-5 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-01-06 00:36:25.009988 | orchestrator |
2026-01-06 00:36:25.009994 | orchestrator |
2026-01-06 00:36:25.010000 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:36:25.010007 | orchestrator | Tuesday 06 January 2026 00:36:24 +0000 (0:00:03.055) 0:00:22.555 *******
2026-01-06 00:36:25.010013 | orchestrator | ===============================================================================
2026-01-06 00:36:25.010056 | orchestrator | Run update-ca-certificates ---------------------------------------------- 4.07s
2026-01-06 00:36:25.010062 | orchestrator | Install python3-docker -------------------------------------------------- 3.06s
2026-01-06 00:36:25.010068 | orchestrator | Apply netplan configuration --------------------------------------------- 2.65s
2026-01-06 00:36:25.010073 | orchestrator | Enable workarounds.service (Debian) ------------------------------------- 1.93s
2026-01-06 00:36:25.010079 | orchestrator | Apply netplan configuration --------------------------------------------- 1.92s
2026-01-06 00:36:25.010090 | orchestrator | Copy workarounds systemd unit file -------------------------------------- 1.69s
2026-01-06 00:36:25.010096 | orchestrator | Copy workarounds.sh scripts --------------------------------------------- 1.64s
2026-01-06 00:36:25.010102 | orchestrator | Reload systemd daemon --------------------------------------------------- 1.59s
2026-01-06 00:36:25.010108 | orchestrator | Copy custom CA certificates --------------------------------------------- 1.58s
2026-01-06 00:36:25.010114 | orchestrator | Group hosts based on virtualization_role -------------------------------- 0.88s
2026-01-06 00:36:25.010119 | orchestrator | Run update-ca-trust ----------------------------------------------------- 0.77s
2026-01-06 00:36:25.010131 | orchestrator | Enable and start workarounds.service (RedHat) --------------------------- 0.62s
2026-01-06 00:36:25.828018 | orchestrator | + osism apply reboot -l testbed-nodes -e ireallymeanit=yes
2026-01-06 00:36:38.101540 | orchestrator | 2026-01-06 00:36:38 | INFO  | Task 37b6cbb0-ed14-4f08-8ee6-16bc69ec4cd8 (reboot) was prepared for execution.
2026-01-06 00:36:38.101635 | orchestrator | 2026-01-06 00:36:38 | INFO  | It takes a moment until task 37b6cbb0-ed14-4f08-8ee6-16bc69ec4cd8 (reboot) has been started and output is visible here. 2026-01-06 00:36:48.781584 | orchestrator | 2026-01-06 00:36:48.781713 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-01-06 00:36:48.781729 | orchestrator | 2026-01-06 00:36:48.781741 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-01-06 00:36:48.781752 | orchestrator | Tuesday 06 January 2026 00:36:42 +0000 (0:00:00.251) 0:00:00.251 ******* 2026-01-06 00:36:48.781764 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:36:48.781776 | orchestrator | 2026-01-06 00:36:48.781787 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-01-06 00:36:48.781827 | orchestrator | Tuesday 06 January 2026 00:36:42 +0000 (0:00:00.114) 0:00:00.365 ******* 2026-01-06 00:36:48.781838 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:36:48.781849 | orchestrator | 2026-01-06 00:36:48.781860 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-01-06 00:36:48.781871 | orchestrator | Tuesday 06 January 2026 00:36:43 +0000 (0:00:01.002) 0:00:01.368 ******* 2026-01-06 00:36:48.781882 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:36:48.781893 | orchestrator | 2026-01-06 00:36:48.781903 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-01-06 00:36:48.781914 | orchestrator | 2026-01-06 00:36:48.781925 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-01-06 00:36:48.782085 | orchestrator | Tuesday 06 January 2026 00:36:43 +0000 (0:00:00.137) 0:00:01.505 ******* 2026-01-06 00:36:48.782103 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:36:48.782114 | 
orchestrator | 2026-01-06 00:36:48.782125 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-01-06 00:36:48.782136 | orchestrator | Tuesday 06 January 2026 00:36:43 +0000 (0:00:00.110) 0:00:01.615 ******* 2026-01-06 00:36:48.782147 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:36:48.782158 | orchestrator | 2026-01-06 00:36:48.782169 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-01-06 00:36:48.782180 | orchestrator | Tuesday 06 January 2026 00:36:44 +0000 (0:00:00.664) 0:00:02.279 ******* 2026-01-06 00:36:48.782190 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:36:48.782201 | orchestrator | 2026-01-06 00:36:48.782212 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-01-06 00:36:48.782222 | orchestrator | 2026-01-06 00:36:48.782233 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-01-06 00:36:48.782244 | orchestrator | Tuesday 06 January 2026 00:36:44 +0000 (0:00:00.110) 0:00:02.390 ******* 2026-01-06 00:36:48.782255 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:36:48.782265 | orchestrator | 2026-01-06 00:36:48.782276 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-01-06 00:36:48.782286 | orchestrator | Tuesday 06 January 2026 00:36:44 +0000 (0:00:00.253) 0:00:02.643 ******* 2026-01-06 00:36:48.782297 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:36:48.782308 | orchestrator | 2026-01-06 00:36:48.782318 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-01-06 00:36:48.782329 | orchestrator | Tuesday 06 January 2026 00:36:45 +0000 (0:00:00.694) 0:00:03.338 ******* 2026-01-06 00:36:48.782340 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:36:48.782350 | orchestrator | 2026-01-06 00:36:48.782361 | 
orchestrator | PLAY [Reboot systems] ********************************************************** 2026-01-06 00:36:48.782372 | orchestrator | 2026-01-06 00:36:48.782382 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-01-06 00:36:48.782393 | orchestrator | Tuesday 06 January 2026 00:36:45 +0000 (0:00:00.119) 0:00:03.458 ******* 2026-01-06 00:36:48.782403 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:36:48.782414 | orchestrator | 2026-01-06 00:36:48.782425 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-01-06 00:36:48.782435 | orchestrator | Tuesday 06 January 2026 00:36:45 +0000 (0:00:00.106) 0:00:03.564 ******* 2026-01-06 00:36:48.782446 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:36:48.782456 | orchestrator | 2026-01-06 00:36:48.782467 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-01-06 00:36:48.782478 | orchestrator | Tuesday 06 January 2026 00:36:46 +0000 (0:00:00.686) 0:00:04.251 ******* 2026-01-06 00:36:48.782489 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:36:48.782500 | orchestrator | 2026-01-06 00:36:48.782510 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-01-06 00:36:48.782521 | orchestrator | 2026-01-06 00:36:48.782532 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-01-06 00:36:48.782553 | orchestrator | Tuesday 06 January 2026 00:36:46 +0000 (0:00:00.118) 0:00:04.369 ******* 2026-01-06 00:36:48.782564 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:36:48.782575 | orchestrator | 2026-01-06 00:36:48.782585 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-01-06 00:36:48.782613 | orchestrator | Tuesday 06 January 2026 00:36:46 +0000 (0:00:00.118) 0:00:04.487 ******* 2026-01-06 
00:36:48.782624 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:36:48.782635 | orchestrator | 2026-01-06 00:36:48.782645 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-01-06 00:36:48.782656 | orchestrator | Tuesday 06 January 2026 00:36:47 +0000 (0:00:00.698) 0:00:05.186 ******* 2026-01-06 00:36:48.782667 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:36:48.782677 | orchestrator | 2026-01-06 00:36:48.782688 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-01-06 00:36:48.782699 | orchestrator | 2026-01-06 00:36:48.782709 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-01-06 00:36:48.782720 | orchestrator | Tuesday 06 January 2026 00:36:47 +0000 (0:00:00.114) 0:00:05.301 ******* 2026-01-06 00:36:48.782731 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:36:48.782741 | orchestrator | 2026-01-06 00:36:48.782752 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-01-06 00:36:48.782763 | orchestrator | Tuesday 06 January 2026 00:36:47 +0000 (0:00:00.116) 0:00:05.417 ******* 2026-01-06 00:36:48.782773 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:36:48.782784 | orchestrator | 2026-01-06 00:36:48.782794 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-01-06 00:36:48.782805 | orchestrator | Tuesday 06 January 2026 00:36:48 +0000 (0:00:00.702) 0:00:06.120 ******* 2026-01-06 00:36:48.782835 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:36:48.782847 | orchestrator | 2026-01-06 00:36:48.782857 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:36:48.782869 | orchestrator | testbed-node-0 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:36:48.782881 | orchestrator | 
testbed-node-1 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:36:48.782892 | orchestrator | testbed-node-2 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:36:48.782903 | orchestrator | testbed-node-3 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:36:48.782913 | orchestrator | testbed-node-4 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:36:48.782924 | orchestrator | testbed-node-5 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:36:48.782973 | orchestrator | 2026-01-06 00:36:48.782987 | orchestrator | 2026-01-06 00:36:48.782998 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:36:48.783009 | orchestrator | Tuesday 06 January 2026 00:36:48 +0000 (0:00:00.044) 0:00:06.164 ******* 2026-01-06 00:36:48.783020 | orchestrator | =============================================================================== 2026-01-06 00:36:48.783031 | orchestrator | Reboot system - do not wait for the reboot to complete ------------------ 4.45s 2026-01-06 00:36:48.783042 | orchestrator | Exit playbook, if user did not mean to reboot systems ------------------- 0.82s 2026-01-06 00:36:48.783053 | orchestrator | Reboot system - wait for the reboot to complete ------------------------- 0.64s 2026-01-06 00:36:49.173412 | orchestrator | + osism apply wait-for-connection -l testbed-nodes -e ireallymeanit=yes 2026-01-06 00:37:01.383559 | orchestrator | 2026-01-06 00:37:01 | INFO  | Task c96532ee-a8ea-4daa-b273-1870a3142cc0 (wait-for-connection) was prepared for execution. 2026-01-06 00:37:01.383707 | orchestrator | 2026-01-06 00:37:01 | INFO  | It takes a moment until task c96532ee-a8ea-4daa-b273-1870a3142cc0 (wait-for-connection) has been started and output is visible here. 
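The reboot play above intentionally does not wait for the nodes ("do not wait for the reboot to complete"); reachability is verified afterwards by the separate `wait-for-connection` play. That play uses Ansible's `wait_for_connection` module, but the polling it performs can be sketched in plain shell. This is an illustration only: `PROBE_CMD` and the default ssh options are assumptions added here, not taken from the job.

```shell
#!/usr/bin/env bash
# Minimal shell analogue of waiting for a rebooted host to come back.
# PROBE_CMD is an injection point added for this sketch (the real play
# uses Ansible's wait_for_connection module, not ssh polling).
wait_until_reachable() {
    local host=$1 timeout=${2:-300} interval=${3:-5}
    local deadline=$(( SECONDS + timeout ))
    # Retry the probe until it exits 0 (host answers again after reboot).
    until ${PROBE_CMD:-ssh -o BatchMode=yes -o ConnectTimeout=5} "$host" true 2>/dev/null; do
        if (( SECONDS >= deadline )); then
            echo "$host still unreachable after ${timeout}s" >&2
            return 1
        fi
        sleep "$interval"
    done
}
```

In the run above the same pattern succeeds once per node, which is why all six `testbed-node-*` recaps for `wait-for-connection` report `ok=1 changed=0`.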
2026-01-06 00:37:17.900316 | orchestrator | 2026-01-06 00:37:17.900441 | orchestrator | PLAY [Wait until remote systems are reachable] ********************************* 2026-01-06 00:37:17.900456 | orchestrator | 2026-01-06 00:37:17.900466 | orchestrator | TASK [Wait until remote system is reachable] *********************************** 2026-01-06 00:37:17.900476 | orchestrator | Tuesday 06 January 2026 00:37:05 +0000 (0:00:00.240) 0:00:00.240 ******* 2026-01-06 00:37:17.900485 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:37:17.900495 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:37:17.900503 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:37:17.900512 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:37:17.900521 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:37:17.900529 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:37:17.900538 | orchestrator | 2026-01-06 00:37:17.900547 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:37:17.900557 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:37:17.900567 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:37:17.900576 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:37:17.900585 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:37:17.900616 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:37:17.900625 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:37:17.900634 | orchestrator | 2026-01-06 00:37:17.900643 | orchestrator | 2026-01-06 00:37:17.900652 | orchestrator | TASKS RECAP 
******************************************************************** 2026-01-06 00:37:17.900660 | orchestrator | Tuesday 06 January 2026 00:37:17 +0000 (0:00:11.641) 0:00:11.881 ******* 2026-01-06 00:37:17.900669 | orchestrator | =============================================================================== 2026-01-06 00:37:17.900678 | orchestrator | Wait until remote system is reachable ---------------------------------- 11.64s 2026-01-06 00:37:18.306395 | orchestrator | + osism apply hddtemp 2026-01-06 00:37:30.511335 | orchestrator | 2026-01-06 00:37:30 | INFO  | Task 7aff339f-799d-4ae9-8df3-79dc975168ae (hddtemp) was prepared for execution. 2026-01-06 00:37:30.511460 | orchestrator | 2026-01-06 00:37:30 | INFO  | It takes a moment until task 7aff339f-799d-4ae9-8df3-79dc975168ae (hddtemp) has been started and output is visible here. 2026-01-06 00:37:59.907927 | orchestrator | 2026-01-06 00:37:59.908052 | orchestrator | PLAY [Apply role hddtemp] ****************************************************** 2026-01-06 00:37:59.908068 | orchestrator | 2026-01-06 00:37:59.908080 | orchestrator | TASK [osism.services.hddtemp : Gather variables for each operating system] ***** 2026-01-06 00:37:59.908092 | orchestrator | Tuesday 06 January 2026 00:37:35 +0000 (0:00:00.262) 0:00:00.262 ******* 2026-01-06 00:37:59.908104 | orchestrator | ok: [testbed-manager] 2026-01-06 00:37:59.908115 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:37:59.908126 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:37:59.908138 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:37:59.908149 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:37:59.908159 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:37:59.908170 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:37:59.908181 | orchestrator | 2026-01-06 00:37:59.908192 | orchestrator | TASK [osism.services.hddtemp : Include distribution specific install tasks] **** 2026-01-06 00:37:59.908232 | orchestrator | Tuesday 06 January 2026 
00:37:35 +0000 (0:00:00.736) 0:00:00.998 ******* 2026-01-06 00:37:59.908246 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:37:59.908259 | orchestrator | 2026-01-06 00:37:59.908270 | orchestrator | TASK [osism.services.hddtemp : Remove hddtemp package] ************************* 2026-01-06 00:37:59.908281 | orchestrator | Tuesday 06 January 2026 00:37:37 +0000 (0:00:01.263) 0:00:02.262 ******* 2026-01-06 00:37:59.908292 | orchestrator | ok: [testbed-manager] 2026-01-06 00:37:59.908303 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:37:59.908314 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:37:59.908324 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:37:59.908335 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:37:59.908346 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:37:59.908357 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:37:59.908367 | orchestrator | 2026-01-06 00:37:59.908378 | orchestrator | TASK [osism.services.hddtemp : Enable Kernel Module drivetemp] ***************** 2026-01-06 00:37:59.908389 | orchestrator | Tuesday 06 January 2026 00:37:39 +0000 (0:00:02.094) 0:00:04.356 ******* 2026-01-06 00:37:59.908400 | orchestrator | changed: [testbed-manager] 2026-01-06 00:37:59.908412 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:37:59.908425 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:37:59.908437 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:37:59.908450 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:37:59.908463 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:37:59.908475 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:37:59.908488 | orchestrator | 2026-01-06 00:37:59.908502 | orchestrator | TASK [osism.services.hddtemp : Check if drivetemp module is 
available] ********* 2026-01-06 00:37:59.908513 | orchestrator | Tuesday 06 January 2026 00:37:40 +0000 (0:00:01.079) 0:00:05.436 ******* 2026-01-06 00:37:59.908524 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:37:59.908535 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:37:59.908546 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:37:59.908556 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:37:59.908567 | orchestrator | ok: [testbed-manager] 2026-01-06 00:37:59.908578 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:37:59.908589 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:37:59.908599 | orchestrator | 2026-01-06 00:37:59.908610 | orchestrator | TASK [osism.services.hddtemp : Load Kernel Module drivetemp] ******************* 2026-01-06 00:37:59.908621 | orchestrator | Tuesday 06 January 2026 00:37:41 +0000 (0:00:01.154) 0:00:06.590 ******* 2026-01-06 00:37:59.908632 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:37:59.908643 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:37:59.908654 | orchestrator | changed: [testbed-manager] 2026-01-06 00:37:59.908664 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:37:59.908675 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:37:59.908686 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:37:59.908697 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:37:59.908707 | orchestrator | 2026-01-06 00:37:59.908718 | orchestrator | TASK [osism.services.hddtemp : Install lm-sensors] ***************************** 2026-01-06 00:37:59.908729 | orchestrator | Tuesday 06 January 2026 00:37:42 +0000 (0:00:00.792) 0:00:07.383 ******* 2026-01-06 00:37:59.908739 | orchestrator | changed: [testbed-manager] 2026-01-06 00:37:59.908751 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:37:59.908762 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:37:59.908773 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:37:59.908783 | orchestrator | changed: 
[testbed-node-4] 2026-01-06 00:37:59.908794 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:37:59.908805 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:37:59.908815 | orchestrator | 2026-01-06 00:37:59.908826 | orchestrator | TASK [osism.services.hddtemp : Include distribution specific service tasks] **** 2026-01-06 00:37:59.908863 | orchestrator | Tuesday 06 January 2026 00:37:55 +0000 (0:00:13.496) 0:00:20.879 ******* 2026-01-06 00:37:59.908906 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/service-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:37:59.908918 | orchestrator | 2026-01-06 00:37:59.908930 | orchestrator | TASK [osism.services.hddtemp : Manage lm-sensors service] ********************** 2026-01-06 00:37:59.908940 | orchestrator | Tuesday 06 January 2026 00:37:56 +0000 (0:00:01.127) 0:00:22.007 ******* 2026-01-06 00:37:59.908951 | orchestrator | changed: [testbed-manager] 2026-01-06 00:37:59.908962 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:37:59.908972 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:37:59.908983 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:37:59.908993 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:37:59.909004 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:37:59.909015 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:37:59.909025 | orchestrator | 2026-01-06 00:37:59.909036 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:37:59.909047 | orchestrator | testbed-manager : ok=9  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:37:59.909078 | orchestrator | testbed-node-0 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:37:59.909090 | orchestrator | testbed-node-1 : 
ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:37:59.909100 | orchestrator | testbed-node-2 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:37:59.909111 | orchestrator | testbed-node-3 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:37:59.909122 | orchestrator | testbed-node-4 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:37:59.909133 | orchestrator | testbed-node-5 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:37:59.909144 | orchestrator | 2026-01-06 00:37:59.909154 | orchestrator | 2026-01-06 00:37:59.909165 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:37:59.909176 | orchestrator | Tuesday 06 January 2026 00:37:59 +0000 (0:00:02.848) 0:00:24.855 ******* 2026-01-06 00:37:59.909187 | orchestrator | =============================================================================== 2026-01-06 00:37:59.909198 | orchestrator | osism.services.hddtemp : Install lm-sensors ---------------------------- 13.50s 2026-01-06 00:37:59.909209 | orchestrator | osism.services.hddtemp : Manage lm-sensors service ---------------------- 2.85s 2026-01-06 00:37:59.909220 | orchestrator | osism.services.hddtemp : Remove hddtemp package ------------------------- 2.09s 2026-01-06 00:37:59.909230 | orchestrator | osism.services.hddtemp : Include distribution specific install tasks ---- 1.26s 2026-01-06 00:37:59.909241 | orchestrator | osism.services.hddtemp : Check if drivetemp module is available --------- 1.15s 2026-01-06 00:37:59.909252 | orchestrator | osism.services.hddtemp : Include distribution specific service tasks ---- 1.13s 2026-01-06 00:37:59.909263 | orchestrator | osism.services.hddtemp : Enable Kernel Module drivetemp ----------------- 1.08s 2026-01-06 00:37:59.909273 | orchestrator | osism.services.hddtemp : Load 
Kernel Module drivetemp ------------------- 0.79s 2026-01-06 00:37:59.909284 | orchestrator | osism.services.hddtemp : Gather variables for each operating system ----- 0.74s 2026-01-06 00:38:00.144350 | orchestrator | ++ semver latest 7.1.1 2026-01-06 00:38:00.198981 | orchestrator | + [[ -1 -ge 0 ]] 2026-01-06 00:38:00.199085 | orchestrator | + [[ latest == \l\a\t\e\s\t ]] 2026-01-06 00:38:00.199100 | orchestrator | + sudo systemctl restart manager.service 2026-01-06 00:38:13.178129 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2026-01-06 00:38:13.178240 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2026-01-06 00:38:13.178254 | orchestrator | + local max_attempts=60 2026-01-06 00:38:13.178266 | orchestrator | + local name=ceph-ansible 2026-01-06 00:38:13.178276 | orchestrator | + local attempt_num=1 2026-01-06 00:38:13.178287 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:38:13.210195 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:38:13.210271 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:38:13.210279 | orchestrator | + sleep 5 2026-01-06 00:38:18.214070 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:38:18.249154 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:38:18.249259 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:38:18.249275 | orchestrator | + sleep 5 2026-01-06 00:38:23.252640 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:38:23.284526 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:38:23.284633 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:38:23.284649 | orchestrator | + sleep 5 2026-01-06 00:38:28.288957 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:38:28.327818 | orchestrator | + 
[[ unhealthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:38:28.327925 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:38:28.327938 | orchestrator | + sleep 5 2026-01-06 00:38:33.333414 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:38:33.376913 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:38:33.377034 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:38:33.377058 | orchestrator | + sleep 5 2026-01-06 00:38:38.381725 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:38:38.422895 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:38:38.422993 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:38:38.423007 | orchestrator | + sleep 5 2026-01-06 00:38:43.427309 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:38:43.474230 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:38:43.474326 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:38:43.474340 | orchestrator | + sleep 5 2026-01-06 00:38:48.477948 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:38:48.500579 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-01-06 00:38:48.500682 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:38:48.500697 | orchestrator | + sleep 5 2026-01-06 00:38:53.504091 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:38:53.532793 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-01-06 00:38:53.532898 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:38:53.532915 | orchestrator | + sleep 5 2026-01-06 00:38:58.536273 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:38:58.578141 | orchestrator | + [[ starting == 
\h\e\a\l\t\h\y ]] 2026-01-06 00:38:58.578277 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:38:58.578304 | orchestrator | + sleep 5 2026-01-06 00:39:03.583073 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:39:03.620535 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-01-06 00:39:03.620619 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:39:03.620629 | orchestrator | + sleep 5 2026-01-06 00:39:08.625276 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:39:08.657708 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-01-06 00:39:08.671618 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:39:08.671693 | orchestrator | + sleep 5 2026-01-06 00:39:13.661921 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:39:13.702361 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-01-06 00:39:13.702459 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-01-06 00:39:13.702473 | orchestrator | + sleep 5 2026-01-06 00:39:18.708223 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-01-06 00:39:18.750161 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:39:18.750254 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2026-01-06 00:39:18.750270 | orchestrator | + local max_attempts=60 2026-01-06 00:39:18.750314 | orchestrator | + local name=kolla-ansible 2026-01-06 00:39:18.750327 | orchestrator | + local attempt_num=1 2026-01-06 00:39:18.751102 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2026-01-06 00:39:18.788387 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:39:18.788462 | orchestrator | + wait_for_container_healthy 60 osism-ansible 2026-01-06 00:39:18.788475 | orchestrator | + local max_attempts=60 2026-01-06 
00:39:18.788487 | orchestrator | + local name=osism-ansible 2026-01-06 00:39:18.788498 | orchestrator | + local attempt_num=1 2026-01-06 00:39:18.789078 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2026-01-06 00:39:18.824280 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-01-06 00:39:18.824373 | orchestrator | + [[ true == \t\r\u\e ]] 2026-01-06 00:39:18.824387 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2026-01-06 00:39:19.013296 | orchestrator | ARA in ceph-ansible already disabled. 2026-01-06 00:39:19.183235 | orchestrator | ARA in kolla-ansible already disabled. 2026-01-06 00:39:19.339931 | orchestrator | ARA in osism-ansible already disabled. 2026-01-06 00:39:19.484286 | orchestrator | ARA in osism-kubernetes already disabled. 2026-01-06 00:39:19.485599 | orchestrator | + osism apply gather-facts 2026-01-06 00:39:31.827254 | orchestrator | 2026-01-06 00:39:31 | INFO  | Task 426165f0-8706-4bec-9224-47c63f0831b0 (gather-facts) was prepared for execution. 2026-01-06 00:39:31.827377 | orchestrator | 2026-01-06 00:39:31 | INFO  | It takes a moment until task 426165f0-8706-4bec-9224-47c63f0831b0 (gather-facts) has been started and output is visible here. 
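The polling loop traced above (`docker inspect -f '{{.State.Health.Status}}'` every 5 seconds until the container reports `healthy`) can be reconstructed from the xtrace as the helper below. The variable names match the trace; `HEALTH_CMD` and `SLEEP_SECONDS` are indirections added here so the sketch can run without a Docker daemon.

```shell
#!/usr/bin/env bash
# Reconstruction of wait_for_container_healthy from the xtrace above.
# HEALTH_CMD/SLEEP_SECONDS are assumptions added for testability; the job
# calls `/usr/bin/docker inspect -f '{{.State.Health.Status}}'` directly.
HEALTH_CMD=${HEALTH_CMD:-docker_health}
SLEEP_SECONDS=${SLEEP_SECONDS:-5}

docker_health() {
    /usr/bin/docker inspect -f '{{.State.Health.Status}}' "$1"
}

wait_for_container_healthy() {
    local max_attempts=$1
    local name=$2
    local attempt_num=1
    until [[ "$($HEALTH_CMD "$name")" == "healthy" ]]; do
        # Matches the traced `(( attempt_num++ == max_attempts ))` guard:
        # post-increment, so the budget is max_attempts failed checks.
        if (( attempt_num++ == max_attempts )); then
            echo "container $name did not become healthy" >&2
            return 1
        fi
        sleep "$SLEEP_SECONDS"
    done
}
```

In this run `ceph-ansible` needed 14 polls (roughly a minute through the `unhealthy` and `starting` phases after the manager restart) before reporting `healthy`, while `kolla-ansible` and `osism-ansible` passed on their first check.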
2026-01-06 00:39:45.926146 | orchestrator | 2026-01-06 00:39:45.926304 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2026-01-06 00:39:45.926334 | orchestrator | 2026-01-06 00:39:45.926355 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2026-01-06 00:39:45.926375 | orchestrator | Tuesday 06 January 2026 00:39:36 +0000 (0:00:00.219) 0:00:00.219 ******* 2026-01-06 00:39:45.926394 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:39:45.926413 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:39:45.926431 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:39:45.926449 | orchestrator | ok: [testbed-manager] 2026-01-06 00:39:45.926467 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:39:45.926487 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:39:45.926505 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:39:45.926523 | orchestrator | 2026-01-06 00:39:45.926541 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2026-01-06 00:39:45.926559 | orchestrator | 2026-01-06 00:39:45.926577 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2026-01-06 00:39:45.926597 | orchestrator | Tuesday 06 January 2026 00:39:44 +0000 (0:00:08.545) 0:00:08.765 ******* 2026-01-06 00:39:45.926616 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:39:45.926635 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:39:45.926653 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:39:45.926703 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:39:45.926725 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:39:45.926743 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:39:45.926762 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:39:45.926781 | orchestrator | 2026-01-06 00:39:45.926799 | orchestrator | PLAY RECAP 
********************************************************************* 2026-01-06 00:39:45.926814 | orchestrator | testbed-manager : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:39:45.926832 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:39:45.926852 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:39:45.926873 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:39:45.926925 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:39:45.926939 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:39:45.926973 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:39:45.926985 | orchestrator | 2026-01-06 00:39:45.926996 | orchestrator | 2026-01-06 00:39:45.927007 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:39:45.927018 | orchestrator | Tuesday 06 January 2026 00:39:45 +0000 (0:00:00.635) 0:00:09.401 ******* 2026-01-06 00:39:45.927028 | orchestrator | =============================================================================== 2026-01-06 00:39:45.927039 | orchestrator | Gathers facts about hosts ----------------------------------------------- 8.55s 2026-01-06 00:39:45.927050 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.64s 2026-01-06 00:39:46.325112 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/001-helpers.sh /usr/local/bin/deploy-helper 2026-01-06 00:39:46.339878 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/500-kubernetes.sh /usr/local/bin/deploy-kubernetes 2026-01-06 00:39:46.355010 | 
orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/510-clusterapi.sh /usr/local/bin/deploy-kubernetes-clusterapi 2026-01-06 00:39:46.375942 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-ansible.sh /usr/local/bin/deploy-ceph-with-ansible 2026-01-06 00:39:46.388435 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-rook.sh /usr/local/bin/deploy-ceph-with-rook 2026-01-06 00:39:46.405877 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/200-infrastructure.sh /usr/local/bin/deploy-infrastructure 2026-01-06 00:39:46.421188 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/300-openstack.sh /usr/local/bin/deploy-openstack 2026-01-06 00:39:46.436287 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/400-monitoring.sh /usr/local/bin/deploy-monitoring 2026-01-06 00:39:46.459476 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/500-kubernetes.sh /usr/local/bin/upgrade-kubernetes 2026-01-06 00:39:46.477160 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/510-clusterapi.sh /usr/local/bin/upgrade-kubernetes-clusterapi 2026-01-06 00:39:46.491003 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-ansible.sh /usr/local/bin/upgrade-ceph-with-ansible 2026-01-06 00:39:46.509544 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-rook.sh /usr/local/bin/upgrade-ceph-with-rook 2026-01-06 00:39:46.524711 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/200-infrastructure.sh /usr/local/bin/upgrade-infrastructure 2026-01-06 00:39:46.545382 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/300-openstack.sh /usr/local/bin/upgrade-openstack 2026-01-06 00:39:46.565265 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/400-monitoring.sh /usr/local/bin/upgrade-monitoring 2026-01-06 00:39:46.586311 | orchestrator | + sudo ln -sf 
/opt/configuration/scripts/bootstrap/300-openstack.sh /usr/local/bin/bootstrap-openstack 2026-01-06 00:39:46.607414 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/301-openstack-octavia-amhpora-image.sh /usr/local/bin/bootstrap-octavia 2026-01-06 00:39:46.625009 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/302-openstack-k8s-clusterapi-images.sh /usr/local/bin/bootstrap-clusterapi 2026-01-06 00:39:46.645686 | orchestrator | + sudo ln -sf /opt/configuration/scripts/disable-local-registry.sh /usr/local/bin/disable-local-registry 2026-01-06 00:39:46.664603 | orchestrator | + sudo ln -sf /opt/configuration/scripts/pull-images.sh /usr/local/bin/pull-images 2026-01-06 00:39:46.687420 | orchestrator | + [[ false == \t\r\u\e ]] 2026-01-06 00:39:46.852872 | orchestrator | ok: Runtime: 0:24:52.913808 2026-01-06 00:39:46.954043 | 2026-01-06 00:39:46.954227 | TASK [Deploy services] 2026-01-06 00:39:47.489761 | orchestrator | skipping: Conditional result was False 2026-01-06 00:39:47.509620 | 2026-01-06 00:39:47.509824 | TASK [Deploy in a nutshell] 2026-01-06 00:39:48.254646 | orchestrator | + set -e 2026-01-06 00:39:48.254798 | orchestrator | + source /opt/configuration/scripts/include.sh 2026-01-06 00:39:48.254811 | orchestrator | ++ export INTERACTIVE=false 2026-01-06 00:39:48.254820 | orchestrator | ++ INTERACTIVE=false 2026-01-06 00:39:48.254825 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2026-01-06 00:39:48.254830 | orchestrator | ++ OSISM_APPLY_RETRY=1 2026-01-06 00:39:48.254836 | orchestrator | + source /opt/manager-vars.sh 2026-01-06 00:39:48.254857 | orchestrator | ++ export NUMBER_OF_NODES=6 2026-01-06 00:39:48.254869 | orchestrator | ++ NUMBER_OF_NODES=6 2026-01-06 00:39:48.254875 | orchestrator | ++ export CEPH_VERSION=reef 2026-01-06 00:39:48.254881 | orchestrator | ++ CEPH_VERSION=reef 2026-01-06 00:39:48.254894 | orchestrator | ++ export CONFIGURATION_VERSION=main 2026-01-06 00:39:48.254902 | orchestrator | ++ 
CONFIGURATION_VERSION=main 2026-01-06 00:39:48.254906 | orchestrator | ++ export MANAGER_VERSION=latest 2026-01-06 00:39:48.254914 | orchestrator | ++ MANAGER_VERSION=latest 2026-01-06 00:39:48.254918 | orchestrator | ++ export OPENSTACK_VERSION=2025.1 2026-01-06 00:39:48.254924 | orchestrator | ++ OPENSTACK_VERSION=2025.1 2026-01-06 00:39:48.254927 | orchestrator | ++ export ARA=false 2026-01-06 00:39:48.254932 | orchestrator | ++ ARA=false 2026-01-06 00:39:48.254935 | orchestrator | ++ export DEPLOY_MODE=manager 2026-01-06 00:39:48.254940 | orchestrator | ++ DEPLOY_MODE=manager 2026-01-06 00:39:48.254944 | orchestrator | ++ export TEMPEST=true 2026-01-06 00:39:48.254948 | orchestrator | ++ TEMPEST=true 2026-01-06 00:39:48.254951 | orchestrator | ++ export IS_ZUUL=true 2026-01-06 00:39:48.254955 | orchestrator | ++ IS_ZUUL=true 2026-01-06 00:39:48.254959 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.192.205 2026-01-06 00:39:48.254963 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.192.205 2026-01-06 00:39:48.254967 | orchestrator | ++ export EXTERNAL_API=false 2026-01-06 00:39:48.254970 | orchestrator | ++ EXTERNAL_API=false 2026-01-06 00:39:48.254974 | orchestrator | ++ export IMAGE_USER=ubuntu 2026-01-06 00:39:48.254978 | orchestrator | ++ IMAGE_USER=ubuntu 2026-01-06 00:39:48.254982 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2026-01-06 00:39:48.254985 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2026-01-06 00:39:48.254989 | orchestrator | ++ export CEPH_STACK=ceph-ansible 2026-01-06 00:39:48.254993 | orchestrator | ++ CEPH_STACK=ceph-ansible 2026-01-06 00:39:48.255038 | orchestrator | 2026-01-06 00:39:48.255044 | orchestrator | # PULL IMAGES 2026-01-06 00:39:48.255047 | orchestrator | 2026-01-06 00:39:48.255052 | orchestrator | + echo 2026-01-06 00:39:48.255056 | orchestrator | + echo '# PULL IMAGES' 2026-01-06 00:39:48.255060 | orchestrator | + echo 2026-01-06 00:39:48.256253 | orchestrator | ++ semver latest 7.0.0 2026-01-06 
00:39:48.326998 | orchestrator | + [[ -1 -ge 0 ]] 2026-01-06 00:39:48.327081 | orchestrator | + [[ latest == \l\a\t\e\s\t ]] 2026-01-06 00:39:48.327108 | orchestrator | + osism apply --no-wait -r 2 -e custom pull-images 2026-01-06 00:39:50.364799 | orchestrator | 2026-01-06 00:39:50 | INFO  | Trying to run play pull-images in environment custom 2026-01-06 00:40:00.496626 | orchestrator | 2026-01-06 00:40:00 | INFO  | Task 3cca5e2c-5e2f-466b-8cee-d56d38fb4c92 (pull-images) was prepared for execution. 2026-01-06 00:40:00.496800 | orchestrator | 2026-01-06 00:40:00 | INFO  | Task 3cca5e2c-5e2f-466b-8cee-d56d38fb4c92 is running in background. No more output. Check ARA for logs. 2026-01-06 00:40:02.973388 | orchestrator | 2026-01-06 00:40:02 | INFO  | Trying to run play wipe-partitions in environment custom 2026-01-06 00:40:13.053723 | orchestrator | 2026-01-06 00:40:13 | INFO  | Task a71f0e97-54df-4e21-8608-a0f8b08eb2f8 (wipe-partitions) was prepared for execution. 2026-01-06 00:40:13.053853 | orchestrator | 2026-01-06 00:40:13 | INFO  | It takes a moment until task a71f0e97-54df-4e21-8608-a0f8b08eb2f8 (wipe-partitions) has been started and output is visible here. 
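The `semver latest 7.0.0` / `[[ -1 -ge 0 ]]` / `[[ latest == \l\a\t\e\s\t ]]` sequence traced above is a version gate: the step runs when the manager version compares at or above the threshold, or when it is the unparseable tag `latest` (for which `semver` prints `-1`). A sketch under those assumptions; `SEMVER_CMD` is an indirection added here, the job invokes its `semver` helper directly.

```shell
#!/usr/bin/env bash
# Version gate as inferred from the xtrace: `semver A B` prints 1/0/-1 for
# A newer than / equal to / older than B (assumption based on the observed
# `-1` output for "latest").
SEMVER_CMD=${SEMVER_CMD:-semver}

version_gate() {
    local current=$1 minimum=$2
    # Numeric compare mirrors the traced `[[ -1 -ge 0 ]]` test.
    if [[ "$($SEMVER_CMD "$current" "$minimum")" -ge 0 ]]; then
        return 0
    fi
    # "latest" compares as -1 but is still accepted, matching the traced
    # `[[ latest == \l\a\t\e\s\t ]]` fallback.
    [[ "$current" == "latest" ]]
}
```

With `MANAGER_VERSION=latest`, as in this run, the gate passes via the fallback, so `osism apply --no-wait -r 2 -e custom pull-images` is executed.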
2026-01-06 00:40:25.487836 | orchestrator | 2026-01-06 00:40:25.487978 | orchestrator | PLAY [Wipe partitions] ********************************************************* 2026-01-06 00:40:25.487996 | orchestrator | 2026-01-06 00:40:25.488009 | orchestrator | TASK [Find all logical devices owned by UID 167] ******************************* 2026-01-06 00:40:25.488037 | orchestrator | Tuesday 06 January 2026 00:40:16 +0000 (0:00:00.098) 0:00:00.098 ******* 2026-01-06 00:40:25.488052 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:40:25.488065 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:40:25.488076 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:40:25.488088 | orchestrator | 2026-01-06 00:40:25.488099 | orchestrator | TASK [Remove all rook related logical devices] ********************************* 2026-01-06 00:40:25.488173 | orchestrator | Tuesday 06 January 2026 00:40:16 +0000 (0:00:00.609) 0:00:00.708 ******* 2026-01-06 00:40:25.488185 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:40:25.488197 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:40:25.488214 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:40:25.488225 | orchestrator | 2026-01-06 00:40:25.488236 | orchestrator | TASK [Find all logical devices with prefix ceph] ******************************* 2026-01-06 00:40:25.488249 | orchestrator | Tuesday 06 January 2026 00:40:17 +0000 (0:00:00.320) 0:00:01.028 ******* 2026-01-06 00:40:25.488269 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:40:25.488290 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:40:25.488310 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:40:25.488328 | orchestrator | 2026-01-06 00:40:25.488347 | orchestrator | TASK [Remove all ceph related logical devices] ********************************* 2026-01-06 00:40:25.488367 | orchestrator | Tuesday 06 January 2026 00:40:17 +0000 (0:00:00.578) 0:00:01.607 ******* 2026-01-06 00:40:25.488386 | orchestrator | skipping: 
[testbed-node-3] 2026-01-06 00:40:25.488408 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:40:25.488428 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:40:25.488449 | orchestrator | 2026-01-06 00:40:25.488470 | orchestrator | TASK [Check device availability] *********************************************** 2026-01-06 00:40:25.488490 | orchestrator | Tuesday 06 January 2026 00:40:18 +0000 (0:00:00.266) 0:00:01.873 ******* 2026-01-06 00:40:25.488508 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb) 2026-01-06 00:40:25.488527 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb) 2026-01-06 00:40:25.488540 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb) 2026-01-06 00:40:25.488553 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc) 2026-01-06 00:40:25.488565 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc) 2026-01-06 00:40:25.488577 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc) 2026-01-06 00:40:25.488590 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd) 2026-01-06 00:40:25.488602 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd) 2026-01-06 00:40:25.488614 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd) 2026-01-06 00:40:25.488662 | orchestrator | 2026-01-06 00:40:25.488675 | orchestrator | TASK [Wipe partitions with wipefs] ********************************************* 2026-01-06 00:40:25.488690 | orchestrator | Tuesday 06 January 2026 00:40:19 +0000 (0:00:01.234) 0:00:03.108 ******* 2026-01-06 00:40:25.488703 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdb) 2026-01-06 00:40:25.488716 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdb) 2026-01-06 00:40:25.488726 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdb) 2026-01-06 00:40:25.488737 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdc) 2026-01-06 00:40:25.488748 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdc) 2026-01-06 00:40:25.488758 | orchestrator | ok: 
[testbed-node-5] => (item=/dev/sdc) 2026-01-06 00:40:25.488770 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdd) 2026-01-06 00:40:25.488780 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdd) 2026-01-06 00:40:25.488791 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdd) 2026-01-06 00:40:25.488803 | orchestrator | 2026-01-06 00:40:25.488814 | orchestrator | TASK [Overwrite first 32M with zeros] ****************************************** 2026-01-06 00:40:25.488824 | orchestrator | Tuesday 06 January 2026 00:40:20 +0000 (0:00:01.535) 0:00:04.644 ******* 2026-01-06 00:40:25.488835 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb) 2026-01-06 00:40:25.488846 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb) 2026-01-06 00:40:25.488857 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc) 2026-01-06 00:40:25.488868 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc) 2026-01-06 00:40:25.488879 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb) 2026-01-06 00:40:25.488898 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd) 2026-01-06 00:40:25.488909 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd) 2026-01-06 00:40:25.488932 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc) 2026-01-06 00:40:25.488943 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd) 2026-01-06 00:40:25.488954 | orchestrator | 2026-01-06 00:40:25.488966 | orchestrator | TASK [Reload udev rules] ******************************************************* 2026-01-06 00:40:25.488976 | orchestrator | Tuesday 06 January 2026 00:40:23 +0000 (0:00:02.978) 0:00:07.622 ******* 2026-01-06 00:40:25.488987 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:40:25.488998 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:40:25.489009 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:40:25.489020 | orchestrator | 2026-01-06 00:40:25.489030 | orchestrator | TASK [Request device events from the 
kernel] *********************************** 2026-01-06 00:40:25.489041 | orchestrator | Tuesday 06 January 2026 00:40:24 +0000 (0:00:00.640) 0:00:08.263 ******* 2026-01-06 00:40:25.489053 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:40:25.489063 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:40:25.489074 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:40:25.489085 | orchestrator | 2026-01-06 00:40:25.489096 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:40:25.489109 | orchestrator | testbed-node-3 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:40:25.489122 | orchestrator | testbed-node-4 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:40:25.489155 | orchestrator | testbed-node-5 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:40:25.489167 | orchestrator | 2026-01-06 00:40:25.489178 | orchestrator | 2026-01-06 00:40:25.489189 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:40:25.489200 | orchestrator | Tuesday 06 January 2026 00:40:25 +0000 (0:00:00.674) 0:00:08.937 ******* 2026-01-06 00:40:25.489211 | orchestrator | =============================================================================== 2026-01-06 00:40:25.489222 | orchestrator | Overwrite first 32M with zeros ------------------------------------------ 2.98s 2026-01-06 00:40:25.489233 | orchestrator | Wipe partitions with wipefs --------------------------------------------- 1.54s 2026-01-06 00:40:25.489244 | orchestrator | Check device availability ----------------------------------------------- 1.23s 2026-01-06 00:40:25.489254 | orchestrator | Request device events from the kernel ----------------------------------- 0.67s 2026-01-06 00:40:25.489265 | orchestrator | Reload udev rules 
------------------------------------------------------- 0.64s 2026-01-06 00:40:25.489276 | orchestrator | Find all logical devices owned by UID 167 ------------------------------- 0.61s 2026-01-06 00:40:25.489286 | orchestrator | Find all logical devices with prefix ceph ------------------------------- 0.58s 2026-01-06 00:40:25.489297 | orchestrator | Remove all rook related logical devices --------------------------------- 0.32s 2026-01-06 00:40:25.489308 | orchestrator | Remove all ceph related logical devices --------------------------------- 0.27s 2026-01-06 00:40:37.959373 | orchestrator | 2026-01-06 00:40:37 | INFO  | Task 3d30937d-d345-4d93-b2e7-8b9baf4cf01e (facts) was prepared for execution. 2026-01-06 00:40:37.959508 | orchestrator | 2026-01-06 00:40:37 | INFO  | It takes a moment until task 3d30937d-d345-4d93-b2e7-8b9baf4cf01e (facts) has been started and output is visible here. 2026-01-06 00:40:51.365065 | orchestrator | 2026-01-06 00:40:51.365180 | orchestrator | PLAY [Apply role facts] ******************************************************** 2026-01-06 00:40:51.365197 | orchestrator | 2026-01-06 00:40:51.365210 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2026-01-06 00:40:51.365222 | orchestrator | Tuesday 06 January 2026 00:40:42 +0000 (0:00:00.280) 0:00:00.281 ******* 2026-01-06 00:40:51.365234 | orchestrator | ok: [testbed-manager] 2026-01-06 00:40:51.365245 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:40:51.365256 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:40:51.365298 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:40:51.365310 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:40:51.365321 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:40:51.365332 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:40:51.365342 | orchestrator | 2026-01-06 00:40:51.365357 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2026-01-06 
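The wipe-partitions play that just completed boils down to a short per-device sequence. A sketch under stated assumptions: the device list (`/dev/sdb`..`/dev/sdd`) is taken from the play output, and the `DRY_RUN` guard is added here so the commands print instead of destroying data; the real play runs them as root on the nodes:

```shell
# DRY_RUN=1 (default here) only prints the commands; set to 0 on a real node.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "${DRY_RUN}" = 1 ]; then echo "+ $*"; else "$@"; fi; }

# Per-device steps, mirroring the "Wipe partitions with wipefs" and
# "Overwrite first 32M with zeros" tasks above.
wipe_device() {
  run wipefs -a "$1"                                        # clear fs/RAID/LVM signatures
  run dd if=/dev/zero of="$1" bs=1M count=32 oflag=direct   # zero the first 32M
}

for dev in /dev/sdb /dev/sdc /dev/sdd; do   # device list from the play output
  wipe_device "$dev"
done
run udevadm control --reload-rules   # "Reload udev rules"
run udevadm trigger                  # "Request device events from the kernel"
```

The final two `udevadm` steps make sure the kernel and udev re-read the now-empty devices before ceph-ansible enumerates them.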
00:40:51.365385 | orchestrator | Tuesday 06 January 2026 00:40:43 +0000 (0:00:01.179) 0:00:01.460 ******* 2026-01-06 00:40:51.365408 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:40:51.365420 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:40:51.365432 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:40:51.365443 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:40:51.365453 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:40:51.365464 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:40:51.365475 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:40:51.365486 | orchestrator | 2026-01-06 00:40:51.365497 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2026-01-06 00:40:51.365508 | orchestrator | 2026-01-06 00:40:51.365519 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2026-01-06 00:40:51.365530 | orchestrator | Tuesday 06 January 2026 00:40:45 +0000 (0:00:01.389) 0:00:02.849 ******* 2026-01-06 00:40:51.365542 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:40:51.365553 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:40:51.365565 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:40:51.365576 | orchestrator | ok: [testbed-manager] 2026-01-06 00:40:51.365590 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:40:51.365627 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:40:51.365640 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:40:51.365653 | orchestrator | 2026-01-06 00:40:51.365668 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2026-01-06 00:40:51.365680 | orchestrator | 2026-01-06 00:40:51.365693 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2026-01-06 00:40:51.365726 | orchestrator | Tuesday 06 January 2026 00:40:50 +0000 (0:00:05.128) 0:00:07.978 ******* 2026-01-06 00:40:51.365740 | 
orchestrator | skipping: [testbed-manager] 2026-01-06 00:40:51.365753 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:40:51.365766 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:40:51.365779 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:40:51.365791 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:40:51.365804 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:40:51.365817 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:40:51.365830 | orchestrator | 2026-01-06 00:40:51.365843 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:40:51.365856 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:40:51.365870 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:40:51.365882 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:40:51.365894 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:40:51.365905 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:40:51.365916 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:40:51.365927 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:40:51.365938 | orchestrator | 2026-01-06 00:40:51.365958 | orchestrator | 2026-01-06 00:40:51.365970 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:40:51.365981 | orchestrator | Tuesday 06 January 2026 00:40:50 +0000 (0:00:00.568) 0:00:08.547 ******* 2026-01-06 00:40:51.365992 | orchestrator | 
=============================================================================== 2026-01-06 00:40:51.366003 | orchestrator | Gathers facts about hosts ----------------------------------------------- 5.13s 2026-01-06 00:40:51.366075 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.39s 2026-01-06 00:40:51.366088 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.18s 2026-01-06 00:40:51.366099 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.57s 2026-01-06 00:40:53.713234 | orchestrator | 2026-01-06 00:40:53 | INFO  | Task 31a65c50-dec5-4183-a943-86fc501afd95 (ceph-configure-lvm-volumes) was prepared for execution. 2026-01-06 00:40:53.713324 | orchestrator | 2026-01-06 00:40:53 | INFO  | It takes a moment until task 31a65c50-dec5-4183-a943-86fc501afd95 (ceph-configure-lvm-volumes) has been started and output is visible here. 2026-01-06 00:41:04.409910 | orchestrator | [WARNING]: Collection community.general does not support Ansible version 2026-01-06 00:41:04.410077 | orchestrator | 2.16.14 2026-01-06 00:41:04.410095 | orchestrator | 2026-01-06 00:41:04.410102 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2026-01-06 00:41:04.410110 | orchestrator | 2026-01-06 00:41:04.410119 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2026-01-06 00:41:04.410126 | orchestrator | Tuesday 06 January 2026 00:40:57 +0000 (0:00:00.289) 0:00:00.289 ******* 2026-01-06 00:41:04.410134 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2026-01-06 00:41:04.410141 | orchestrator | 2026-01-06 00:41:04.410148 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2026-01-06 00:41:04.410154 | orchestrator | Tuesday 06 January 2026 00:40:58 +0000 (0:00:00.242) 0:00:00.532 ******* 2026-01-06 00:41:04.410161 | 
orchestrator | ok: [testbed-node-3] 2026-01-06 00:41:04.410167 | orchestrator | 2026-01-06 00:41:04.410174 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410180 | orchestrator | Tuesday 06 January 2026 00:40:58 +0000 (0:00:00.195) 0:00:00.727 ******* 2026-01-06 00:41:04.410187 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0) 2026-01-06 00:41:04.410194 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1) 2026-01-06 00:41:04.410201 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2) 2026-01-06 00:41:04.410208 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3) 2026-01-06 00:41:04.410215 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4) 2026-01-06 00:41:04.410221 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5) 2026-01-06 00:41:04.410227 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6) 2026-01-06 00:41:04.410234 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7) 2026-01-06 00:41:04.410240 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda) 2026-01-06 00:41:04.410246 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb) 2026-01-06 00:41:04.410261 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc) 2026-01-06 00:41:04.410268 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd) 2026-01-06 00:41:04.410274 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0) 2026-01-06 00:41:04.410280 | orchestrator | 
2026-01-06 00:41:04.410286 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410316 | orchestrator | Tuesday 06 January 2026 00:40:58 +0000 (0:00:00.435) 0:00:01.163 ******* 2026-01-06 00:41:04.410323 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410331 | orchestrator | 2026-01-06 00:41:04.410337 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410344 | orchestrator | Tuesday 06 January 2026 00:40:58 +0000 (0:00:00.185) 0:00:01.348 ******* 2026-01-06 00:41:04.410349 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410355 | orchestrator | 2026-01-06 00:41:04.410361 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410367 | orchestrator | Tuesday 06 January 2026 00:40:59 +0000 (0:00:00.202) 0:00:01.550 ******* 2026-01-06 00:41:04.410373 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410380 | orchestrator | 2026-01-06 00:41:04.410386 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410396 | orchestrator | Tuesday 06 January 2026 00:40:59 +0000 (0:00:00.209) 0:00:01.760 ******* 2026-01-06 00:41:04.410402 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410408 | orchestrator | 2026-01-06 00:41:04.410415 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410421 | orchestrator | Tuesday 06 January 2026 00:40:59 +0000 (0:00:00.190) 0:00:01.950 ******* 2026-01-06 00:41:04.410427 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410434 | orchestrator | 2026-01-06 00:41:04.410440 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410447 | orchestrator | Tuesday 06 January 2026 00:40:59 +0000 
(0:00:00.181) 0:00:02.132 ******* 2026-01-06 00:41:04.410454 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410461 | orchestrator | 2026-01-06 00:41:04.410468 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410475 | orchestrator | Tuesday 06 January 2026 00:40:59 +0000 (0:00:00.189) 0:00:02.321 ******* 2026-01-06 00:41:04.410481 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410488 | orchestrator | 2026-01-06 00:41:04.410494 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410501 | orchestrator | Tuesday 06 January 2026 00:40:59 +0000 (0:00:00.179) 0:00:02.501 ******* 2026-01-06 00:41:04.410507 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410514 | orchestrator | 2026-01-06 00:41:04.410520 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410526 | orchestrator | Tuesday 06 January 2026 00:41:00 +0000 (0:00:00.185) 0:00:02.686 ******* 2026-01-06 00:41:04.410533 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907) 2026-01-06 00:41:04.410542 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907) 2026-01-06 00:41:04.410548 | orchestrator | 2026-01-06 00:41:04.410554 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410618 | orchestrator | Tuesday 06 January 2026 00:41:00 +0000 (0:00:00.378) 0:00:03.065 ******* 2026-01-06 00:41:04.410627 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_dc9d4d24-a01d-4baf-85b5-da8c88609604) 2026-01-06 00:41:04.410633 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_dc9d4d24-a01d-4baf-85b5-da8c88609604) 2026-01-06 00:41:04.410640 | orchestrator | 2026-01-06 
00:41:04.410646 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410654 | orchestrator | Tuesday 06 January 2026 00:41:01 +0000 (0:00:00.534) 0:00:03.599 ******* 2026-01-06 00:41:04.410660 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_3d039a44-dced-4ba6-a79b-af7290a238ac) 2026-01-06 00:41:04.410667 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_3d039a44-dced-4ba6-a79b-af7290a238ac) 2026-01-06 00:41:04.410674 | orchestrator | 2026-01-06 00:41:04.410682 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410697 | orchestrator | Tuesday 06 January 2026 00:41:01 +0000 (0:00:00.559) 0:00:04.159 ******* 2026-01-06 00:41:04.410704 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_d326b17f-2106-48eb-aaa2-fe8346fab088) 2026-01-06 00:41:04.410711 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_d326b17f-2106-48eb-aaa2-fe8346fab088) 2026-01-06 00:41:04.410718 | orchestrator | 2026-01-06 00:41:04.410725 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:04.410731 | orchestrator | Tuesday 06 January 2026 00:41:02 +0000 (0:00:00.701) 0:00:04.861 ******* 2026-01-06 00:41:04.410738 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001) 2026-01-06 00:41:04.410746 | orchestrator | 2026-01-06 00:41:04.410759 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:04.410766 | orchestrator | Tuesday 06 January 2026 00:41:02 +0000 (0:00:00.321) 0:00:05.183 ******* 2026-01-06 00:41:04.410772 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0) 2026-01-06 00:41:04.410779 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1) 
2026-01-06 00:41:04.410785 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2) 2026-01-06 00:41:04.410792 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3) 2026-01-06 00:41:04.410798 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4) 2026-01-06 00:41:04.410805 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5) 2026-01-06 00:41:04.410811 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6) 2026-01-06 00:41:04.410817 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7) 2026-01-06 00:41:04.410823 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda) 2026-01-06 00:41:04.410829 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb) 2026-01-06 00:41:04.410835 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc) 2026-01-06 00:41:04.410841 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdd) 2026-01-06 00:41:04.410847 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0) 2026-01-06 00:41:04.410854 | orchestrator | 2026-01-06 00:41:04.410860 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:04.410866 | orchestrator | Tuesday 06 January 2026 00:41:03 +0000 (0:00:00.337) 0:00:05.521 ******* 2026-01-06 00:41:04.410872 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410879 | orchestrator | 2026-01-06 00:41:04.410886 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:04.410892 | orchestrator 
| Tuesday 06 January 2026 00:41:03 +0000 (0:00:00.198) 0:00:05.720 ******* 2026-01-06 00:41:04.410898 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410903 | orchestrator | 2026-01-06 00:41:04.410910 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:04.410916 | orchestrator | Tuesday 06 January 2026 00:41:03 +0000 (0:00:00.186) 0:00:05.906 ******* 2026-01-06 00:41:04.410923 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410930 | orchestrator | 2026-01-06 00:41:04.410937 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:04.410943 | orchestrator | Tuesday 06 January 2026 00:41:03 +0000 (0:00:00.189) 0:00:06.096 ******* 2026-01-06 00:41:04.410949 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410955 | orchestrator | 2026-01-06 00:41:04.410961 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:04.410968 | orchestrator | Tuesday 06 January 2026 00:41:03 +0000 (0:00:00.193) 0:00:06.289 ******* 2026-01-06 00:41:04.410981 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.410987 | orchestrator | 2026-01-06 00:41:04.410993 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:04.410999 | orchestrator | Tuesday 06 January 2026 00:41:03 +0000 (0:00:00.213) 0:00:06.503 ******* 2026-01-06 00:41:04.411006 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.411012 | orchestrator | 2026-01-06 00:41:04.411017 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:04.411024 | orchestrator | Tuesday 06 January 2026 00:41:04 +0000 (0:00:00.205) 0:00:06.709 ******* 2026-01-06 00:41:04.411031 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:04.411038 | orchestrator | 2026-01-06 
00:41:04.411053 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:11.870499 | orchestrator | Tuesday 06 January 2026 00:41:04 +0000 (0:00:00.197) 0:00:06.907 ******* 2026-01-06 00:41:11.870691 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:11.870714 | orchestrator | 2026-01-06 00:41:11.870727 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:11.870740 | orchestrator | Tuesday 06 January 2026 00:41:04 +0000 (0:00:00.231) 0:00:07.138 ******* 2026-01-06 00:41:11.870751 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2026-01-06 00:41:11.870764 | orchestrator | ok: [testbed-node-3] => (item=sda14) 2026-01-06 00:41:11.870775 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2026-01-06 00:41:11.870786 | orchestrator | ok: [testbed-node-3] => (item=sda16) 2026-01-06 00:41:11.870797 | orchestrator | 2026-01-06 00:41:11.870808 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:11.870819 | orchestrator | Tuesday 06 January 2026 00:41:05 +0000 (0:00:01.089) 0:00:08.227 ******* 2026-01-06 00:41:11.870830 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:11.870841 | orchestrator | 2026-01-06 00:41:11.870852 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:11.870863 | orchestrator | Tuesday 06 January 2026 00:41:05 +0000 (0:00:00.209) 0:00:08.437 ******* 2026-01-06 00:41:11.870874 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:11.870885 | orchestrator | 2026-01-06 00:41:11.870896 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:11.870907 | orchestrator | Tuesday 06 January 2026 00:41:06 +0000 (0:00:00.229) 0:00:08.666 ******* 2026-01-06 00:41:11.870918 | orchestrator | skipping: [testbed-node-3] 2026-01-06 
00:41:11.870929 | orchestrator | 2026-01-06 00:41:11.870940 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:11.870951 | orchestrator | Tuesday 06 January 2026 00:41:06 +0000 (0:00:00.196) 0:00:08.863 ******* 2026-01-06 00:41:11.870961 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:11.870972 | orchestrator | 2026-01-06 00:41:11.870983 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2026-01-06 00:41:11.870995 | orchestrator | Tuesday 06 January 2026 00:41:06 +0000 (0:00:00.199) 0:00:09.063 ******* 2026-01-06 00:41:11.871007 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': None}) 2026-01-06 00:41:11.871021 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': None}) 2026-01-06 00:41:11.871034 | orchestrator | 2026-01-06 00:41:11.871067 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2026-01-06 00:41:11.871081 | orchestrator | Tuesday 06 January 2026 00:41:06 +0000 (0:00:00.208) 0:00:09.271 ******* 2026-01-06 00:41:11.871094 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:11.871106 | orchestrator | 2026-01-06 00:41:11.871119 | orchestrator | TASK [Generate DB VG names] **************************************************** 2026-01-06 00:41:11.871132 | orchestrator | Tuesday 06 January 2026 00:41:06 +0000 (0:00:00.132) 0:00:09.404 ******* 2026-01-06 00:41:11.871145 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:11.871157 | orchestrator | 2026-01-06 00:41:11.871175 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2026-01-06 00:41:11.871230 | orchestrator | Tuesday 06 January 2026 00:41:07 +0000 (0:00:00.140) 0:00:09.544 ******* 2026-01-06 00:41:11.871252 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:41:11.871272 | orchestrator | 2026-01-06 00:41:11.871292 | orchestrator 
| TASK [Define lvm_volumes structures] *******************************************
2026-01-06 00:41:11.871311 | orchestrator | Tuesday 06 January 2026 00:41:07 +0000 (0:00:00.135) 0:00:09.680 *******
2026-01-06 00:41:11.871330 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:41:11.871345 | orchestrator |
2026-01-06 00:41:11.871356 | orchestrator | TASK [Generate lvm_volumes structure (block only)] *****************************
2026-01-06 00:41:11.871367 | orchestrator | Tuesday 06 January 2026 00:41:07 +0000 (0:00:00.158) 0:00:09.839 *******
2026-01-06 00:41:11.871379 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'd44b25a4-5c87-5b50-a8b5-4ed8c19ba382'}})
2026-01-06 00:41:11.871391 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '1f440738-8941-5354-ae19-38cd939f8e8b'}})
2026-01-06 00:41:11.871402 | orchestrator |
2026-01-06 00:41:11.871413 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] *****************************
2026-01-06 00:41:11.871425 | orchestrator | Tuesday 06 January 2026 00:41:07 +0000 (0:00:00.178) 0:00:10.017 *******
2026-01-06 00:41:11.871437 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'd44b25a4-5c87-5b50-a8b5-4ed8c19ba382'}})
2026-01-06 00:41:11.871456 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '1f440738-8941-5354-ae19-38cd939f8e8b'}})
2026-01-06 00:41:11.871468 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:41:11.871479 | orchestrator |
2026-01-06 00:41:11.871490 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] ****************************
2026-01-06 00:41:11.871501 | orchestrator | Tuesday 06 January 2026 00:41:07 +0000 (0:00:00.170) 0:00:10.187 *******
2026-01-06 00:41:11.871512 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'd44b25a4-5c87-5b50-a8b5-4ed8c19ba382'}})
2026-01-06 00:41:11.871523 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '1f440738-8941-5354-ae19-38cd939f8e8b'}})
2026-01-06 00:41:11.871534 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:41:11.871546 | orchestrator |
2026-01-06 00:41:11.871557 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] ***********************
2026-01-06 00:41:11.871568 | orchestrator | Tuesday 06 January 2026 00:41:08 +0000 (0:00:00.368) 0:00:10.556 *******
2026-01-06 00:41:11.871602 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'd44b25a4-5c87-5b50-a8b5-4ed8c19ba382'}})
2026-01-06 00:41:11.871635 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '1f440738-8941-5354-ae19-38cd939f8e8b'}})
2026-01-06 00:41:11.871647 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:41:11.871658 | orchestrator |
2026-01-06 00:41:11.871669 | orchestrator | TASK [Compile lvm_volumes] *****************************************************
2026-01-06 00:41:11.871687 | orchestrator | Tuesday 06 January 2026 00:41:08 +0000 (0:00:00.159) 0:00:10.716 *******
2026-01-06 00:41:11.871698 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:41:11.871709 | orchestrator |
2026-01-06 00:41:11.871720 | orchestrator | TASK [Set OSD devices config data] *********************************************
2026-01-06 00:41:11.871731 | orchestrator | Tuesday 06 January 2026 00:41:08 +0000 (0:00:00.141) 0:00:10.858 *******
2026-01-06 00:41:11.871742 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:41:11.871753 | orchestrator |
2026-01-06 00:41:11.871764 | orchestrator | TASK [Set DB devices config data] **********************************************
2026-01-06 00:41:11.871775 | orchestrator | Tuesday 06 January 2026 00:41:08 +0000 (0:00:00.156) 0:00:11.015 *******
2026-01-06 00:41:11.871785 | orchestrator | skipping: [testbed-node-3]
2026-01-06
00:41:11.871796 | orchestrator |
2026-01-06 00:41:11.871807 | orchestrator | TASK [Set WAL devices config data] *********************************************
2026-01-06 00:41:11.871818 | orchestrator | Tuesday 06 January 2026 00:41:08 +0000 (0:00:00.151) 0:00:11.167 *******
2026-01-06 00:41:11.871839 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:41:11.871850 | orchestrator |
2026-01-06 00:41:11.871861 | orchestrator | TASK [Set DB+WAL devices config data] ******************************************
2026-01-06 00:41:11.871872 | orchestrator | Tuesday 06 January 2026 00:41:08 +0000 (0:00:00.142) 0:00:11.309 *******
2026-01-06 00:41:11.871883 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:41:11.871894 | orchestrator |
2026-01-06 00:41:11.871905 | orchestrator | TASK [Print ceph_osd_devices] **************************************************
2026-01-06 00:41:11.871916 | orchestrator | Tuesday 06 January 2026 00:41:08 +0000 (0:00:00.131) 0:00:11.441 *******
2026-01-06 00:41:11.871927 | orchestrator | ok: [testbed-node-3] => {
2026-01-06 00:41:11.871938 | orchestrator |     "ceph_osd_devices": {
2026-01-06 00:41:11.871949 | orchestrator |         "sdb": {
2026-01-06 00:41:11.871961 | orchestrator |             "osd_lvm_uuid": "d44b25a4-5c87-5b50-a8b5-4ed8c19ba382"
2026-01-06 00:41:11.871972 | orchestrator |         },
2026-01-06 00:41:11.871984 | orchestrator |         "sdc": {
2026-01-06 00:41:11.871995 | orchestrator |             "osd_lvm_uuid": "1f440738-8941-5354-ae19-38cd939f8e8b"
2026-01-06 00:41:11.872006 | orchestrator |         }
2026-01-06 00:41:11.872017 | orchestrator |     }
2026-01-06 00:41:11.872028 | orchestrator | }
2026-01-06 00:41:11.872039 | orchestrator |
2026-01-06 00:41:11.872050 | orchestrator | TASK [Print WAL devices] *******************************************************
2026-01-06 00:41:11.872061 | orchestrator | Tuesday 06 January 2026 00:41:09 +0000 (0:00:00.129) 0:00:11.570 *******
2026-01-06 00:41:11.872072 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:41:11.872083 | orchestrator |
2026-01-06 00:41:11.872094 | orchestrator | TASK [Print DB devices] ********************************************************
2026-01-06 00:41:11.872105 | orchestrator | Tuesday 06 January 2026 00:41:09 +0000 (0:00:00.126) 0:00:11.696 *******
2026-01-06 00:41:11.872115 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:41:11.872126 | orchestrator |
2026-01-06 00:41:11.872137 | orchestrator | TASK [Print shared DB/WAL devices] *********************************************
2026-01-06 00:41:11.872148 | orchestrator | Tuesday 06 January 2026 00:41:09 +0000 (0:00:00.115) 0:00:11.812 *******
2026-01-06 00:41:11.872159 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:41:11.872170 | orchestrator |
2026-01-06 00:41:11.872181 | orchestrator | TASK [Print configuration data] ************************************************
2026-01-06 00:41:11.872191 | orchestrator | Tuesday 06 January 2026 00:41:09 +0000 (0:00:00.142) 0:00:11.954 *******
2026-01-06 00:41:11.872202 | orchestrator | changed: [testbed-node-3] => {
2026-01-06 00:41:11.872214 | orchestrator |     "_ceph_configure_lvm_config_data": {
2026-01-06 00:41:11.872224 | orchestrator |         "ceph_osd_devices": {
2026-01-06 00:41:11.872235 | orchestrator |             "sdb": {
2026-01-06 00:41:11.872246 | orchestrator |                 "osd_lvm_uuid": "d44b25a4-5c87-5b50-a8b5-4ed8c19ba382"
2026-01-06 00:41:11.872257 | orchestrator |             },
2026-01-06 00:41:11.872268 | orchestrator |             "sdc": {
2026-01-06 00:41:11.872279 | orchestrator |                 "osd_lvm_uuid": "1f440738-8941-5354-ae19-38cd939f8e8b"
2026-01-06 00:41:11.872290 | orchestrator |             }
2026-01-06 00:41:11.872301 | orchestrator |         },
2026-01-06 00:41:11.872312 | orchestrator |         "lvm_volumes": [
2026-01-06 00:41:11.872323 | orchestrator |             {
2026-01-06 00:41:11.872334 | orchestrator |                 "data": "osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382",
2026-01-06 00:41:11.872345 | orchestrator |                 "data_vg": "ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382"
2026-01-06
00:41:11.872356 | orchestrator |             },
2026-01-06 00:41:11.872366 | orchestrator |             {
2026-01-06 00:41:11.872377 | orchestrator |                 "data": "osd-block-1f440738-8941-5354-ae19-38cd939f8e8b",
2026-01-06 00:41:11.872388 | orchestrator |                 "data_vg": "ceph-1f440738-8941-5354-ae19-38cd939f8e8b"
2026-01-06 00:41:11.872404 | orchestrator |             }
2026-01-06 00:41:11.872415 | orchestrator |         ]
2026-01-06 00:41:11.872426 | orchestrator |     }
2026-01-06 00:41:11.872446 | orchestrator | }
2026-01-06 00:41:11.872458 | orchestrator |
2026-01-06 00:41:11.872469 | orchestrator | RUNNING HANDLER [Write configuration file] *************************************
2026-01-06 00:41:11.872480 | orchestrator | Tuesday 06 January 2026 00:41:09 +0000 (0:00:00.324) 0:00:12.279 *******
2026-01-06 00:41:11.872491 | orchestrator | changed: [testbed-node-3 -> testbed-manager(192.168.16.5)]
2026-01-06 00:41:11.872502 | orchestrator |
2026-01-06 00:41:11.872512 | orchestrator | PLAY [Ceph configure LVM] ******************************************************
2026-01-06 00:41:11.872523 | orchestrator |
2026-01-06 00:41:11.872534 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2026-01-06 00:41:11.872545 | orchestrator | Tuesday 06 January 2026 00:41:11 +0000 (0:00:01.639) 0:00:13.919 *******
2026-01-06 00:41:11.872556 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)]
2026-01-06 00:41:11.872567 | orchestrator |
2026-01-06 00:41:11.872595 | orchestrator | TASK [Get initial list of available block devices] *****************************
2026-01-06 00:41:11.872606 | orchestrator | Tuesday 06 January 2026 00:41:11 +0000 (0:00:00.230) 0:00:14.149 *******
2026-01-06 00:41:11.872617 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:41:11.872628 | orchestrator |
2026-01-06 00:41:11.872646 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.851871 | orchestrator | Tuesday 06 January
2026 00:41:11 +0000 (0:00:00.223) 0:00:14.373 *******
2026-01-06 00:41:19.851968 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0)
2026-01-06 00:41:19.851974 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1)
2026-01-06 00:41:19.851979 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2)
2026-01-06 00:41:19.851983 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3)
2026-01-06 00:41:19.851987 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4)
2026-01-06 00:41:19.851991 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5)
2026-01-06 00:41:19.851995 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6)
2026-01-06 00:41:19.851999 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7)
2026-01-06 00:41:19.852003 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda)
2026-01-06 00:41:19.852007 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb)
2026-01-06 00:41:19.852011 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc)
2026-01-06 00:41:19.852018 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd)
2026-01-06 00:41:19.852022 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0)
2026-01-06 00:41:19.852027 | orchestrator |
2026-01-06 00:41:19.852031 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852035 | orchestrator | Tuesday 06 January 2026 00:41:12 +0000 (0:00:00.349) 0:00:14.722 *******
2026-01-06 00:41:19.852039 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852044 | orchestrator |
2026-01-06 00:41:19.852048 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852052 | orchestrator | Tuesday 06 January 2026 00:41:12 +0000 (0:00:00.187) 0:00:14.910 *******
2026-01-06 00:41:19.852055 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852059 | orchestrator |
2026-01-06 00:41:19.852063 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852067 | orchestrator | Tuesday 06 January 2026 00:41:12 +0000 (0:00:00.189) 0:00:15.099 *******
2026-01-06 00:41:19.852071 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852075 | orchestrator |
2026-01-06 00:41:19.852079 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852108 | orchestrator | Tuesday 06 January 2026 00:41:12 +0000 (0:00:00.197) 0:00:15.297 *******
2026-01-06 00:41:19.852116 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852122 | orchestrator |
2026-01-06 00:41:19.852129 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852135 | orchestrator | Tuesday 06 January 2026 00:41:12 +0000 (0:00:00.183) 0:00:15.480 *******
2026-01-06 00:41:19.852142 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852148 | orchestrator |
2026-01-06 00:41:19.852155 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852161 | orchestrator | Tuesday 06 January 2026 00:41:13 +0000 (0:00:00.503) 0:00:15.984 *******
2026-01-06 00:41:19.852168 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852175 | orchestrator |
2026-01-06 00:41:19.852200 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852207 | orchestrator | Tuesday 06 January 2026 00:41:13 +0000 (0:00:00.184) 0:00:16.168 *******
2026-01-06 00:41:19.852213 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852220 | orchestrator |
2026-01-06 00:41:19.852227 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852234 | orchestrator | Tuesday 06 January 2026 00:41:13 +0000 (0:00:00.183) 0:00:16.352 *******
2026-01-06 00:41:19.852241 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852248 | orchestrator |
2026-01-06 00:41:19.852255 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852262 | orchestrator | Tuesday 06 January 2026 00:41:14 +0000 (0:00:00.183) 0:00:16.535 *******
2026-01-06 00:41:19.852269 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a)
2026-01-06 00:41:19.852277 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a)
2026-01-06 00:41:19.852284 | orchestrator |
2026-01-06 00:41:19.852290 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852298 | orchestrator | Tuesday 06 January 2026 00:41:14 +0000 (0:00:00.433) 0:00:16.969 *******
2026-01-06 00:41:19.852305 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_724a4878-ca4e-4a20-84cd-e8427809d585)
2026-01-06 00:41:19.852312 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_724a4878-ca4e-4a20-84cd-e8427809d585)
2026-01-06 00:41:19.852316 | orchestrator |
2026-01-06 00:41:19.852320 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852324 | orchestrator | Tuesday 06 January 2026 00:41:14 +0000 (0:00:00.460) 0:00:17.430 *******
2026-01-06 00:41:19.852328 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_8cc5ffc1-09fb-4fde-a97f-bcebb46dacb6)
2026-01-06 00:41:19.852332 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_8cc5ffc1-09fb-4fde-a97f-bcebb46dacb6)
2026-01-06 00:41:19.852336 | orchestrator |
2026-01-06 00:41:19.852340 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852356 | orchestrator | Tuesday 06 January 2026 00:41:15 +0000 (0:00:00.453) 0:00:17.883 *******
2026-01-06 00:41:19.852360 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_ea69e1b5-a504-41c3-bb3a-5961a07ea8a6)
2026-01-06 00:41:19.852364 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_ea69e1b5-a504-41c3-bb3a-5961a07ea8a6)
2026-01-06 00:41:19.852368 | orchestrator |
2026-01-06 00:41:19.852372 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:19.852376 | orchestrator | Tuesday 06 January 2026 00:41:15 +0000 (0:00:00.432) 0:00:18.316 *******
2026-01-06 00:41:19.852380 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001)
2026-01-06 00:41:19.852384 | orchestrator |
2026-01-06 00:41:19.852387 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:19.852391 | orchestrator | Tuesday 06 January 2026 00:41:16 +0000 (0:00:00.350) 0:00:18.666 *******
2026-01-06 00:41:19.852401 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0)
2026-01-06 00:41:19.852405 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1)
2026-01-06 00:41:19.852409 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2)
2026-01-06 00:41:19.852413 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3)
2026-01-06
00:41:19.852416 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4)
2026-01-06 00:41:19.852421 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop5)
2026-01-06 00:41:19.852425 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6)
2026-01-06 00:41:19.852429 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7)
2026-01-06 00:41:19.852434 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda)
2026-01-06 00:41:19.852438 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb)
2026-01-06 00:41:19.852443 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc)
2026-01-06 00:41:19.852447 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd)
2026-01-06 00:41:19.852451 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0)
2026-01-06 00:41:19.852456 | orchestrator |
2026-01-06 00:41:19.852460 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:19.852465 | orchestrator | Tuesday 06 January 2026 00:41:16 +0000 (0:00:00.378) 0:00:19.045 *******
2026-01-06 00:41:19.852469 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852473 | orchestrator |
2026-01-06 00:41:19.852478 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:19.852488 | orchestrator | Tuesday 06 January 2026 00:41:17 +0000 (0:00:00.695) 0:00:19.740 *******
2026-01-06 00:41:19.852493 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852497 | orchestrator |
2026-01-06 00:41:19.852502 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:19.852506 | orchestrator | Tuesday 06 January 2026 00:41:17 +0000 (0:00:00.241) 0:00:19.982 *******
2026-01-06 00:41:19.852511 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852515 | orchestrator |
2026-01-06 00:41:19.852520 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:19.852524 | orchestrator | Tuesday 06 January 2026 00:41:17 +0000 (0:00:00.199) 0:00:20.181 *******
2026-01-06 00:41:19.852529 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852533 | orchestrator |
2026-01-06 00:41:19.852537 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:19.852542 | orchestrator | Tuesday 06 January 2026 00:41:17 +0000 (0:00:00.190) 0:00:20.372 *******
2026-01-06 00:41:19.852547 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852551 | orchestrator |
2026-01-06 00:41:19.852555 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:19.852560 | orchestrator | Tuesday 06 January 2026 00:41:18 +0000 (0:00:00.190) 0:00:20.562 *******
2026-01-06 00:41:19.852579 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852584 | orchestrator |
2026-01-06 00:41:19.852588 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:19.852593 | orchestrator | Tuesday 06 January 2026 00:41:18 +0000 (0:00:00.196) 0:00:20.759 *******
2026-01-06 00:41:19.852597 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852602 | orchestrator |
2026-01-06 00:41:19.852606 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:19.852611 | orchestrator | Tuesday 06 January 2026 00:41:18 +0000 (0:00:00.196) 0:00:20.955 *******
2026-01-06 00:41:19.852619 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:19.852623 | orchestrator |
2026-01-06 00:41:19.852627 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:19.852632 | orchestrator | Tuesday 06 January 2026 00:41:18 +0000 (0:00:00.226) 0:00:21.182 *******
2026-01-06 00:41:19.852636 | orchestrator | ok: [testbed-node-4] => (item=sda1)
2026-01-06 00:41:19.852641 | orchestrator | ok: [testbed-node-4] => (item=sda14)
2026-01-06 00:41:19.852646 | orchestrator | ok: [testbed-node-4] => (item=sda15)
2026-01-06 00:41:19.852650 | orchestrator | ok: [testbed-node-4] => (item=sda16)
2026-01-06 00:41:19.852655 | orchestrator |
2026-01-06 00:41:19.852659 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:19.852664 | orchestrator | Tuesday 06 January 2026 00:41:19 +0000 (0:00:00.959) 0:00:22.141 *******
2026-01-06 00:41:19.852668 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.163200 | orchestrator |
2026-01-06 00:41:27.163296 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:27.163308 | orchestrator | Tuesday 06 January 2026 00:41:19 +0000 (0:00:00.213) 0:00:22.355 *******
2026-01-06 00:41:27.163316 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.163324 | orchestrator |
2026-01-06 00:41:27.163331 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:27.163338 | orchestrator | Tuesday 06 January 2026 00:41:20 +0000 (0:00:00.209) 0:00:22.564 *******
2026-01-06 00:41:27.163344 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.163351 | orchestrator |
2026-01-06 00:41:27.163358 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:41:27.163365 | orchestrator | Tuesday 06 January 2026 00:41:20 +0000 (0:00:00.211) 0:00:22.775 *******
2026-01-06
00:41:27.163371 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.163378 | orchestrator |
2026-01-06 00:41:27.163384 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] ***********************************************
2026-01-06 00:41:27.163391 | orchestrator | Tuesday 06 January 2026 00:41:21 +0000 (0:00:00.857) 0:00:23.632 *******
2026-01-06 00:41:27.163397 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': None})
2026-01-06 00:41:27.163404 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': None})
2026-01-06 00:41:27.163410 | orchestrator |
2026-01-06 00:41:27.163416 | orchestrator | TASK [Generate WAL VG names] ***************************************************
2026-01-06 00:41:27.163423 | orchestrator | Tuesday 06 January 2026 00:41:21 +0000 (0:00:00.216) 0:00:23.848 *******
2026-01-06 00:41:27.163429 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.163436 | orchestrator |
2026-01-06 00:41:27.163443 | orchestrator | TASK [Generate DB VG names] ****************************************************
2026-01-06 00:41:27.163449 | orchestrator | Tuesday 06 January 2026 00:41:21 +0000 (0:00:00.145) 0:00:23.994 *******
2026-01-06 00:41:27.163455 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.163462 | orchestrator |
2026-01-06 00:41:27.163481 | orchestrator | TASK [Generate shared DB/WAL VG names] *****************************************
2026-01-06 00:41:27.163495 | orchestrator | Tuesday 06 January 2026 00:41:21 +0000 (0:00:00.176) 0:00:24.171 *******
2026-01-06 00:41:27.163502 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.163508 | orchestrator |
2026-01-06 00:41:27.163515 | orchestrator | TASK [Define lvm_volumes structures] *******************************************
2026-01-06 00:41:27.163521 | orchestrator | Tuesday 06 January 2026 00:41:21 +0000 (0:00:00.161) 0:00:24.332 *******
2026-01-06 00:41:27.163527 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:41:27.163534 | orchestrator |
2026-01-06 00:41:27.163541 | orchestrator | TASK [Generate lvm_volumes structure (block only)] *****************************
2026-01-06 00:41:27.163547 | orchestrator | Tuesday 06 January 2026 00:41:21 +0000 (0:00:00.133) 0:00:24.466 *******
2026-01-06 00:41:27.163597 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '64d6825f-3ec1-5927-8c89-e441ee427e8a'}})
2026-01-06 00:41:27.163607 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e675238b-4f6c-5157-bfd7-95a1b3a689b7'}})
2026-01-06 00:41:27.163635 | orchestrator |
2026-01-06 00:41:27.163642 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] *****************************
2026-01-06 00:41:27.163648 | orchestrator | Tuesday 06 January 2026 00:41:22 +0000 (0:00:00.202) 0:00:24.669 *******
2026-01-06 00:41:27.163657 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '64d6825f-3ec1-5927-8c89-e441ee427e8a'}})
2026-01-06 00:41:27.163690 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e675238b-4f6c-5157-bfd7-95a1b3a689b7'}})
2026-01-06 00:41:27.163703 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.163713 | orchestrator |
2026-01-06 00:41:27.163723 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] ****************************
2026-01-06 00:41:27.163734 | orchestrator | Tuesday 06 January 2026 00:41:22 +0000 (0:00:00.151) 0:00:24.821 *******
2026-01-06 00:41:27.163746 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '64d6825f-3ec1-5927-8c89-e441ee427e8a'}})
2026-01-06 00:41:27.163758 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e675238b-4f6c-5157-bfd7-95a1b3a689b7'}})
2026-01-06 00:41:27.163769 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.163780 |
orchestrator |
2026-01-06 00:41:27.163790 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] ***********************
2026-01-06 00:41:27.163801 | orchestrator | Tuesday 06 January 2026 00:41:22 +0000 (0:00:00.180) 0:00:25.001 *******
2026-01-06 00:41:27.163813 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '64d6825f-3ec1-5927-8c89-e441ee427e8a'}})
2026-01-06 00:41:27.163825 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e675238b-4f6c-5157-bfd7-95a1b3a689b7'}})
2026-01-06 00:41:27.163835 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.163846 | orchestrator |
2026-01-06 00:41:27.163857 | orchestrator | TASK [Compile lvm_volumes] *****************************************************
2026-01-06 00:41:27.163868 | orchestrator | Tuesday 06 January 2026 00:41:22 +0000 (0:00:00.162) 0:00:25.165 *******
2026-01-06 00:41:27.163879 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:41:27.163887 | orchestrator |
2026-01-06 00:41:27.163894 | orchestrator | TASK [Set OSD devices config data] *********************************************
2026-01-06 00:41:27.163900 | orchestrator | Tuesday 06 January 2026 00:41:22 +0000 (0:00:00.155) 0:00:25.328 *******
2026-01-06 00:41:27.163906 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:41:27.163913 | orchestrator |
2026-01-06 00:41:27.163919 | orchestrator | TASK [Set DB devices config data] **********************************************
2026-01-06 00:41:27.163925 | orchestrator | Tuesday 06 January 2026 00:41:22 +0000 (0:00:00.155) 0:00:25.483 *******
2026-01-06 00:41:27.163947 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.163953 | orchestrator |
2026-01-06 00:41:27.163960 | orchestrator | TASK [Set WAL devices config data] *********************************************
2026-01-06 00:41:27.163966 | orchestrator | Tuesday 06 January 2026 00:41:23 +0000 (0:00:00.421) 0:00:25.905 *******
2026-01-06 00:41:27.163972 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.163978 | orchestrator |
2026-01-06 00:41:27.163985 | orchestrator | TASK [Set DB+WAL devices config data] ******************************************
2026-01-06 00:41:27.163991 | orchestrator | Tuesday 06 January 2026 00:41:23 +0000 (0:00:00.127) 0:00:26.032 *******
2026-01-06 00:41:27.163997 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.164003 | orchestrator |
2026-01-06 00:41:27.164009 | orchestrator | TASK [Print ceph_osd_devices] **************************************************
2026-01-06 00:41:27.164015 | orchestrator | Tuesday 06 January 2026 00:41:23 +0000 (0:00:00.129) 0:00:26.161 *******
2026-01-06 00:41:27.164021 | orchestrator | ok: [testbed-node-4] => {
2026-01-06 00:41:27.164027 | orchestrator |     "ceph_osd_devices": {
2026-01-06 00:41:27.164034 | orchestrator |         "sdb": {
2026-01-06 00:41:27.164041 | orchestrator |             "osd_lvm_uuid": "64d6825f-3ec1-5927-8c89-e441ee427e8a"
2026-01-06 00:41:27.164056 | orchestrator |         },
2026-01-06 00:41:27.164063 | orchestrator |         "sdc": {
2026-01-06 00:41:27.164069 | orchestrator |             "osd_lvm_uuid": "e675238b-4f6c-5157-bfd7-95a1b3a689b7"
2026-01-06 00:41:27.164075 | orchestrator |         }
2026-01-06 00:41:27.164082 | orchestrator |     }
2026-01-06 00:41:27.164088 | orchestrator | }
2026-01-06 00:41:27.164094 | orchestrator |
2026-01-06 00:41:27.164101 | orchestrator | TASK [Print WAL devices] *******************************************************
2026-01-06 00:41:27.164107 | orchestrator | Tuesday 06 January 2026 00:41:23 +0000 (0:00:00.136) 0:00:26.298 *******
2026-01-06 00:41:27.164113 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.164119 | orchestrator |
2026-01-06 00:41:27.164125 | orchestrator | TASK [Print DB devices] ********************************************************
2026-01-06 00:41:27.164131 | orchestrator | Tuesday 06 January 2026 00:41:23 +0000 (0:00:00.144) 0:00:26.443 *******
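Annotation: comparing the `ceph_osd_devices` printed above with the `lvm_volumes` lists printed for each node, the "block only" structure appears to be pure name-mangling on `osd_lvm_uuid`. A minimal sketch of that derivation; the naming rule is inferred from the log output, not taken from the playbook source:

```python
def lvm_volumes_block_only(ceph_osd_devices: dict) -> list[dict]:
    """Derive the 'block only' lvm_volumes entries from ceph_osd_devices.

    Judging by the log, each OSD's data LV is named 'osd-block-<uuid>'
    and its VG 'ceph-<uuid>', where <uuid> is the device's osd_lvm_uuid.
    """
    return [
        {
            "data": f"osd-block-{v['osd_lvm_uuid']}",
            "data_vg": f"ceph-{v['osd_lvm_uuid']}",
        }
        for v in ceph_osd_devices.values()
    ]
```

Feeding in testbed-node-3's devices reproduces exactly the `lvm_volumes` list shown in its "Print configuration data" output.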
2026-01-06 00:41:27.164137 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.164143 | orchestrator |
2026-01-06 00:41:27.164149 | orchestrator | TASK [Print shared DB/WAL devices] *********************************************
2026-01-06 00:41:27.164155 | orchestrator | Tuesday 06 January 2026 00:41:24 +0000 (0:00:00.130) 0:00:26.574 *******
2026-01-06 00:41:27.164162 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:41:27.164168 | orchestrator |
2026-01-06 00:41:27.164174 | orchestrator | TASK [Print configuration data] ************************************************
2026-01-06 00:41:27.164180 | orchestrator | Tuesday 06 January 2026 00:41:24 +0000 (0:00:00.148) 0:00:26.722 *******
2026-01-06 00:41:27.164186 | orchestrator | changed: [testbed-node-4] => {
2026-01-06 00:41:27.164192 | orchestrator |     "_ceph_configure_lvm_config_data": {
2026-01-06 00:41:27.164200 | orchestrator |         "ceph_osd_devices": {
2026-01-06 00:41:27.164211 | orchestrator |             "sdb": {
2026-01-06 00:41:27.164221 | orchestrator |                 "osd_lvm_uuid": "64d6825f-3ec1-5927-8c89-e441ee427e8a"
2026-01-06 00:41:27.164231 | orchestrator |             },
2026-01-06 00:41:27.164242 | orchestrator |             "sdc": {
2026-01-06 00:41:27.164253 | orchestrator |                 "osd_lvm_uuid": "e675238b-4f6c-5157-bfd7-95a1b3a689b7"
2026-01-06 00:41:27.164262 | orchestrator |             }
2026-01-06 00:41:27.164272 | orchestrator |         },
2026-01-06 00:41:27.164282 | orchestrator |         "lvm_volumes": [
2026-01-06 00:41:27.164292 | orchestrator |             {
2026-01-06 00:41:27.164302 | orchestrator |                 "data": "osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a",
2026-01-06 00:41:27.164312 | orchestrator |                 "data_vg": "ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a"
2026-01-06 00:41:27.164318 | orchestrator |             },
2026-01-06 00:41:27.164324 | orchestrator |             {
2026-01-06 00:41:27.164330 | orchestrator |                 "data": "osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7",
2026-01-06 00:41:27.164337 | orchestrator |                 "data_vg": "ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7"
2026-01-06 00:41:27.164343 | orchestrator |             }
2026-01-06 00:41:27.164349 | orchestrator |         ]
2026-01-06 00:41:27.164355 | orchestrator |     }
2026-01-06 00:41:27.164361 | orchestrator | }
2026-01-06 00:41:27.164367 | orchestrator |
2026-01-06 00:41:27.164373 | orchestrator | RUNNING HANDLER [Write configuration file] *************************************
2026-01-06 00:41:27.164380 | orchestrator | Tuesday 06 January 2026 00:41:24 +0000 (0:00:00.210) 0:00:26.933 *******
2026-01-06 00:41:27.164386 | orchestrator | changed: [testbed-node-4 -> testbed-manager(192.168.16.5)]
2026-01-06 00:41:27.164392 | orchestrator |
2026-01-06 00:41:27.164398 | orchestrator | PLAY [Ceph configure LVM] ******************************************************
2026-01-06 00:41:27.164404 | orchestrator |
2026-01-06 00:41:27.164410 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2026-01-06 00:41:27.164416 | orchestrator | Tuesday 06 January 2026 00:41:25 +0000 (0:00:01.169) 0:00:28.102 *******
2026-01-06 00:41:27.164422 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)]
2026-01-06 00:41:27.164429 | orchestrator |
2026-01-06 00:41:27.164435 | orchestrator | TASK [Get initial list of available block devices] *****************************
2026-01-06 00:41:27.164452 | orchestrator | Tuesday 06 January 2026 00:41:26 +0000 (0:00:00.836) 0:00:28.939 *******
2026-01-06 00:41:27.164459 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:41:27.164465 | orchestrator |
2026-01-06 00:41:27.164471 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:41:27.164477 | orchestrator | Tuesday 06 January 2026 00:41:26 +0000 (0:00:00.298) 0:00:29.237 *******
2026-01-06 00:41:27.164483 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0)
2026-01-06 00:41:27.164489 | orchestrator | included: /ansible/tasks/_add-device-links.yml for
testbed-node-5 => (item=loop1) 2026-01-06 00:41:27.164496 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2) 2026-01-06 00:41:27.164502 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3) 2026-01-06 00:41:27.164508 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop4) 2026-01-06 00:41:27.164519 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop5) 2026-01-06 00:41:34.585324 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6) 2026-01-06 00:41:34.585421 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7) 2026-01-06 00:41:34.585432 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda) 2026-01-06 00:41:34.585439 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb) 2026-01-06 00:41:34.585445 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc) 2026-01-06 00:41:34.585452 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd) 2026-01-06 00:41:34.585458 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0) 2026-01-06 00:41:34.585465 | orchestrator | 2026-01-06 00:41:34.585473 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585480 | orchestrator | Tuesday 06 January 2026 00:41:27 +0000 (0:00:00.422) 0:00:29.660 ******* 2026-01-06 00:41:34.585487 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.585494 | orchestrator | 2026-01-06 00:41:34.585500 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585506 | orchestrator | Tuesday 06 January 
2026 00:41:27 +0000 (0:00:00.219) 0:00:29.880 ******* 2026-01-06 00:41:34.585512 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.585519 | orchestrator | 2026-01-06 00:41:34.585525 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585531 | orchestrator | Tuesday 06 January 2026 00:41:27 +0000 (0:00:00.219) 0:00:30.100 ******* 2026-01-06 00:41:34.585537 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.585544 | orchestrator | 2026-01-06 00:41:34.585577 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585587 | orchestrator | Tuesday 06 January 2026 00:41:27 +0000 (0:00:00.214) 0:00:30.314 ******* 2026-01-06 00:41:34.585597 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.585608 | orchestrator | 2026-01-06 00:41:34.585618 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585628 | orchestrator | Tuesday 06 January 2026 00:41:28 +0000 (0:00:00.224) 0:00:30.539 ******* 2026-01-06 00:41:34.585638 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.585650 | orchestrator | 2026-01-06 00:41:34.585660 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585671 | orchestrator | Tuesday 06 January 2026 00:41:28 +0000 (0:00:00.252) 0:00:30.791 ******* 2026-01-06 00:41:34.585678 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.585685 | orchestrator | 2026-01-06 00:41:34.585691 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585726 | orchestrator | Tuesday 06 January 2026 00:41:28 +0000 (0:00:00.189) 0:00:30.981 ******* 2026-01-06 00:41:34.585733 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.585739 | orchestrator | 2026-01-06 00:41:34.585745 | orchestrator | 
TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585752 | orchestrator | Tuesday 06 January 2026 00:41:28 +0000 (0:00:00.173) 0:00:31.154 ******* 2026-01-06 00:41:34.585758 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.585764 | orchestrator | 2026-01-06 00:41:34.585771 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585777 | orchestrator | Tuesday 06 January 2026 00:41:28 +0000 (0:00:00.187) 0:00:31.341 ******* 2026-01-06 00:41:34.585783 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22) 2026-01-06 00:41:34.585791 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22) 2026-01-06 00:41:34.585797 | orchestrator | 2026-01-06 00:41:34.585803 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585809 | orchestrator | Tuesday 06 January 2026 00:41:29 +0000 (0:00:00.687) 0:00:32.029 ******* 2026-01-06 00:41:34.585816 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_a9899c49-22e0-485a-be63-69bc9e218eb5) 2026-01-06 00:41:34.585822 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_a9899c49-22e0-485a-be63-69bc9e218eb5) 2026-01-06 00:41:34.585828 | orchestrator | 2026-01-06 00:41:34.585834 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585840 | orchestrator | Tuesday 06 January 2026 00:41:29 +0000 (0:00:00.408) 0:00:32.437 ******* 2026-01-06 00:41:34.585848 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_2e071fd2-3317-4a54-af1f-e9b7971267a3) 2026-01-06 00:41:34.585855 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_2e071fd2-3317-4a54-af1f-e9b7971267a3) 2026-01-06 00:41:34.585863 | orchestrator | 
2026-01-06 00:41:34.585870 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585877 | orchestrator | Tuesday 06 January 2026 00:41:30 +0000 (0:00:00.401) 0:00:32.838 ******* 2026-01-06 00:41:34.585884 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59) 2026-01-06 00:41:34.585891 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59) 2026-01-06 00:41:34.585898 | orchestrator | 2026-01-06 00:41:34.585906 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:41:34.585913 | orchestrator | Tuesday 06 January 2026 00:41:30 +0000 (0:00:00.385) 0:00:33.223 ******* 2026-01-06 00:41:34.585921 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001) 2026-01-06 00:41:34.585931 | orchestrator | 2026-01-06 00:41:34.585942 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.585972 | orchestrator | Tuesday 06 January 2026 00:41:31 +0000 (0:00:00.304) 0:00:33.528 ******* 2026-01-06 00:41:34.585983 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0) 2026-01-06 00:41:34.585993 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1) 2026-01-06 00:41:34.586004 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2) 2026-01-06 00:41:34.586065 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3) 2026-01-06 00:41:34.586073 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4) 2026-01-06 00:41:34.586096 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5) 2026-01-06 
00:41:34.586104 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6) 2026-01-06 00:41:34.586110 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7) 2026-01-06 00:41:34.586125 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda) 2026-01-06 00:41:34.586131 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb) 2026-01-06 00:41:34.586137 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc) 2026-01-06 00:41:34.586143 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd) 2026-01-06 00:41:34.586149 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0) 2026-01-06 00:41:34.586156 | orchestrator | 2026-01-06 00:41:34.586162 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586168 | orchestrator | Tuesday 06 January 2026 00:41:31 +0000 (0:00:00.349) 0:00:33.877 ******* 2026-01-06 00:41:34.586174 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.586180 | orchestrator | 2026-01-06 00:41:34.586186 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586193 | orchestrator | Tuesday 06 January 2026 00:41:31 +0000 (0:00:00.197) 0:00:34.075 ******* 2026-01-06 00:41:34.586199 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.586205 | orchestrator | 2026-01-06 00:41:34.586211 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586221 | orchestrator | Tuesday 06 January 2026 00:41:31 +0000 (0:00:00.219) 0:00:34.294 ******* 2026-01-06 00:41:34.586227 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.586234 | 
orchestrator | 2026-01-06 00:41:34.586240 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586248 | orchestrator | Tuesday 06 January 2026 00:41:31 +0000 (0:00:00.175) 0:00:34.470 ******* 2026-01-06 00:41:34.586258 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.586271 | orchestrator | 2026-01-06 00:41:34.586286 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586296 | orchestrator | Tuesday 06 January 2026 00:41:32 +0000 (0:00:00.194) 0:00:34.664 ******* 2026-01-06 00:41:34.586306 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.586316 | orchestrator | 2026-01-06 00:41:34.586326 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586336 | orchestrator | Tuesday 06 January 2026 00:41:32 +0000 (0:00:00.214) 0:00:34.879 ******* 2026-01-06 00:41:34.586346 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.586355 | orchestrator | 2026-01-06 00:41:34.586365 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586375 | orchestrator | Tuesday 06 January 2026 00:41:32 +0000 (0:00:00.501) 0:00:35.381 ******* 2026-01-06 00:41:34.586385 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.586396 | orchestrator | 2026-01-06 00:41:34.586406 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586417 | orchestrator | Tuesday 06 January 2026 00:41:33 +0000 (0:00:00.184) 0:00:35.565 ******* 2026-01-06 00:41:34.586427 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.586437 | orchestrator | 2026-01-06 00:41:34.586448 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586458 | orchestrator | Tuesday 06 January 2026 
00:41:33 +0000 (0:00:00.204) 0:00:35.770 ******* 2026-01-06 00:41:34.586469 | orchestrator | ok: [testbed-node-5] => (item=sda1) 2026-01-06 00:41:34.586479 | orchestrator | ok: [testbed-node-5] => (item=sda14) 2026-01-06 00:41:34.586489 | orchestrator | ok: [testbed-node-5] => (item=sda15) 2026-01-06 00:41:34.586500 | orchestrator | ok: [testbed-node-5] => (item=sda16) 2026-01-06 00:41:34.586510 | orchestrator | 2026-01-06 00:41:34.586521 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586531 | orchestrator | Tuesday 06 January 2026 00:41:33 +0000 (0:00:00.668) 0:00:36.438 ******* 2026-01-06 00:41:34.586542 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.586608 | orchestrator | 2026-01-06 00:41:34.586619 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586629 | orchestrator | Tuesday 06 January 2026 00:41:34 +0000 (0:00:00.203) 0:00:36.642 ******* 2026-01-06 00:41:34.586639 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.586649 | orchestrator | 2026-01-06 00:41:34.586660 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586671 | orchestrator | Tuesday 06 January 2026 00:41:34 +0000 (0:00:00.166) 0:00:36.809 ******* 2026-01-06 00:41:34.586681 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.586691 | orchestrator | 2026-01-06 00:41:34.586702 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:41:34.586712 | orchestrator | Tuesday 06 January 2026 00:41:34 +0000 (0:00:00.141) 0:00:36.950 ******* 2026-01-06 00:41:34.586723 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:34.586733 | orchestrator | 2026-01-06 00:41:34.586754 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2026-01-06 00:41:38.104614 | 
orchestrator | Tuesday 06 January 2026 00:41:34 +0000 (0:00:00.139) 0:00:37.090 ******* 2026-01-06 00:41:38.104750 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': None}) 2026-01-06 00:41:38.104774 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': None}) 2026-01-06 00:41:38.104794 | orchestrator | 2026-01-06 00:41:38.104814 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2026-01-06 00:41:38.104834 | orchestrator | Tuesday 06 January 2026 00:41:34 +0000 (0:00:00.123) 0:00:37.213 ******* 2026-01-06 00:41:38.104853 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:38.104873 | orchestrator | 2026-01-06 00:41:38.104891 | orchestrator | TASK [Generate DB VG names] **************************************************** 2026-01-06 00:41:38.104910 | orchestrator | Tuesday 06 January 2026 00:41:34 +0000 (0:00:00.091) 0:00:37.305 ******* 2026-01-06 00:41:38.104929 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:38.104947 | orchestrator | 2026-01-06 00:41:38.104966 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2026-01-06 00:41:38.104985 | orchestrator | Tuesday 06 January 2026 00:41:34 +0000 (0:00:00.090) 0:00:37.396 ******* 2026-01-06 00:41:38.105004 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:38.105023 | orchestrator | 2026-01-06 00:41:38.105044 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2026-01-06 00:41:38.105064 | orchestrator | Tuesday 06 January 2026 00:41:35 +0000 (0:00:00.297) 0:00:37.694 ******* 2026-01-06 00:41:38.105083 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:41:38.105101 | orchestrator | 2026-01-06 00:41:38.105122 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2026-01-06 00:41:38.105141 | orchestrator | Tuesday 06 January 2026 00:41:35 +0000 (0:00:00.285) 0:00:37.979 
******* 2026-01-06 00:41:38.105161 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '0ba15c51-2e8d-5c95-884b-d45401cb60d9'}}) 2026-01-06 00:41:38.105179 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '588df21e-a0c0-57e7-8c43-2f77be274309'}}) 2026-01-06 00:41:38.105198 | orchestrator | 2026-01-06 00:41:38.105216 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2026-01-06 00:41:38.105234 | orchestrator | Tuesday 06 January 2026 00:41:35 +0000 (0:00:00.131) 0:00:38.111 ******* 2026-01-06 00:41:38.105252 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '0ba15c51-2e8d-5c95-884b-d45401cb60d9'}})  2026-01-06 00:41:38.105269 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '588df21e-a0c0-57e7-8c43-2f77be274309'}})  2026-01-06 00:41:38.105285 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:38.105303 | orchestrator | 2026-01-06 00:41:38.105321 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2026-01-06 00:41:38.105339 | orchestrator | Tuesday 06 January 2026 00:41:35 +0000 (0:00:00.114) 0:00:38.225 ******* 2026-01-06 00:41:38.105396 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '0ba15c51-2e8d-5c95-884b-d45401cb60d9'}})  2026-01-06 00:41:38.105416 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '588df21e-a0c0-57e7-8c43-2f77be274309'}})  2026-01-06 00:41:38.105436 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:38.105458 | orchestrator | 2026-01-06 00:41:38.105475 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2026-01-06 00:41:38.105492 | orchestrator | Tuesday 06 January 2026 00:41:35 +0000 (0:00:00.139) 0:00:38.364 ******* 2026-01-06 
00:41:38.105536 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '0ba15c51-2e8d-5c95-884b-d45401cb60d9'}})  2026-01-06 00:41:38.105584 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '588df21e-a0c0-57e7-8c43-2f77be274309'}})  2026-01-06 00:41:38.105601 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:38.105618 | orchestrator | 2026-01-06 00:41:38.105635 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2026-01-06 00:41:38.105651 | orchestrator | Tuesday 06 January 2026 00:41:35 +0000 (0:00:00.131) 0:00:38.496 ******* 2026-01-06 00:41:38.105668 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:41:38.105684 | orchestrator | 2026-01-06 00:41:38.105701 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2026-01-06 00:41:38.105717 | orchestrator | Tuesday 06 January 2026 00:41:36 +0000 (0:00:00.101) 0:00:38.597 ******* 2026-01-06 00:41:38.105733 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:41:38.105751 | orchestrator | 2026-01-06 00:41:38.105769 | orchestrator | TASK [Set DB devices config data] ********************************************** 2026-01-06 00:41:38.105787 | orchestrator | Tuesday 06 January 2026 00:41:36 +0000 (0:00:00.102) 0:00:38.700 ******* 2026-01-06 00:41:38.105806 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:38.105824 | orchestrator | 2026-01-06 00:41:38.105842 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2026-01-06 00:41:38.105859 | orchestrator | Tuesday 06 January 2026 00:41:36 +0000 (0:00:00.095) 0:00:38.795 ******* 2026-01-06 00:41:38.105877 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:38.105895 | orchestrator | 2026-01-06 00:41:38.105913 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2026-01-06 00:41:38.105929 
| orchestrator | Tuesday 06 January 2026 00:41:36 +0000 (0:00:00.087) 0:00:38.882 ******* 2026-01-06 00:41:38.105945 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:38.105961 | orchestrator | 2026-01-06 00:41:38.105978 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2026-01-06 00:41:38.105995 | orchestrator | Tuesday 06 January 2026 00:41:36 +0000 (0:00:00.094) 0:00:38.977 ******* 2026-01-06 00:41:38.106010 | orchestrator | ok: [testbed-node-5] => { 2026-01-06 00:41:38.106106 | orchestrator |  "ceph_osd_devices": { 2026-01-06 00:41:38.106124 | orchestrator |  "sdb": { 2026-01-06 00:41:38.106171 | orchestrator |  "osd_lvm_uuid": "0ba15c51-2e8d-5c95-884b-d45401cb60d9" 2026-01-06 00:41:38.106190 | orchestrator |  }, 2026-01-06 00:41:38.106209 | orchestrator |  "sdc": { 2026-01-06 00:41:38.106227 | orchestrator |  "osd_lvm_uuid": "588df21e-a0c0-57e7-8c43-2f77be274309" 2026-01-06 00:41:38.106243 | orchestrator |  } 2026-01-06 00:41:38.106259 | orchestrator |  } 2026-01-06 00:41:38.106276 | orchestrator | } 2026-01-06 00:41:38.106294 | orchestrator | 2026-01-06 00:41:38.106311 | orchestrator | TASK [Print WAL devices] ******************************************************* 2026-01-06 00:41:38.106329 | orchestrator | Tuesday 06 January 2026 00:41:36 +0000 (0:00:00.101) 0:00:39.079 ******* 2026-01-06 00:41:38.106347 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:38.106366 | orchestrator | 2026-01-06 00:41:38.106385 | orchestrator | TASK [Print DB devices] ******************************************************** 2026-01-06 00:41:38.106404 | orchestrator | Tuesday 06 January 2026 00:41:36 +0000 (0:00:00.092) 0:00:39.171 ******* 2026-01-06 00:41:38.106441 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:38.106460 | orchestrator | 2026-01-06 00:41:38.106479 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2026-01-06 00:41:38.106498 | 
orchestrator | Tuesday 06 January 2026 00:41:36 +0000 (0:00:00.244) 0:00:39.416 ******* 2026-01-06 00:41:38.106517 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:41:38.106536 | orchestrator | 2026-01-06 00:41:38.106616 | orchestrator | TASK [Print configuration data] ************************************************ 2026-01-06 00:41:38.106635 | orchestrator | Tuesday 06 January 2026 00:41:37 +0000 (0:00:00.111) 0:00:39.527 ******* 2026-01-06 00:41:38.106651 | orchestrator | changed: [testbed-node-5] => { 2026-01-06 00:41:38.106666 | orchestrator |  "_ceph_configure_lvm_config_data": { 2026-01-06 00:41:38.106683 | orchestrator |  "ceph_osd_devices": { 2026-01-06 00:41:38.106699 | orchestrator |  "sdb": { 2026-01-06 00:41:38.106716 | orchestrator |  "osd_lvm_uuid": "0ba15c51-2e8d-5c95-884b-d45401cb60d9" 2026-01-06 00:41:38.106733 | orchestrator |  }, 2026-01-06 00:41:38.106751 | orchestrator |  "sdc": { 2026-01-06 00:41:38.106768 | orchestrator |  "osd_lvm_uuid": "588df21e-a0c0-57e7-8c43-2f77be274309" 2026-01-06 00:41:38.106784 | orchestrator |  } 2026-01-06 00:41:38.106799 | orchestrator |  }, 2026-01-06 00:41:38.106815 | orchestrator |  "lvm_volumes": [ 2026-01-06 00:41:38.106832 | orchestrator |  { 2026-01-06 00:41:38.106849 | orchestrator |  "data": "osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9", 2026-01-06 00:41:38.106866 | orchestrator |  "data_vg": "ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9" 2026-01-06 00:41:38.106882 | orchestrator |  }, 2026-01-06 00:41:38.106898 | orchestrator |  { 2026-01-06 00:41:38.106910 | orchestrator |  "data": "osd-block-588df21e-a0c0-57e7-8c43-2f77be274309", 2026-01-06 00:41:38.106920 | orchestrator |  "data_vg": "ceph-588df21e-a0c0-57e7-8c43-2f77be274309" 2026-01-06 00:41:38.106930 | orchestrator |  } 2026-01-06 00:41:38.106944 | orchestrator |  ] 2026-01-06 00:41:38.106953 | orchestrator |  } 2026-01-06 00:41:38.106963 | orchestrator | } 2026-01-06 00:41:38.106973 | orchestrator | 2026-01-06 00:41:38.106983 | 
orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2026-01-06 00:41:38.106992 | orchestrator | Tuesday 06 January 2026 00:41:37 +0000 (0:00:00.155) 0:00:39.683 ******* 2026-01-06 00:41:38.107002 | orchestrator | changed: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2026-01-06 00:41:38.107012 | orchestrator | 2026-01-06 00:41:38.107021 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:41:38.107032 | orchestrator | testbed-node-3 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2026-01-06 00:41:38.107042 | orchestrator | testbed-node-4 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2026-01-06 00:41:38.107052 | orchestrator | testbed-node-5 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2026-01-06 00:41:38.107081 | orchestrator | 2026-01-06 00:41:38.107091 | orchestrator | 2026-01-06 00:41:38.107101 | orchestrator | 2026-01-06 00:41:38.107110 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:41:38.107120 | orchestrator | Tuesday 06 January 2026 00:41:38 +0000 (0:00:00.900) 0:00:40.583 ******* 2026-01-06 00:41:38.107130 | orchestrator | =============================================================================== 2026-01-06 00:41:38.107139 | orchestrator | Write configuration file ------------------------------------------------ 3.71s 2026-01-06 00:41:38.107149 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 1.31s 2026-01-06 00:41:38.107158 | orchestrator | Add known links to the list of available block devices ------------------ 1.21s 2026-01-06 00:41:38.107168 | orchestrator | Add known partitions to the list of available block devices ------------- 1.09s 2026-01-06 00:41:38.107187 | orchestrator | Add known partitions to the list of available block devices ------------- 
1.07s 2026-01-06 00:41:38.107197 | orchestrator | Add known partitions to the list of available block devices ------------- 0.96s 2026-01-06 00:41:38.107207 | orchestrator | Add known partitions to the list of available block devices ------------- 0.86s 2026-01-06 00:41:38.107216 | orchestrator | Get initial list of available block devices ----------------------------- 0.72s 2026-01-06 00:41:38.107226 | orchestrator | Add known links to the list of available block devices ------------------ 0.70s 2026-01-06 00:41:38.107235 | orchestrator | Add known partitions to the list of available block devices ------------- 0.70s 2026-01-06 00:41:38.107245 | orchestrator | Print configuration data ------------------------------------------------ 0.69s 2026-01-06 00:41:38.107255 | orchestrator | Generate lvm_volumes structure (block + wal) ---------------------------- 0.69s 2026-01-06 00:41:38.107265 | orchestrator | Add known links to the list of available block devices ------------------ 0.69s 2026-01-06 00:41:38.107285 | orchestrator | Set DB devices config data ---------------------------------------------- 0.67s 2026-01-06 00:41:38.330634 | orchestrator | Add known partitions to the list of available block devices ------------- 0.67s 2026-01-06 00:41:38.330714 | orchestrator | Generate shared DB/WAL VG names ----------------------------------------- 0.59s 2026-01-06 00:41:38.330860 | orchestrator | Define lvm_volumes structures ------------------------------------------- 0.58s 2026-01-06 00:41:38.330866 | orchestrator | Add known links to the list of available block devices ------------------ 0.56s 2026-01-06 00:41:38.330870 | orchestrator | Set UUIDs for OSD VGs/LVs ----------------------------------------------- 0.55s 2026-01-06 00:41:38.330875 | orchestrator | Add known links to the list of available block devices ------------------ 0.53s 2026-01-06 00:42:00.957380 | orchestrator | 2026-01-06 00:42:00 | INFO  | Task fe9bc3e5-2992-4191-9657-ec72c18dbd6b (sync inventory) 
is running in background. Output coming soon. 2026-01-06 00:42:27.834789 | orchestrator | 2026-01-06 00:42:02 | INFO  | Starting group_vars file reorganization 2026-01-06 00:42:27.834889 | orchestrator | 2026-01-06 00:42:02 | INFO  | Moved 0 file(s) to their respective directories 2026-01-06 00:42:27.834901 | orchestrator | 2026-01-06 00:42:02 | INFO  | Group_vars file reorganization completed 2026-01-06 00:42:27.834908 | orchestrator | 2026-01-06 00:42:05 | INFO  | Starting variable preparation from inventory 2026-01-06 00:42:27.834915 | orchestrator | 2026-01-06 00:42:08 | INFO  | Writing 050-kolla-ceph-rgw-hosts.yml with ceph_rgw_hosts 2026-01-06 00:42:27.834922 | orchestrator | 2026-01-06 00:42:08 | INFO  | Writing 050-infrastructure-cephclient-mons.yml with cephclient_mons 2026-01-06 00:42:27.834949 | orchestrator | 2026-01-06 00:42:08 | INFO  | Writing 050-ceph-cluster-fsid.yml with ceph_cluster_fsid 2026-01-06 00:42:27.834957 | orchestrator | 2026-01-06 00:42:08 | INFO  | 3 file(s) written, 6 host(s) processed 2026-01-06 00:42:27.834964 | orchestrator | 2026-01-06 00:42:08 | INFO  | Variable preparation completed 2026-01-06 00:42:27.834970 | orchestrator | 2026-01-06 00:42:09 | INFO  | Starting inventory overwrite handling 2026-01-06 00:42:27.834981 | orchestrator | 2026-01-06 00:42:09 | INFO  | Handling group overwrites in 99-overwrite 2026-01-06 00:42:27.834987 | orchestrator | 2026-01-06 00:42:09 | INFO  | Removing group frr:children from 60-generic 2026-01-06 00:42:27.834994 | orchestrator | 2026-01-06 00:42:09 | INFO  | Removing group netbird:children from 50-infrastructure 2026-01-06 00:42:27.835001 | orchestrator | 2026-01-06 00:42:09 | INFO  | Removing group ceph-mds from 50-ceph 2026-01-06 00:42:27.835007 | orchestrator | 2026-01-06 00:42:09 | INFO  | Removing group ceph-rgw from 50-ceph 2026-01-06 00:42:27.835014 | orchestrator | 2026-01-06 00:42:09 | INFO  | Handling group overwrites in 20-roles 2026-01-06 00:42:27.835043 | orchestrator | 
2026-01-06 00:42:09 | INFO  | Removing group k3s_node from 50-infrastructure 2026-01-06 00:42:27.835050 | orchestrator | 2026-01-06 00:42:09 | INFO  | Removed 5 group(s) in total 2026-01-06 00:42:27.835056 | orchestrator | 2026-01-06 00:42:09 | INFO  | Inventory overwrite handling completed 2026-01-06 00:42:27.835063 | orchestrator | 2026-01-06 00:42:10 | INFO  | Starting merge of inventory files 2026-01-06 00:42:27.835069 | orchestrator | 2026-01-06 00:42:10 | INFO  | Inventory files merged successfully 2026-01-06 00:42:27.835075 | orchestrator | 2026-01-06 00:42:15 | INFO  | Generating ClusterShell configuration from Ansible inventory 2026-01-06 00:42:27.835082 | orchestrator | 2026-01-06 00:42:26 | INFO  | Successfully wrote ClusterShell configuration 2026-01-06 00:42:27.835088 | orchestrator | [master f0e3a1f] 2026-01-06-00-42 2026-01-06 00:42:27.835096 | orchestrator | 1 file changed, 30 insertions(+), 9 deletions(-) 2026-01-06 00:42:30.651410 | orchestrator | 2026-01-06 00:42:30 | INFO  | Task 04398ef0-6ce9-4bf5-853d-276cee3b4774 (ceph-create-lvm-devices) was prepared for execution. 2026-01-06 00:42:30.651591 | orchestrator | 2026-01-06 00:42:30 | INFO  | It takes a moment until task 04398ef0-6ce9-4bf5-853d-276cee3b4774 (ceph-create-lvm-devices) has been started and output is visible here. 
2026-01-06 00:42:41.872371 | orchestrator | [WARNING]: Collection community.general does not support Ansible version
2026-01-06 00:42:41.872562 | orchestrator | 2.16.14
2026-01-06 00:42:41.872586 | orchestrator |
2026-01-06 00:42:41.872602 | orchestrator | PLAY [Ceph create LVM devices] *************************************************
2026-01-06 00:42:41.872618 | orchestrator |
2026-01-06 00:42:41.872632 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2026-01-06 00:42:41.872647 | orchestrator | Tuesday 06 January 2026 00:42:34 +0000 (0:00:00.264) 0:00:00.264 *******
2026-01-06 00:42:41.872662 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)]
2026-01-06 00:42:41.872677 | orchestrator |
2026-01-06 00:42:41.872692 | orchestrator | TASK [Get initial list of available block devices] *****************************
2026-01-06 00:42:41.872707 | orchestrator | Tuesday 06 January 2026 00:42:34 +0000 (0:00:00.205) 0:00:00.470 *******
2026-01-06 00:42:41.872723 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:42:41.872739 | orchestrator |
2026-01-06 00:42:41.872757 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.872773 | orchestrator | Tuesday 06 January 2026 00:42:35 +0000 (0:00:00.194) 0:00:00.664 *******
2026-01-06 00:42:41.872788 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0)
2026-01-06 00:42:41.872803 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1)
2026-01-06 00:42:41.872818 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2)
2026-01-06 00:42:41.872832 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3)
2026-01-06 00:42:41.872847 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4)
2026-01-06 00:42:41.872861 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5)
2026-01-06 00:42:41.872876 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6)
2026-01-06 00:42:41.872891 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7)
2026-01-06 00:42:41.872906 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda)
2026-01-06 00:42:41.872918 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb)
2026-01-06 00:42:41.872931 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc)
2026-01-06 00:42:41.872944 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd)
2026-01-06 00:42:41.872982 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0)
2026-01-06 00:42:41.872994 | orchestrator |
2026-01-06 00:42:41.873005 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873017 | orchestrator | Tuesday 06 January 2026 00:42:35 +0000 (0:00:00.455) 0:00:01.119 *******
2026-01-06 00:42:41.873028 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873041 | orchestrator |
2026-01-06 00:42:41.873053 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873066 | orchestrator | Tuesday 06 January 2026 00:42:35 +0000 (0:00:00.189) 0:00:01.309 *******
2026-01-06 00:42:41.873077 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873089 | orchestrator |
2026-01-06 00:42:41.873102 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873114 | orchestrator | Tuesday 06 January 2026 00:42:35 +0000 (0:00:00.185) 0:00:01.494 *******
2026-01-06 00:42:41.873126 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873139 | orchestrator |
2026-01-06 00:42:41.873151 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873163 | orchestrator | Tuesday 06 January 2026 00:42:36 +0000 (0:00:00.190) 0:00:01.685 *******
2026-01-06 00:42:41.873176 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873188 | orchestrator |
2026-01-06 00:42:41.873201 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873213 | orchestrator | Tuesday 06 January 2026 00:42:36 +0000 (0:00:00.179) 0:00:01.865 *******
2026-01-06 00:42:41.873225 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873238 | orchestrator |
2026-01-06 00:42:41.873249 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873261 | orchestrator | Tuesday 06 January 2026 00:42:36 +0000 (0:00:00.187) 0:00:02.053 *******
2026-01-06 00:42:41.873273 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873285 | orchestrator |
2026-01-06 00:42:41.873297 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873310 | orchestrator | Tuesday 06 January 2026 00:42:36 +0000 (0:00:00.197) 0:00:02.250 *******
2026-01-06 00:42:41.873322 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873335 | orchestrator |
2026-01-06 00:42:41.873347 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873359 | orchestrator | Tuesday 06 January 2026 00:42:36 +0000 (0:00:00.198) 0:00:02.449 *******
2026-01-06 00:42:41.873371 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873383 | orchestrator |
2026-01-06 00:42:41.873394 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873407 | orchestrator | Tuesday 06 January 2026 00:42:36 +0000 (0:00:00.190) 0:00:02.639 *******
2026-01-06 00:42:41.873419 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907)
2026-01-06 00:42:41.873429 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907)
2026-01-06 00:42:41.873436 | orchestrator |
2026-01-06 00:42:41.873443 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873468 | orchestrator | Tuesday 06 January 2026 00:42:37 +0000 (0:00:00.428) 0:00:03.068 *******
2026-01-06 00:42:41.873475 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_dc9d4d24-a01d-4baf-85b5-da8c88609604)
2026-01-06 00:42:41.873482 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_dc9d4d24-a01d-4baf-85b5-da8c88609604)
2026-01-06 00:42:41.873534 | orchestrator |
2026-01-06 00:42:41.873541 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873548 | orchestrator | Tuesday 06 January 2026 00:42:38 +0000 (0:00:00.665) 0:00:03.733 *******
2026-01-06 00:42:41.873555 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_3d039a44-dced-4ba6-a79b-af7290a238ac)
2026-01-06 00:42:41.873570 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_3d039a44-dced-4ba6-a79b-af7290a238ac)
2026-01-06 00:42:41.873577 | orchestrator |
2026-01-06 00:42:41.873584 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873591 | orchestrator | Tuesday 06 January 2026 00:42:38 +0000 (0:00:00.727) 0:00:04.461 *******
2026-01-06 00:42:41.873598 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_d326b17f-2106-48eb-aaa2-fe8346fab088)
2026-01-06 00:42:41.873604 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_d326b17f-2106-48eb-aaa2-fe8346fab088)
2026-01-06 00:42:41.873611 | orchestrator |
2026-01-06 00:42:41.873618 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:42:41.873624 | orchestrator | Tuesday 06 January 2026 00:42:39 +0000 (0:00:00.865) 0:00:05.326 *******
2026-01-06 00:42:41.873631 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001)
2026-01-06 00:42:41.873638 | orchestrator |
2026-01-06 00:42:41.873645 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:41.873651 | orchestrator | Tuesday 06 January 2026 00:42:40 +0000 (0:00:00.323) 0:00:05.649 *******
2026-01-06 00:42:41.873658 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0)
2026-01-06 00:42:41.873664 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1)
2026-01-06 00:42:41.873671 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2)
2026-01-06 00:42:41.873697 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3)
2026-01-06 00:42:41.873704 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4)
2026-01-06 00:42:41.873710 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5)
2026-01-06 00:42:41.873717 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6)
2026-01-06 00:42:41.873723 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7)
2026-01-06 00:42:41.873730 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda)
2026-01-06 00:42:41.873736 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb)
2026-01-06 00:42:41.873743 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc)
2026-01-06 00:42:41.873753 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdd)
2026-01-06 00:42:41.873760 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0)
2026-01-06 00:42:41.873767 | orchestrator |
2026-01-06 00:42:41.873774 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:41.873780 | orchestrator | Tuesday 06 January 2026 00:42:40 +0000 (0:00:00.403) 0:00:06.052 *******
2026-01-06 00:42:41.873787 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873794 | orchestrator |
2026-01-06 00:42:41.873800 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:41.873807 | orchestrator | Tuesday 06 January 2026 00:42:40 +0000 (0:00:00.232) 0:00:06.285 *******
2026-01-06 00:42:41.873814 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873820 | orchestrator |
2026-01-06 00:42:41.873827 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:41.873834 | orchestrator | Tuesday 06 January 2026 00:42:40 +0000 (0:00:00.180) 0:00:06.466 *******
2026-01-06 00:42:41.873840 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873847 | orchestrator |
2026-01-06 00:42:41.873853 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:41.873860 | orchestrator | Tuesday 06 January 2026 00:42:41 +0000 (0:00:00.201) 0:00:06.667 *******
2026-01-06 00:42:41.873872 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873878 | orchestrator |
2026-01-06 00:42:41.873885 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:41.873891 | orchestrator | Tuesday 06 January 2026 00:42:41 +0000 (0:00:00.221) 0:00:06.889 *******
2026-01-06 00:42:41.873898 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873905 | orchestrator |
2026-01-06 00:42:41.873912 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:41.873918 | orchestrator | Tuesday 06 January 2026 00:42:41 +0000 (0:00:00.202) 0:00:07.091 *******
2026-01-06 00:42:41.873925 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873932 | orchestrator |
2026-01-06 00:42:41.873938 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:41.873945 | orchestrator | Tuesday 06 January 2026 00:42:41 +0000 (0:00:00.202) 0:00:07.294 *******
2026-01-06 00:42:41.873952 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:41.873958 | orchestrator |
2026-01-06 00:42:41.873976 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:50.616413 | orchestrator | Tuesday 06 January 2026 00:42:41 +0000 (0:00:00.203) 0:00:07.498 *******
2026-01-06 00:42:50.616591 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.616607 | orchestrator |
2026-01-06 00:42:50.616617 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:50.616627 | orchestrator | Tuesday 06 January 2026 00:42:42 +0000 (0:00:00.207) 0:00:07.705 *******
2026-01-06 00:42:50.616636 | orchestrator | ok: [testbed-node-3] => (item=sda1)
2026-01-06 00:42:50.616645 | orchestrator | ok: [testbed-node-3] => (item=sda14)
2026-01-06 00:42:50.616655 | orchestrator | ok: [testbed-node-3] => (item=sda15)
2026-01-06 00:42:50.616664 | orchestrator | ok: [testbed-node-3] => (item=sda16)
2026-01-06 00:42:50.616672 | orchestrator |
2026-01-06 00:42:50.616682 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:50.616691 | orchestrator | Tuesday 06 January 2026 00:42:43 +0000 (0:00:01.112) 0:00:08.818 *******
2026-01-06 00:42:50.616700 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.616709 | orchestrator |
2026-01-06 00:42:50.616718 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:50.616727 | orchestrator | Tuesday 06 January 2026 00:42:43 +0000 (0:00:00.219) 0:00:09.037 *******
2026-01-06 00:42:50.616746 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.616756 | orchestrator |
2026-01-06 00:42:50.616765 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:50.616775 | orchestrator | Tuesday 06 January 2026 00:42:43 +0000 (0:00:00.226) 0:00:09.264 *******
2026-01-06 00:42:50.616784 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.616793 | orchestrator |
2026-01-06 00:42:50.616802 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:42:50.616811 | orchestrator | Tuesday 06 January 2026 00:42:43 +0000 (0:00:00.241) 0:00:09.505 *******
2026-01-06 00:42:50.616820 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.616828 | orchestrator |
2026-01-06 00:42:50.616837 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] *******************
2026-01-06 00:42:50.616846 | orchestrator | Tuesday 06 January 2026 00:42:44 +0000 (0:00:00.229) 0:00:09.734 *******
2026-01-06 00:42:50.616855 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.616863 | orchestrator |
2026-01-06 00:42:50.616872 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] *******************
2026-01-06 00:42:50.616881 | orchestrator | Tuesday 06 January 2026 00:42:44 +0000 (0:00:00.141) 0:00:09.876 *******
2026-01-06 00:42:50.616891 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'd44b25a4-5c87-5b50-a8b5-4ed8c19ba382'}})
2026-01-06 00:42:50.616900 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '1f440738-8941-5354-ae19-38cd939f8e8b'}})
2026-01-06 00:42:50.616909 | orchestrator |
2026-01-06 00:42:50.616918 | orchestrator | TASK [Create block VGs] ********************************************************
2026-01-06 00:42:50.616954 | orchestrator | Tuesday 06 January 2026 00:42:44 +0000 (0:00:00.218) 0:00:10.095 *******
2026-01-06 00:42:50.616966 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:50.616978 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:50.616988 | orchestrator |
2026-01-06 00:42:50.616999 | orchestrator | TASK [Print 'Create block VGs'] ************************************************
2026-01-06 00:42:50.617009 | orchestrator | Tuesday 06 January 2026 00:42:46 +0000 (0:00:02.065) 0:00:12.160 *******
2026-01-06 00:42:50.617020 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:50.617032 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:50.617043 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.617053 | orchestrator |
2026-01-06 00:42:50.617064 | orchestrator | TASK [Create block LVs] ********************************************************
2026-01-06 00:42:50.617074 | orchestrator | Tuesday 06 January 2026 00:42:46 +0000 (0:00:00.162) 0:00:12.322 *******
2026-01-06 00:42:50.617085 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:50.617096 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:50.617106 | orchestrator |
2026-01-06 00:42:50.617117 | orchestrator | TASK [Print 'Create block LVs'] ************************************************
2026-01-06 00:42:50.617127 | orchestrator | Tuesday 06 January 2026 00:42:48 +0000 (0:00:01.571) 0:00:13.894 *******
2026-01-06 00:42:50.617137 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:50.617148 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:50.617158 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.617168 | orchestrator |
2026-01-06 00:42:50.617178 | orchestrator | TASK [Create DB VGs] ***********************************************************
2026-01-06 00:42:50.617190 | orchestrator | Tuesday 06 January 2026 00:42:48 +0000 (0:00:00.215) 0:00:14.109 *******
2026-01-06 00:42:50.617219 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.617229 | orchestrator |
2026-01-06 00:42:50.617240 | orchestrator | TASK [Print 'Create DB VGs'] ***************************************************
2026-01-06 00:42:50.617250 | orchestrator | Tuesday 06 January 2026 00:42:48 +0000 (0:00:00.168) 0:00:14.277 *******
2026-01-06 00:42:50.617260 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:50.617271 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:50.617282 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.617292 | orchestrator |
2026-01-06 00:42:50.617301 | orchestrator | TASK [Create WAL VGs] **********************************************************
2026-01-06 00:42:50.617309 | orchestrator | Tuesday 06 January 2026 00:42:49 +0000 (0:00:00.475) 0:00:14.753 *******
2026-01-06 00:42:50.617318 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.617327 | orchestrator |
2026-01-06 00:42:50.617335 | orchestrator | TASK [Print 'Create WAL VGs'] **************************************************
2026-01-06 00:42:50.617344 | orchestrator | Tuesday 06 January 2026 00:42:49 +0000 (0:00:00.145) 0:00:14.898 *******
2026-01-06 00:42:50.617360 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:50.617370 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:50.617378 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.617387 | orchestrator |
2026-01-06 00:42:50.617396 | orchestrator | TASK [Create DB+WAL VGs] *******************************************************
2026-01-06 00:42:50.617405 | orchestrator | Tuesday 06 January 2026 00:42:49 +0000 (0:00:00.170) 0:00:15.069 *******
2026-01-06 00:42:50.617414 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.617423 | orchestrator |
2026-01-06 00:42:50.617431 | orchestrator | TASK [Print 'Create DB+WAL VGs'] ***********************************************
2026-01-06 00:42:50.617440 | orchestrator | Tuesday 06 January 2026 00:42:49 +0000 (0:00:00.167) 0:00:15.237 *******
2026-01-06 00:42:50.617449 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:50.617458 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:50.617466 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.617475 | orchestrator |
2026-01-06 00:42:50.617512 | orchestrator | TASK [Prepare variables for OSD count check] ***********************************
2026-01-06 00:42:50.617521 | orchestrator | Tuesday 06 January 2026 00:42:49 +0000 (0:00:00.196) 0:00:15.433 *******
2026-01-06 00:42:50.617530 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:42:50.617539 | orchestrator |
2026-01-06 00:42:50.617547 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] ****************
2026-01-06 00:42:50.617576 | orchestrator | Tuesday 06 January 2026 00:42:49 +0000 (0:00:00.171) 0:00:15.605 *******
2026-01-06 00:42:50.617590 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:50.617599 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:50.617608 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.617617 | orchestrator |
2026-01-06 00:42:50.617625 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] ***************
2026-01-06 00:42:50.617634 | orchestrator | Tuesday 06 January 2026 00:42:50 +0000 (0:00:00.151) 0:00:15.756 *******
2026-01-06 00:42:50.617643 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:50.617651 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:50.617660 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.617668 | orchestrator |
2026-01-06 00:42:50.617677 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************
2026-01-06 00:42:50.617686 | orchestrator | Tuesday 06 January 2026 00:42:50 +0000 (0:00:00.187) 0:00:15.944 *******
2026-01-06 00:42:50.617695 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:50.617703 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:50.617712 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.617721 | orchestrator |
2026-01-06 00:42:50.617729 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] *********************
2026-01-06 00:42:50.617745 | orchestrator | Tuesday 06 January 2026 00:42:50 +0000 (0:00:00.166) 0:00:16.110 *******
2026-01-06 00:42:50.617753 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:50.617762 | orchestrator |
2026-01-06 00:42:50.617771 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ********************
2026-01-06 00:42:50.617785 | orchestrator | Tuesday 06 January 2026 00:42:50 +0000 (0:00:00.140) 0:00:16.250 *******
2026-01-06 00:42:58.116124 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.116226 | orchestrator |
2026-01-06 00:42:58.116236 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] *****************
2026-01-06 00:42:58.116243 | orchestrator | Tuesday 06 January 2026 00:42:50 +0000 (0:00:00.161) 0:00:16.411 *******
2026-01-06 00:42:58.116249 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.116254 | orchestrator |
2026-01-06 00:42:58.116260 | orchestrator | TASK [Print number of OSDs wanted per DB VG] ***********************************
2026-01-06 00:42:58.116266 | orchestrator | Tuesday 06 January 2026 00:42:50 +0000 (0:00:00.145) 0:00:16.557 *******
2026-01-06 00:42:58.116271 | orchestrator | ok: [testbed-node-3] => {
2026-01-06 00:42:58.116280 | orchestrator |     "_num_osds_wanted_per_db_vg": {}
2026-01-06 00:42:58.116289 | orchestrator | }
2026-01-06 00:42:58.116298 | orchestrator |
2026-01-06 00:42:58.116306 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] **********************************
2026-01-06 00:42:58.116316 | orchestrator | Tuesday 06 January 2026 00:42:51 +0000 (0:00:00.373) 0:00:16.930 *******
2026-01-06 00:42:58.116322 | orchestrator | ok: [testbed-node-3] => {
2026-01-06 00:42:58.116329 | orchestrator |     "_num_osds_wanted_per_wal_vg": {}
2026-01-06 00:42:58.116338 | orchestrator | }
2026-01-06 00:42:58.116348 | orchestrator |
2026-01-06 00:42:58.116357 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] *******************************
2026-01-06 00:42:58.116366 | orchestrator | Tuesday 06 January 2026 00:42:51 +0000 (0:00:00.152) 0:00:17.083 *******
2026-01-06 00:42:58.116376 | orchestrator | ok: [testbed-node-3] => {
2026-01-06 00:42:58.116384 | orchestrator |     "_num_osds_wanted_per_db_wal_vg": {}
2026-01-06 00:42:58.116394 | orchestrator | }
2026-01-06 00:42:58.116403 | orchestrator |
2026-01-06 00:42:58.116411 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ********************
2026-01-06 00:42:58.116420 | orchestrator | Tuesday 06 January 2026 00:42:51 +0000 (0:00:00.219) 0:00:17.303 *******
2026-01-06 00:42:58.116428 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:42:58.116433 | orchestrator |
2026-01-06 00:42:58.116438 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] *******************
2026-01-06 00:42:58.116444 | orchestrator | Tuesday 06 January 2026 00:42:52 +0000 (0:00:00.709) 0:00:18.012 *******
2026-01-06 00:42:58.116449 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:42:58.116455 | orchestrator |
2026-01-06 00:42:58.116464 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] ****************
2026-01-06 00:42:58.116535 | orchestrator | Tuesday 06 January 2026 00:42:52 +0000 (0:00:00.566) 0:00:18.579 *******
2026-01-06 00:42:58.116547 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:42:58.116557 | orchestrator |
2026-01-06 00:42:58.116567 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] *************************
2026-01-06 00:42:58.116576 | orchestrator | Tuesday 06 January 2026 00:42:53 +0000 (0:00:00.541) 0:00:19.121 *******
2026-01-06 00:42:58.116587 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:42:58.116597 | orchestrator |
2026-01-06 00:42:58.116607 | orchestrator | TASK [Calculate VG sizes (without buffer)] *************************************
2026-01-06 00:42:58.116617 | orchestrator | Tuesday 06 January 2026 00:42:53 +0000 (0:00:00.183) 0:00:19.304 *******
2026-01-06 00:42:58.116627 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.116635 | orchestrator |
2026-01-06 00:42:58.116644 | orchestrator | TASK [Calculate VG sizes (with buffer)] ****************************************
2026-01-06 00:42:58.116654 | orchestrator | Tuesday 06 January 2026 00:42:53 +0000 (0:00:00.113) 0:00:19.418 *******
2026-01-06 00:42:58.116663 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.116673 | orchestrator |
2026-01-06 00:42:58.116684 | orchestrator | TASK [Print LVM VGs report data] ***********************************************
2026-01-06 00:42:58.116736 | orchestrator | Tuesday 06 January 2026 00:42:53 +0000 (0:00:00.127) 0:00:19.545 *******
2026-01-06 00:42:58.116747 | orchestrator | ok: [testbed-node-3] => {
2026-01-06 00:42:58.116758 | orchestrator |     "vgs_report": {
2026-01-06 00:42:58.116769 | orchestrator |         "vg": []
2026-01-06 00:42:58.116780 | orchestrator |     }
2026-01-06 00:42:58.116789 | orchestrator | }
2026-01-06 00:42:58.116798 | orchestrator |
2026-01-06 00:42:58.116807 | orchestrator | TASK [Print LVM VG sizes] ******************************************************
2026-01-06 00:42:58.116816 | orchestrator | Tuesday 06 January 2026 00:42:54 +0000 (0:00:00.171) 0:00:19.717 *******
2026-01-06 00:42:58.116825 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.116834 | orchestrator |
2026-01-06 00:42:58.116840 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************
2026-01-06 00:42:58.116855 | orchestrator | Tuesday 06 January 2026 00:42:54 +0000 (0:00:00.148) 0:00:19.866 *******
2026-01-06 00:42:58.116860 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.116865 | orchestrator |
2026-01-06 00:42:58.116870 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] ****************************
2026-01-06 00:42:58.116875 | orchestrator | Tuesday 06 January 2026 00:42:54 +0000 (0:00:00.160) 0:00:20.026 *******
2026-01-06 00:42:58.116880 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.116885 | orchestrator |
2026-01-06 00:42:58.116897 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] *******************
2026-01-06 00:42:58.116902 | orchestrator | Tuesday 06 January 2026 00:42:54 +0000 (0:00:00.607) 0:00:20.634 *******
2026-01-06 00:42:58.116907 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.116912 | orchestrator |
2026-01-06 00:42:58.116918 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] ***********************
2026-01-06 00:42:58.116923 | orchestrator | Tuesday 06 January 2026 00:42:55 +0000 (0:00:00.174) 0:00:20.808 *******
2026-01-06 00:42:58.116928 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.116933 | orchestrator |
2026-01-06 00:42:58.116938 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] ***************************
2026-01-06 00:42:58.116943 | orchestrator | Tuesday 06 January 2026 00:42:55 +0000 (0:00:00.184) 0:00:20.993 *******
2026-01-06 00:42:58.116948 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.116953 | orchestrator |
2026-01-06 00:42:58.116958 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] *****************
2026-01-06 00:42:58.116963 | orchestrator | Tuesday 06 January 2026 00:42:55 +0000 (0:00:00.194) 0:00:21.188 *******
2026-01-06 00:42:58.116968 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.116973 | orchestrator |
2026-01-06 00:42:58.116978 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] ****************
2026-01-06 00:42:58.116983 | orchestrator | Tuesday 06 January 2026 00:42:55 +0000 (0:00:00.136) 0:00:21.324 *******
2026-01-06 00:42:58.117004 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.117010 | orchestrator |
2026-01-06 00:42:58.117015 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ********************
2026-01-06 00:42:58.117020 | orchestrator | Tuesday 06 January 2026 00:42:55 +0000 (0:00:00.149) 0:00:21.474 *******
2026-01-06 00:42:58.117026 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.117035 | orchestrator |
2026-01-06 00:42:58.117044 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] *****************
2026-01-06 00:42:58.117053 | orchestrator | Tuesday 06 January 2026 00:42:55 +0000 (0:00:00.150) 0:00:21.624 *******
2026-01-06 00:42:58.117061 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.117070 | orchestrator |
2026-01-06 00:42:58.117078 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] *********************
2026-01-06 00:42:58.117088 | orchestrator | Tuesday 06 January 2026 00:42:56 +0000 (0:00:00.159) 0:00:21.783 *******
2026-01-06 00:42:58.117097 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.117106 | orchestrator |
2026-01-06 00:42:58.117115 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] ***********
2026-01-06 00:42:58.117123 | orchestrator | Tuesday 06 January 2026 00:42:56 +0000 (0:00:00.154) 0:00:21.937 *******
2026-01-06 00:42:58.117142 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.117152 | orchestrator |
2026-01-06 00:42:58.117162 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] *************************
2026-01-06 00:42:58.117172 | orchestrator | Tuesday 06 January 2026 00:42:56 +0000 (0:00:00.140) 0:00:22.078 *******
2026-01-06 00:42:58.117183 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.117196 | orchestrator |
2026-01-06 00:42:58.117210 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] *********************
2026-01-06 00:42:58.117223 | orchestrator | Tuesday 06 January 2026 00:42:56 +0000 (0:00:00.132) 0:00:22.211 *******
2026-01-06 00:42:58.117232 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.117241 | orchestrator |
2026-01-06 00:42:58.117254 | orchestrator | TASK [Create DB LVs for ceph_db_devices] ***************************************
2026-01-06 00:42:58.117267 | orchestrator | Tuesday 06 January 2026 00:42:56 +0000 (0:00:00.151) 0:00:22.362 *******
2026-01-06 00:42:58.117279 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:58.117290 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:58.117299 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.117309 | orchestrator |
2026-01-06 00:42:58.117318 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] *******************************
2026-01-06 00:42:58.117327 | orchestrator | Tuesday 06 January 2026 00:42:57 +0000 (0:00:00.499) 0:00:22.862 *******
2026-01-06 00:42:58.117337 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:58.117347 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:58.117356 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.117365 | orchestrator |
2026-01-06 00:42:58.117375 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] *************************************
2026-01-06 00:42:58.117385 | orchestrator | Tuesday 06 January 2026 00:42:57 +0000 (0:00:00.167) 0:00:23.030 *******
2026-01-06 00:42:58.117395 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:58.117405 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:58.117415 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.117425 | orchestrator |
2026-01-06 00:42:58.117435 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] *****************************
2026-01-06 00:42:58.117445 | orchestrator | Tuesday 06 January 2026 00:42:57 +0000 (0:00:00.195) 0:00:23.225 *******
2026-01-06 00:42:58.117455 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:58.117465 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:58.117523 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.117533 | orchestrator |
2026-01-06 00:42:58.117544 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] **********************************
2026-01-06 00:42:58.117554 | orchestrator | Tuesday 06 January 2026 00:42:57 +0000 (0:00:00.155) 0:00:23.381 *******
2026-01-06 00:42:58.117564 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:42:58.117575 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:42:58.117592 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:42:58.117601 | orchestrator |
2026-01-06 00:42:58.117611 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] **************************
2026-01-06 00:42:58.117630 | orchestrator | Tuesday 06 January 2026 00:42:57 +0000 (0:00:00.171) 0:00:23.553 *******
2026-01-06 00:42:58.117648 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})
2026-01-06 00:43:03.769332 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})
2026-01-06 00:43:03.769452 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:43:03.769532 | orchestrator |
2026-01-06 00:43:03.769548 | orchestrator | TASK [Create DB LVs for
ceph_db_wal_devices] *********************************** 2026-01-06 00:43:03.769561 | orchestrator | Tuesday 06 January 2026 00:42:58 +0000 (0:00:00.192) 0:00:23.745 ******* 2026-01-06 00:43:03.769573 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})  2026-01-06 00:43:03.769585 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})  2026-01-06 00:43:03.769596 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:43:03.769607 | orchestrator | 2026-01-06 00:43:03.769618 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2026-01-06 00:43:03.769629 | orchestrator | Tuesday 06 January 2026 00:42:58 +0000 (0:00:00.203) 0:00:23.949 ******* 2026-01-06 00:43:03.769641 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})  2026-01-06 00:43:03.769652 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})  2026-01-06 00:43:03.769663 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:43:03.769674 | orchestrator | 2026-01-06 00:43:03.769685 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2026-01-06 00:43:03.769696 | orchestrator | Tuesday 06 January 2026 00:42:58 +0000 (0:00:00.159) 0:00:24.108 ******* 2026-01-06 00:43:03.769707 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:43:03.769719 | orchestrator | 2026-01-06 00:43:03.769730 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2026-01-06 00:43:03.769741 | orchestrator | Tuesday 06 January 2026 00:42:59 +0000 
(0:00:00.565) 0:00:24.674 ******* 2026-01-06 00:43:03.769752 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:43:03.769763 | orchestrator | 2026-01-06 00:43:03.769774 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2026-01-06 00:43:03.769785 | orchestrator | Tuesday 06 January 2026 00:42:59 +0000 (0:00:00.549) 0:00:25.223 ******* 2026-01-06 00:43:03.769796 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:43:03.769806 | orchestrator | 2026-01-06 00:43:03.769831 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2026-01-06 00:43:03.769845 | orchestrator | Tuesday 06 January 2026 00:42:59 +0000 (0:00:00.155) 0:00:25.379 ******* 2026-01-06 00:43:03.769871 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'vg_name': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'}) 2026-01-06 00:43:03.769908 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'vg_name': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'}) 2026-01-06 00:43:03.769921 | orchestrator | 2026-01-06 00:43:03.769935 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2026-01-06 00:43:03.769948 | orchestrator | Tuesday 06 January 2026 00:42:59 +0000 (0:00:00.193) 0:00:25.572 ******* 2026-01-06 00:43:03.769990 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})  2026-01-06 00:43:03.770010 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})  2026-01-06 00:43:03.770098 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:43:03.770119 | orchestrator | 2026-01-06 00:43:03.770138 | orchestrator | TASK [Fail if DB LV defined in 
lvm_volumes is missing] ************************* 2026-01-06 00:43:03.770159 | orchestrator | Tuesday 06 January 2026 00:43:00 +0000 (0:00:00.465) 0:00:26.038 ******* 2026-01-06 00:43:03.770179 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})  2026-01-06 00:43:03.770198 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})  2026-01-06 00:43:03.770210 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:43:03.770222 | orchestrator | 2026-01-06 00:43:03.770233 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2026-01-06 00:43:03.770244 | orchestrator | Tuesday 06 January 2026 00:43:00 +0000 (0:00:00.159) 0:00:26.197 ******* 2026-01-06 00:43:03.770256 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'})  2026-01-06 00:43:03.770267 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'})  2026-01-06 00:43:03.770277 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:43:03.770289 | orchestrator | 2026-01-06 00:43:03.770299 | orchestrator | TASK [Print LVM report data] *************************************************** 2026-01-06 00:43:03.770311 | orchestrator | Tuesday 06 January 2026 00:43:00 +0000 (0:00:00.167) 0:00:26.365 ******* 2026-01-06 00:43:03.770343 | orchestrator | ok: [testbed-node-3] => { 2026-01-06 00:43:03.770355 | orchestrator |  "lvm_report": { 2026-01-06 00:43:03.770367 | orchestrator |  "lv": [ 2026-01-06 00:43:03.770378 | orchestrator |  { 2026-01-06 00:43:03.770389 | orchestrator |  "lv_name": 
"osd-block-1f440738-8941-5354-ae19-38cd939f8e8b", 2026-01-06 00:43:03.770401 | orchestrator |  "vg_name": "ceph-1f440738-8941-5354-ae19-38cd939f8e8b" 2026-01-06 00:43:03.770412 | orchestrator |  }, 2026-01-06 00:43:03.770423 | orchestrator |  { 2026-01-06 00:43:03.770434 | orchestrator |  "lv_name": "osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382", 2026-01-06 00:43:03.770445 | orchestrator |  "vg_name": "ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382" 2026-01-06 00:43:03.770456 | orchestrator |  } 2026-01-06 00:43:03.770485 | orchestrator |  ], 2026-01-06 00:43:03.770497 | orchestrator |  "pv": [ 2026-01-06 00:43:03.770508 | orchestrator |  { 2026-01-06 00:43:03.770519 | orchestrator |  "pv_name": "/dev/sdb", 2026-01-06 00:43:03.770530 | orchestrator |  "vg_name": "ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382" 2026-01-06 00:43:03.770540 | orchestrator |  }, 2026-01-06 00:43:03.770551 | orchestrator |  { 2026-01-06 00:43:03.770562 | orchestrator |  "pv_name": "/dev/sdc", 2026-01-06 00:43:03.770573 | orchestrator |  "vg_name": "ceph-1f440738-8941-5354-ae19-38cd939f8e8b" 2026-01-06 00:43:03.770583 | orchestrator |  } 2026-01-06 00:43:03.770594 | orchestrator |  ] 2026-01-06 00:43:03.770605 | orchestrator |  } 2026-01-06 00:43:03.770617 | orchestrator | } 2026-01-06 00:43:03.770628 | orchestrator | 2026-01-06 00:43:03.770639 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2026-01-06 00:43:03.770650 | orchestrator | 2026-01-06 00:43:03.770661 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2026-01-06 00:43:03.770683 | orchestrator | Tuesday 06 January 2026 00:43:01 +0000 (0:00:00.295) 0:00:26.660 ******* 2026-01-06 00:43:03.770694 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2026-01-06 00:43:03.770705 | orchestrator | 2026-01-06 00:43:03.770716 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2026-01-06 
00:43:03.770727 | orchestrator | Tuesday 06 January 2026 00:43:01 +0000 (0:00:00.263) 0:00:26.923 ******* 2026-01-06 00:43:03.770738 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:43:03.770749 | orchestrator | 2026-01-06 00:43:03.770760 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:03.770771 | orchestrator | Tuesday 06 January 2026 00:43:01 +0000 (0:00:00.237) 0:00:27.161 ******* 2026-01-06 00:43:03.770782 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0) 2026-01-06 00:43:03.770792 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1) 2026-01-06 00:43:03.770803 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2) 2026-01-06 00:43:03.770814 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3) 2026-01-06 00:43:03.770825 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4) 2026-01-06 00:43:03.770836 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5) 2026-01-06 00:43:03.770856 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6) 2026-01-06 00:43:03.770867 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7) 2026-01-06 00:43:03.770878 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda) 2026-01-06 00:43:03.770889 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb) 2026-01-06 00:43:03.770900 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc) 2026-01-06 00:43:03.770910 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd) 2026-01-06 00:43:03.770921 | 
orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0) 2026-01-06 00:43:03.770932 | orchestrator | 2026-01-06 00:43:03.770943 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:03.770954 | orchestrator | Tuesday 06 January 2026 00:43:01 +0000 (0:00:00.427) 0:00:27.589 ******* 2026-01-06 00:43:03.770965 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:03.770984 | orchestrator | 2026-01-06 00:43:03.771034 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:03.771055 | orchestrator | Tuesday 06 January 2026 00:43:02 +0000 (0:00:00.199) 0:00:27.788 ******* 2026-01-06 00:43:03.771073 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:03.771088 | orchestrator | 2026-01-06 00:43:03.771099 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:03.771110 | orchestrator | Tuesday 06 January 2026 00:43:02 +0000 (0:00:00.222) 0:00:28.011 ******* 2026-01-06 00:43:03.771121 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:03.771142 | orchestrator | 2026-01-06 00:43:03.771154 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:03.771164 | orchestrator | Tuesday 06 January 2026 00:43:03 +0000 (0:00:00.712) 0:00:28.723 ******* 2026-01-06 00:43:03.771175 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:03.771186 | orchestrator | 2026-01-06 00:43:03.771197 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:03.771208 | orchestrator | Tuesday 06 January 2026 00:43:03 +0000 (0:00:00.234) 0:00:28.958 ******* 2026-01-06 00:43:03.771219 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:03.771229 | orchestrator | 2026-01-06 00:43:03.771240 | orchestrator | TASK [Add known links to the 
list of available block devices] ****************** 2026-01-06 00:43:03.771260 | orchestrator | Tuesday 06 January 2026 00:43:03 +0000 (0:00:00.220) 0:00:29.178 ******* 2026-01-06 00:43:03.771271 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:03.771281 | orchestrator | 2026-01-06 00:43:03.771302 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:15.270951 | orchestrator | Tuesday 06 January 2026 00:43:03 +0000 (0:00:00.227) 0:00:29.406 ******* 2026-01-06 00:43:15.271051 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.271063 | orchestrator | 2026-01-06 00:43:15.271071 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:15.271078 | orchestrator | Tuesday 06 January 2026 00:43:03 +0000 (0:00:00.235) 0:00:29.642 ******* 2026-01-06 00:43:15.271085 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.271091 | orchestrator | 2026-01-06 00:43:15.271098 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:15.271105 | orchestrator | Tuesday 06 January 2026 00:43:04 +0000 (0:00:00.200) 0:00:29.842 ******* 2026-01-06 00:43:15.271111 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a) 2026-01-06 00:43:15.271119 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a) 2026-01-06 00:43:15.271126 | orchestrator | 2026-01-06 00:43:15.271132 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:15.271139 | orchestrator | Tuesday 06 January 2026 00:43:04 +0000 (0:00:00.426) 0:00:30.269 ******* 2026-01-06 00:43:15.271145 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_724a4878-ca4e-4a20-84cd-e8427809d585) 2026-01-06 00:43:15.271151 | orchestrator | ok: 
[testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_724a4878-ca4e-4a20-84cd-e8427809d585) 2026-01-06 00:43:15.271157 | orchestrator | 2026-01-06 00:43:15.271164 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:15.271170 | orchestrator | Tuesday 06 January 2026 00:43:05 +0000 (0:00:00.430) 0:00:30.700 ******* 2026-01-06 00:43:15.271176 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_8cc5ffc1-09fb-4fde-a97f-bcebb46dacb6) 2026-01-06 00:43:15.271183 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_8cc5ffc1-09fb-4fde-a97f-bcebb46dacb6) 2026-01-06 00:43:15.271189 | orchestrator | 2026-01-06 00:43:15.271195 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:15.271201 | orchestrator | Tuesday 06 January 2026 00:43:05 +0000 (0:00:00.454) 0:00:31.155 ******* 2026-01-06 00:43:15.271208 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_ea69e1b5-a504-41c3-bb3a-5961a07ea8a6) 2026-01-06 00:43:15.271214 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_ea69e1b5-a504-41c3-bb3a-5961a07ea8a6) 2026-01-06 00:43:15.271220 | orchestrator | 2026-01-06 00:43:15.271227 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:15.271233 | orchestrator | Tuesday 06 January 2026 00:43:06 +0000 (0:00:00.665) 0:00:31.820 ******* 2026-01-06 00:43:15.271239 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001) 2026-01-06 00:43:15.271245 | orchestrator | 2026-01-06 00:43:15.271252 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.271258 | orchestrator | Tuesday 06 January 2026 00:43:06 +0000 (0:00:00.585) 0:00:32.406 ******* 2026-01-06 00:43:15.271264 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => 
(item=loop0) 2026-01-06 00:43:15.271272 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1) 2026-01-06 00:43:15.271278 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2) 2026-01-06 00:43:15.271284 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3) 2026-01-06 00:43:15.271291 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4) 2026-01-06 00:43:15.271345 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop5) 2026-01-06 00:43:15.271358 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6) 2026-01-06 00:43:15.271369 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7) 2026-01-06 00:43:15.271380 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda) 2026-01-06 00:43:15.271389 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb) 2026-01-06 00:43:15.271399 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc) 2026-01-06 00:43:15.271410 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd) 2026-01-06 00:43:15.271420 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0) 2026-01-06 00:43:15.271431 | orchestrator | 2026-01-06 00:43:15.271442 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.271452 | orchestrator | Tuesday 06 January 2026 00:43:07 +0000 (0:00:00.706) 0:00:33.112 ******* 2026-01-06 00:43:15.271484 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.271495 | orchestrator | 2026-01-06 
00:43:15.271506 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.271517 | orchestrator | Tuesday 06 January 2026 00:43:07 +0000 (0:00:00.222) 0:00:33.334 ******* 2026-01-06 00:43:15.271528 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.271539 | orchestrator | 2026-01-06 00:43:15.271550 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.271561 | orchestrator | Tuesday 06 January 2026 00:43:07 +0000 (0:00:00.207) 0:00:33.541 ******* 2026-01-06 00:43:15.271572 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.271582 | orchestrator | 2026-01-06 00:43:15.271612 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.271623 | orchestrator | Tuesday 06 January 2026 00:43:08 +0000 (0:00:00.188) 0:00:33.730 ******* 2026-01-06 00:43:15.271634 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.271645 | orchestrator | 2026-01-06 00:43:15.271656 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.271667 | orchestrator | Tuesday 06 January 2026 00:43:08 +0000 (0:00:00.185) 0:00:33.916 ******* 2026-01-06 00:43:15.271678 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.271689 | orchestrator | 2026-01-06 00:43:15.271700 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.271711 | orchestrator | Tuesday 06 January 2026 00:43:08 +0000 (0:00:00.208) 0:00:34.124 ******* 2026-01-06 00:43:15.271722 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.271733 | orchestrator | 2026-01-06 00:43:15.271743 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.271753 | orchestrator | Tuesday 06 January 2026 00:43:08 +0000 (0:00:00.223) 
0:00:34.347 ******* 2026-01-06 00:43:15.271763 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.271773 | orchestrator | 2026-01-06 00:43:15.271784 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.271795 | orchestrator | Tuesday 06 January 2026 00:43:08 +0000 (0:00:00.217) 0:00:34.564 ******* 2026-01-06 00:43:15.271806 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.271817 | orchestrator | 2026-01-06 00:43:15.271828 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.271839 | orchestrator | Tuesday 06 January 2026 00:43:09 +0000 (0:00:00.216) 0:00:34.781 ******* 2026-01-06 00:43:15.271849 | orchestrator | ok: [testbed-node-4] => (item=sda1) 2026-01-06 00:43:15.271859 | orchestrator | ok: [testbed-node-4] => (item=sda14) 2026-01-06 00:43:15.271870 | orchestrator | ok: [testbed-node-4] => (item=sda15) 2026-01-06 00:43:15.271879 | orchestrator | ok: [testbed-node-4] => (item=sda16) 2026-01-06 00:43:15.271900 | orchestrator | 2026-01-06 00:43:15.271911 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.271921 | orchestrator | Tuesday 06 January 2026 00:43:10 +0000 (0:00:00.888) 0:00:35.670 ******* 2026-01-06 00:43:15.271931 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.271942 | orchestrator | 2026-01-06 00:43:15.271952 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.271964 | orchestrator | Tuesday 06 January 2026 00:43:10 +0000 (0:00:00.215) 0:00:35.885 ******* 2026-01-06 00:43:15.271975 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.271986 | orchestrator | 2026-01-06 00:43:15.272054 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.272063 | orchestrator | Tuesday 06 
January 2026 00:43:11 +0000 (0:00:00.817) 0:00:36.703 ******* 2026-01-06 00:43:15.272070 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.272076 | orchestrator | 2026-01-06 00:43:15.272082 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-01-06 00:43:15.272088 | orchestrator | Tuesday 06 January 2026 00:43:11 +0000 (0:00:00.213) 0:00:36.916 ******* 2026-01-06 00:43:15.272094 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.272101 | orchestrator | 2026-01-06 00:43:15.272107 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2026-01-06 00:43:15.272124 | orchestrator | Tuesday 06 January 2026 00:43:11 +0000 (0:00:00.221) 0:00:37.138 ******* 2026-01-06 00:43:15.272136 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.272143 | orchestrator | 2026-01-06 00:43:15.272164 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2026-01-06 00:43:15.272171 | orchestrator | Tuesday 06 January 2026 00:43:11 +0000 (0:00:00.142) 0:00:37.281 ******* 2026-01-06 00:43:15.272178 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '64d6825f-3ec1-5927-8c89-e441ee427e8a'}}) 2026-01-06 00:43:15.272185 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e675238b-4f6c-5157-bfd7-95a1b3a689b7'}}) 2026-01-06 00:43:15.272192 | orchestrator | 2026-01-06 00:43:15.272202 | orchestrator | TASK [Create block VGs] ******************************************************** 2026-01-06 00:43:15.272214 | orchestrator | Tuesday 06 January 2026 00:43:11 +0000 (0:00:00.192) 0:00:37.474 ******* 2026-01-06 00:43:15.272222 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'}) 2026-01-06 00:43:15.272230 | orchestrator | changed: [testbed-node-4] 
=> (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'}) 2026-01-06 00:43:15.272239 | orchestrator | 2026-01-06 00:43:15.272250 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2026-01-06 00:43:15.272256 | orchestrator | Tuesday 06 January 2026 00:43:13 +0000 (0:00:01.869) 0:00:39.344 ******* 2026-01-06 00:43:15.272263 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:15.272270 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:15.272277 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:15.272283 | orchestrator | 2026-01-06 00:43:15.272289 | orchestrator | TASK [Create block LVs] ******************************************************** 2026-01-06 00:43:15.272295 | orchestrator | Tuesday 06 January 2026 00:43:13 +0000 (0:00:00.162) 0:00:39.506 ******* 2026-01-06 00:43:15.272303 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'}) 2026-01-06 00:43:15.272325 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'}) 2026-01-06 00:43:21.352181 | orchestrator | 2026-01-06 00:43:21.352292 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2026-01-06 00:43:21.352307 | orchestrator | Tuesday 06 January 2026 00:43:15 +0000 (0:00:01.395) 0:00:40.901 ******* 2026-01-06 00:43:21.352318 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 
'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:21.352329 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:21.352339 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.352348 | orchestrator | 2026-01-06 00:43:21.352358 | orchestrator | TASK [Create DB VGs] *********************************************************** 2026-01-06 00:43:21.352367 | orchestrator | Tuesday 06 January 2026 00:43:15 +0000 (0:00:00.165) 0:00:41.067 ******* 2026-01-06 00:43:21.352376 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.352385 | orchestrator | 2026-01-06 00:43:21.352394 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2026-01-06 00:43:21.352403 | orchestrator | Tuesday 06 January 2026 00:43:15 +0000 (0:00:00.151) 0:00:41.218 ******* 2026-01-06 00:43:21.352412 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:21.352421 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:21.352430 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.352438 | orchestrator | 2026-01-06 00:43:21.352447 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2026-01-06 00:43:21.352525 | orchestrator | Tuesday 06 January 2026 00:43:15 +0000 (0:00:00.169) 0:00:41.387 ******* 2026-01-06 00:43:21.352535 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.352544 | orchestrator | 2026-01-06 00:43:21.352553 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2026-01-06 00:43:21.352562 | orchestrator | 
Tuesday 06 January 2026 00:43:15 +0000 (0:00:00.172) 0:00:41.560 ******* 2026-01-06 00:43:21.352571 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:21.352581 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:21.352590 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.352599 | orchestrator | 2026-01-06 00:43:21.352608 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2026-01-06 00:43:21.352636 | orchestrator | Tuesday 06 January 2026 00:43:16 +0000 (0:00:00.406) 0:00:41.966 ******* 2026-01-06 00:43:21.352645 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.352654 | orchestrator | 2026-01-06 00:43:21.352663 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2026-01-06 00:43:21.352672 | orchestrator | Tuesday 06 January 2026 00:43:16 +0000 (0:00:00.167) 0:00:42.134 ******* 2026-01-06 00:43:21.352681 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:21.352690 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:21.352699 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.352709 | orchestrator | 2026-01-06 00:43:21.352719 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2026-01-06 00:43:21.352730 | orchestrator | Tuesday 06 January 2026 00:43:16 +0000 (0:00:00.169) 0:00:42.304 ******* 2026-01-06 00:43:21.352741 | orchestrator | ok: [testbed-node-4] 
2026-01-06 00:43:21.352774 | orchestrator | 2026-01-06 00:43:21.352786 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2026-01-06 00:43:21.352796 | orchestrator | Tuesday 06 January 2026 00:43:16 +0000 (0:00:00.165) 0:00:42.470 ******* 2026-01-06 00:43:21.352806 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:21.352817 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:21.352827 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.352838 | orchestrator | 2026-01-06 00:43:21.352848 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2026-01-06 00:43:21.352858 | orchestrator | Tuesday 06 January 2026 00:43:16 +0000 (0:00:00.162) 0:00:42.633 ******* 2026-01-06 00:43:21.352869 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:21.352879 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:21.352889 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.352900 | orchestrator | 2026-01-06 00:43:21.352910 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2026-01-06 00:43:21.352938 | orchestrator | Tuesday 06 January 2026 00:43:17 +0000 (0:00:00.167) 0:00:42.800 ******* 2026-01-06 00:43:21.352949 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 
00:43:21.352958 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:21.352967 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.352976 | orchestrator | 2026-01-06 00:43:21.352985 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2026-01-06 00:43:21.352994 | orchestrator | Tuesday 06 January 2026 00:43:17 +0000 (0:00:00.197) 0:00:42.998 ******* 2026-01-06 00:43:21.353003 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.353011 | orchestrator | 2026-01-06 00:43:21.353020 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2026-01-06 00:43:21.353029 | orchestrator | Tuesday 06 January 2026 00:43:17 +0000 (0:00:00.216) 0:00:43.214 ******* 2026-01-06 00:43:21.353038 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.353046 | orchestrator | 2026-01-06 00:43:21.353055 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2026-01-06 00:43:21.353064 | orchestrator | Tuesday 06 January 2026 00:43:17 +0000 (0:00:00.141) 0:00:43.355 ******* 2026-01-06 00:43:21.353073 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.353081 | orchestrator | 2026-01-06 00:43:21.353090 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2026-01-06 00:43:21.353099 | orchestrator | Tuesday 06 January 2026 00:43:17 +0000 (0:00:00.156) 0:00:43.512 ******* 2026-01-06 00:43:21.353108 | orchestrator | ok: [testbed-node-4] => { 2026-01-06 00:43:21.353116 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2026-01-06 00:43:21.353125 | orchestrator | } 2026-01-06 00:43:21.353134 | orchestrator | 2026-01-06 00:43:21.353143 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2026-01-06 
00:43:21.353152 | orchestrator | Tuesday 06 January 2026 00:43:18 +0000 (0:00:00.156) 0:00:43.669 ******* 2026-01-06 00:43:21.353160 | orchestrator | ok: [testbed-node-4] => { 2026-01-06 00:43:21.353170 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2026-01-06 00:43:21.353178 | orchestrator | } 2026-01-06 00:43:21.353187 | orchestrator | 2026-01-06 00:43:21.353196 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2026-01-06 00:43:21.353205 | orchestrator | Tuesday 06 January 2026 00:43:18 +0000 (0:00:00.150) 0:00:43.819 ******* 2026-01-06 00:43:21.353220 | orchestrator | ok: [testbed-node-4] => { 2026-01-06 00:43:21.353230 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2026-01-06 00:43:21.353239 | orchestrator | } 2026-01-06 00:43:21.353247 | orchestrator | 2026-01-06 00:43:21.353256 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2026-01-06 00:43:21.353265 | orchestrator | Tuesday 06 January 2026 00:43:18 +0000 (0:00:00.453) 0:00:44.273 ******* 2026-01-06 00:43:21.353274 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:43:21.353282 | orchestrator | 2026-01-06 00:43:21.353291 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] ******************* 2026-01-06 00:43:21.353300 | orchestrator | Tuesday 06 January 2026 00:43:19 +0000 (0:00:00.535) 0:00:44.808 ******* 2026-01-06 00:43:21.353309 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:43:21.353317 | orchestrator | 2026-01-06 00:43:21.353326 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2026-01-06 00:43:21.353335 | orchestrator | Tuesday 06 January 2026 00:43:19 +0000 (0:00:00.528) 0:00:45.337 ******* 2026-01-06 00:43:21.353344 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:43:21.353352 | orchestrator | 2026-01-06 00:43:21.353362 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] 
************************* 2026-01-06 00:43:21.353370 | orchestrator | Tuesday 06 January 2026 00:43:20 +0000 (0:00:00.530) 0:00:45.868 ******* 2026-01-06 00:43:21.353379 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:43:21.353388 | orchestrator | 2026-01-06 00:43:21.353396 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2026-01-06 00:43:21.353405 | orchestrator | Tuesday 06 January 2026 00:43:20 +0000 (0:00:00.156) 0:00:46.024 ******* 2026-01-06 00:43:21.353414 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.353422 | orchestrator | 2026-01-06 00:43:21.353438 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2026-01-06 00:43:21.353447 | orchestrator | Tuesday 06 January 2026 00:43:20 +0000 (0:00:00.124) 0:00:46.149 ******* 2026-01-06 00:43:21.353475 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.353485 | orchestrator | 2026-01-06 00:43:21.353494 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2026-01-06 00:43:21.353503 | orchestrator | Tuesday 06 January 2026 00:43:20 +0000 (0:00:00.118) 0:00:46.268 ******* 2026-01-06 00:43:21.353511 | orchestrator | ok: [testbed-node-4] => { 2026-01-06 00:43:21.353521 | orchestrator |  "vgs_report": { 2026-01-06 00:43:21.353530 | orchestrator |  "vg": [] 2026-01-06 00:43:21.353539 | orchestrator |  } 2026-01-06 00:43:21.353548 | orchestrator | } 2026-01-06 00:43:21.353557 | orchestrator | 2026-01-06 00:43:21.353566 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2026-01-06 00:43:21.353575 | orchestrator | Tuesday 06 January 2026 00:43:20 +0000 (0:00:00.136) 0:00:46.405 ******* 2026-01-06 00:43:21.353583 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.353592 | orchestrator | 2026-01-06 00:43:21.353601 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] 
************************ 2026-01-06 00:43:21.353610 | orchestrator | Tuesday 06 January 2026 00:43:20 +0000 (0:00:00.142) 0:00:46.547 ******* 2026-01-06 00:43:21.353619 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.353628 | orchestrator | 2026-01-06 00:43:21.353637 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2026-01-06 00:43:21.353646 | orchestrator | Tuesday 06 January 2026 00:43:21 +0000 (0:00:00.159) 0:00:46.706 ******* 2026-01-06 00:43:21.353654 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.353663 | orchestrator | 2026-01-06 00:43:21.353672 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2026-01-06 00:43:21.353681 | orchestrator | Tuesday 06 January 2026 00:43:21 +0000 (0:00:00.138) 0:00:46.845 ******* 2026-01-06 00:43:21.353690 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:21.353699 | orchestrator | 2026-01-06 00:43:21.353714 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2026-01-06 00:43:26.491657 | orchestrator | Tuesday 06 January 2026 00:43:21 +0000 (0:00:00.143) 0:00:46.988 ******* 2026-01-06 00:43:26.491779 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.491790 | orchestrator | 2026-01-06 00:43:26.491798 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2026-01-06 00:43:26.491805 | orchestrator | Tuesday 06 January 2026 00:43:21 +0000 (0:00:00.390) 0:00:47.379 ******* 2026-01-06 00:43:26.491811 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.491818 | orchestrator | 2026-01-06 00:43:26.491824 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2026-01-06 00:43:26.491831 | orchestrator | Tuesday 06 January 2026 00:43:21 +0000 (0:00:00.140) 0:00:47.519 ******* 2026-01-06 00:43:26.491837 | orchestrator | skipping: [testbed-node-4] 
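The "Gather DB/WAL VGs with total and available size in bytes" tasks above collect volume-group capacity before the size checks. LVM can emit this directly as JSON via `vgs --reportformat json --units b`; a sketch of parsing that report into byte counts (the JSON below is simulated sample output, not captured from this run):

```python
import json

# Simulated output of: vgs --reportformat json --units b -o vg_name,vg_size,vg_free
vgs_json = """
{"report": [{"vg": [
    {"vg_name": "ceph-db-0", "vg_size": "107374182400B", "vg_free": "64424509440B"}
]}]}
"""

def parse_vgs(raw):
    """Return {vg_name: (total_bytes, free_bytes)} from a vgs JSON report."""
    vgs = json.loads(raw)["report"][0]["vg"]
    return {
        vg["vg_name"]: (int(vg["vg_size"].rstrip("B")), int(vg["vg_free"].rstrip("B")))
        for vg in vgs
    }

print(parse_vgs(vgs_json))
```

On these nodes the combined report is empty (`"vgs_report": {"vg": []}` below), so all of the size calculations and "Fail if size ... > available" guards skip.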
2026-01-06 00:43:26.491843 | orchestrator | 2026-01-06 00:43:26.491849 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2026-01-06 00:43:26.491856 | orchestrator | Tuesday 06 January 2026 00:43:22 +0000 (0:00:00.164) 0:00:47.683 ******* 2026-01-06 00:43:26.491862 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.491868 | orchestrator | 2026-01-06 00:43:26.491874 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2026-01-06 00:43:26.491881 | orchestrator | Tuesday 06 January 2026 00:43:22 +0000 (0:00:00.160) 0:00:47.844 ******* 2026-01-06 00:43:26.491887 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.491893 | orchestrator | 2026-01-06 00:43:26.491899 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2026-01-06 00:43:26.491905 | orchestrator | Tuesday 06 January 2026 00:43:22 +0000 (0:00:00.146) 0:00:47.991 ******* 2026-01-06 00:43:26.491912 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.491918 | orchestrator | 2026-01-06 00:43:26.491924 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2026-01-06 00:43:26.491930 | orchestrator | Tuesday 06 January 2026 00:43:22 +0000 (0:00:00.161) 0:00:48.152 ******* 2026-01-06 00:43:26.491937 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.491943 | orchestrator | 2026-01-06 00:43:26.491949 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2026-01-06 00:43:26.491955 | orchestrator | Tuesday 06 January 2026 00:43:22 +0000 (0:00:00.135) 0:00:48.288 ******* 2026-01-06 00:43:26.491962 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.491968 | orchestrator | 2026-01-06 00:43:26.491974 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2026-01-06 00:43:26.491980 | orchestrator | 
Tuesday 06 January 2026 00:43:22 +0000 (0:00:00.147) 0:00:48.436 ******* 2026-01-06 00:43:26.491986 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.491992 | orchestrator | 2026-01-06 00:43:26.491999 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2026-01-06 00:43:26.492005 | orchestrator | Tuesday 06 January 2026 00:43:22 +0000 (0:00:00.152) 0:00:48.589 ******* 2026-01-06 00:43:26.492011 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.492018 | orchestrator | 2026-01-06 00:43:26.492024 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2026-01-06 00:43:26.492044 | orchestrator | Tuesday 06 January 2026 00:43:23 +0000 (0:00:00.136) 0:00:48.726 ******* 2026-01-06 00:43:26.492051 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:26.492059 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:26.492065 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.492071 | orchestrator | 2026-01-06 00:43:26.492078 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2026-01-06 00:43:26.492084 | orchestrator | Tuesday 06 January 2026 00:43:23 +0000 (0:00:00.159) 0:00:48.885 ******* 2026-01-06 00:43:26.492090 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:26.492104 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:26.492111 | orchestrator | skipping: 
[testbed-node-4] 2026-01-06 00:43:26.492117 | orchestrator | 2026-01-06 00:43:26.492123 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2026-01-06 00:43:26.492129 | orchestrator | Tuesday 06 January 2026 00:43:23 +0000 (0:00:00.179) 0:00:49.064 ******* 2026-01-06 00:43:26.492136 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:26.492142 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:26.492148 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.492155 | orchestrator | 2026-01-06 00:43:26.492161 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2026-01-06 00:43:26.492167 | orchestrator | Tuesday 06 January 2026 00:43:23 +0000 (0:00:00.163) 0:00:49.228 ******* 2026-01-06 00:43:26.492174 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:26.492180 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:26.492186 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.492192 | orchestrator | 2026-01-06 00:43:26.492213 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2026-01-06 00:43:26.492220 | orchestrator | Tuesday 06 January 2026 00:43:23 +0000 (0:00:00.357) 0:00:49.585 ******* 2026-01-06 00:43:26.492226 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 
'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:26.492232 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:26.492239 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.492245 | orchestrator | 2026-01-06 00:43:26.492251 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2026-01-06 00:43:26.492258 | orchestrator | Tuesday 06 January 2026 00:43:24 +0000 (0:00:00.142) 0:00:49.728 ******* 2026-01-06 00:43:26.492265 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:26.492276 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:26.492286 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.492297 | orchestrator | 2026-01-06 00:43:26.492308 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2026-01-06 00:43:26.492318 | orchestrator | Tuesday 06 January 2026 00:43:24 +0000 (0:00:00.183) 0:00:49.911 ******* 2026-01-06 00:43:26.492328 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:26.492338 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:26.492348 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.492358 | orchestrator | 2026-01-06 00:43:26.492369 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2026-01-06 
00:43:26.492380 | orchestrator | Tuesday 06 January 2026 00:43:24 +0000 (0:00:00.177) 0:00:50.089 ******* 2026-01-06 00:43:26.492399 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:26.492414 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:26.492426 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.492436 | orchestrator | 2026-01-06 00:43:26.492465 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2026-01-06 00:43:26.492477 | orchestrator | Tuesday 06 January 2026 00:43:24 +0000 (0:00:00.156) 0:00:50.245 ******* 2026-01-06 00:43:26.492487 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:43:26.492499 | orchestrator | 2026-01-06 00:43:26.492505 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2026-01-06 00:43:26.492511 | orchestrator | Tuesday 06 January 2026 00:43:25 +0000 (0:00:00.542) 0:00:50.788 ******* 2026-01-06 00:43:26.492518 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:43:26.492524 | orchestrator | 2026-01-06 00:43:26.492530 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2026-01-06 00:43:26.492536 | orchestrator | Tuesday 06 January 2026 00:43:25 +0000 (0:00:00.627) 0:00:51.416 ******* 2026-01-06 00:43:26.492542 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:43:26.492548 | orchestrator | 2026-01-06 00:43:26.492554 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2026-01-06 00:43:26.492561 | orchestrator | Tuesday 06 January 2026 00:43:25 +0000 (0:00:00.175) 0:00:51.591 ******* 2026-01-06 00:43:26.492567 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 
'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'vg_name': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'}) 2026-01-06 00:43:26.492575 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'vg_name': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'}) 2026-01-06 00:43:26.492581 | orchestrator | 2026-01-06 00:43:26.492587 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2026-01-06 00:43:26.492593 | orchestrator | Tuesday 06 January 2026 00:43:26 +0000 (0:00:00.193) 0:00:51.784 ******* 2026-01-06 00:43:26.492599 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:26.492606 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:26.492612 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:26.492618 | orchestrator | 2026-01-06 00:43:26.492624 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2026-01-06 00:43:26.492630 | orchestrator | Tuesday 06 January 2026 00:43:26 +0000 (0:00:00.175) 0:00:51.960 ******* 2026-01-06 00:43:26.492637 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:26.492649 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:33.088843 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:33.088964 | orchestrator | 2026-01-06 00:43:33.088982 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2026-01-06 00:43:33.088995 | 
orchestrator | Tuesday 06 January 2026 00:43:26 +0000 (0:00:00.167) 0:00:52.127 ******* 2026-01-06 00:43:33.089005 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'})  2026-01-06 00:43:33.089016 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'})  2026-01-06 00:43:33.089026 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:43:33.089062 | orchestrator | 2026-01-06 00:43:33.089073 | orchestrator | TASK [Print LVM report data] *************************************************** 2026-01-06 00:43:33.089082 | orchestrator | Tuesday 06 January 2026 00:43:26 +0000 (0:00:00.159) 0:00:52.286 ******* 2026-01-06 00:43:33.089091 | orchestrator | ok: [testbed-node-4] => { 2026-01-06 00:43:33.089100 | orchestrator |  "lvm_report": { 2026-01-06 00:43:33.089110 | orchestrator |  "lv": [ 2026-01-06 00:43:33.089119 | orchestrator |  { 2026-01-06 00:43:33.089129 | orchestrator |  "lv_name": "osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a", 2026-01-06 00:43:33.089140 | orchestrator |  "vg_name": "ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a" 2026-01-06 00:43:33.089149 | orchestrator |  }, 2026-01-06 00:43:33.089158 | orchestrator |  { 2026-01-06 00:43:33.089167 | orchestrator |  "lv_name": "osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7", 2026-01-06 00:43:33.089176 | orchestrator |  "vg_name": "ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7" 2026-01-06 00:43:33.089186 | orchestrator |  } 2026-01-06 00:43:33.089195 | orchestrator |  ], 2026-01-06 00:43:33.089204 | orchestrator |  "pv": [ 2026-01-06 00:43:33.089213 | orchestrator |  { 2026-01-06 00:43:33.089223 | orchestrator |  "pv_name": "/dev/sdb", 2026-01-06 00:43:33.089233 | orchestrator |  "vg_name": "ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a" 2026-01-06 00:43:33.089242 | orchestrator |  }, 2026-01-06 
00:43:33.089252 | orchestrator |  { 2026-01-06 00:43:33.089261 | orchestrator |  "pv_name": "/dev/sdc", 2026-01-06 00:43:33.089270 | orchestrator |  "vg_name": "ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7" 2026-01-06 00:43:33.089278 | orchestrator |  } 2026-01-06 00:43:33.089309 | orchestrator |  ] 2026-01-06 00:43:33.089319 | orchestrator |  } 2026-01-06 00:43:33.089339 | orchestrator | } 2026-01-06 00:43:33.089359 | orchestrator | 2026-01-06 00:43:33.089369 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2026-01-06 00:43:33.089390 | orchestrator | 2026-01-06 00:43:33.089400 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2026-01-06 00:43:33.089421 | orchestrator | Tuesday 06 January 2026 00:43:27 +0000 (0:00:00.505) 0:00:52.792 ******* 2026-01-06 00:43:33.089432 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2026-01-06 00:43:33.089441 | orchestrator | 2026-01-06 00:43:33.089528 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2026-01-06 00:43:33.089539 | orchestrator | Tuesday 06 January 2026 00:43:27 +0000 (0:00:00.270) 0:00:53.063 ******* 2026-01-06 00:43:33.089549 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:43:33.089558 | orchestrator | 2026-01-06 00:43:33.089567 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:33.089576 | orchestrator | Tuesday 06 January 2026 00:43:27 +0000 (0:00:00.250) 0:00:53.313 ******* 2026-01-06 00:43:33.089585 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0) 2026-01-06 00:43:33.089627 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1) 2026-01-06 00:43:33.089640 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2) 2026-01-06 00:43:33.089672 | 
orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3) 2026-01-06 00:43:33.089683 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop4) 2026-01-06 00:43:33.089694 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop5) 2026-01-06 00:43:33.089704 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6) 2026-01-06 00:43:33.089723 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7) 2026-01-06 00:43:33.089733 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda) 2026-01-06 00:43:33.089756 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb) 2026-01-06 00:43:33.089777 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc) 2026-01-06 00:43:33.089799 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd) 2026-01-06 00:43:33.089818 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0) 2026-01-06 00:43:33.089829 | orchestrator | 2026-01-06 00:43:33.089843 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:33.089864 | orchestrator | Tuesday 06 January 2026 00:43:28 +0000 (0:00:00.440) 0:00:53.754 ******* 2026-01-06 00:43:33.089883 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:43:33.089892 | orchestrator | 2026-01-06 00:43:33.089901 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:33.089910 | orchestrator | Tuesday 06 January 2026 00:43:28 +0000 (0:00:00.229) 0:00:53.983 ******* 2026-01-06 00:43:33.089919 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:43:33.089927 | orchestrator | 2026-01-06 
00:43:33.089937 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:33.090159 | orchestrator | Tuesday 06 January 2026 00:43:28 +0000 (0:00:00.227) 0:00:54.210 ******* 2026-01-06 00:43:33.090177 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:43:33.090187 | orchestrator | 2026-01-06 00:43:33.090197 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:33.090207 | orchestrator | Tuesday 06 January 2026 00:43:28 +0000 (0:00:00.208) 0:00:54.419 ******* 2026-01-06 00:43:33.090216 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:43:33.090225 | orchestrator | 2026-01-06 00:43:33.090235 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:33.090299 | orchestrator | Tuesday 06 January 2026 00:43:28 +0000 (0:00:00.209) 0:00:54.629 ******* 2026-01-06 00:43:33.090308 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:43:33.090317 | orchestrator | 2026-01-06 00:43:33.090325 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:33.090334 | orchestrator | Tuesday 06 January 2026 00:43:29 +0000 (0:00:00.219) 0:00:54.848 ******* 2026-01-06 00:43:33.090343 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:43:33.090351 | orchestrator | 2026-01-06 00:43:33.090360 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:33.090369 | orchestrator | Tuesday 06 January 2026 00:43:29 +0000 (0:00:00.770) 0:00:55.619 ******* 2026-01-06 00:43:33.090378 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:43:33.090386 | orchestrator | 2026-01-06 00:43:33.090395 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-01-06 00:43:33.090405 | orchestrator | Tuesday 06 January 2026 00:43:30 +0000 (0:00:00.205) 
0:00:55.825 *******
2026-01-06 00:43:33.090414 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:33.090423 | orchestrator |
2026-01-06 00:43:33.090432 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:43:33.090441 | orchestrator | Tuesday 06 January 2026 00:43:30 +0000 (0:00:00.203) 0:00:56.028 *******
2026-01-06 00:43:33.090485 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22)
2026-01-06 00:43:33.090496 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22)
2026-01-06 00:43:33.090505 | orchestrator |
2026-01-06 00:43:33.090514 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:43:33.090523 | orchestrator | Tuesday 06 January 2026 00:43:30 +0000 (0:00:00.416) 0:00:56.445 *******
2026-01-06 00:43:33.090532 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_a9899c49-22e0-485a-be63-69bc9e218eb5)
2026-01-06 00:43:33.090542 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_a9899c49-22e0-485a-be63-69bc9e218eb5)
2026-01-06 00:43:33.090551 | orchestrator |
2026-01-06 00:43:33.090572 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:43:33.090587 | orchestrator | Tuesday 06 January 2026 00:43:31 +0000 (0:00:00.494) 0:00:56.939 *******
2026-01-06 00:43:33.090596 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_2e071fd2-3317-4a54-af1f-e9b7971267a3)
2026-01-06 00:43:33.090605 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_2e071fd2-3317-4a54-af1f-e9b7971267a3)
2026-01-06 00:43:33.090613 | orchestrator |
2026-01-06 00:43:33.090622 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:43:33.090627 | orchestrator | Tuesday 06 January 2026 00:43:31 +0000 (0:00:00.458) 0:00:57.398 *******
2026-01-06 00:43:33.090631 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59)
2026-01-06 00:43:33.090636 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59)
2026-01-06 00:43:33.090641 | orchestrator |
2026-01-06 00:43:33.090646 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-01-06 00:43:33.090651 | orchestrator | Tuesday 06 January 2026 00:43:32 +0000 (0:00:00.493) 0:00:57.891 *******
2026-01-06 00:43:33.090656 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001)
2026-01-06 00:43:33.090660 | orchestrator |
2026-01-06 00:43:33.090665 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:33.090670 | orchestrator | Tuesday 06 January 2026 00:43:32 +0000 (0:00:00.383) 0:00:58.275 *******
2026-01-06 00:43:33.090675 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0)
2026-01-06 00:43:33.090680 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1)
2026-01-06 00:43:33.090684 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2)
2026-01-06 00:43:33.090689 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3)
2026-01-06 00:43:33.090694 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4)
2026-01-06 00:43:33.090699 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5)
2026-01-06 00:43:33.090704 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6)
2026-01-06 00:43:33.090709 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7)
2026-01-06 00:43:33.090714 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda)
2026-01-06 00:43:33.090718 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb)
2026-01-06 00:43:33.090723 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc)
2026-01-06 00:43:33.090740 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd)
2026-01-06 00:43:42.829809 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0)
2026-01-06 00:43:42.829933 | orchestrator |
2026-01-06 00:43:42.829950 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.829963 | orchestrator | Tuesday 06 January 2026 00:43:33 +0000 (0:00:00.438) 0:00:58.714 *******
2026-01-06 00:43:42.829974 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.829986 | orchestrator |
2026-01-06 00:43:42.829998 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.830009 | orchestrator | Tuesday 06 January 2026 00:43:33 +0000 (0:00:00.236) 0:00:58.950 *******
2026-01-06 00:43:42.830084 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.830096 | orchestrator |
2026-01-06 00:43:42.830144 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.830157 | orchestrator | Tuesday 06 January 2026 00:43:34 +0000 (0:00:00.715) 0:00:59.666 *******
2026-01-06 00:43:42.830194 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.830206 | orchestrator |
2026-01-06 00:43:42.830216 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.830227 | orchestrator | Tuesday 06 January 2026 00:43:34 +0000 (0:00:00.228) 0:00:59.895 *******
2026-01-06 00:43:42.830238 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.830249 | orchestrator |
2026-01-06 00:43:42.830260 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.830271 | orchestrator | Tuesday 06 January 2026 00:43:34 +0000 (0:00:00.207) 0:01:00.102 *******
2026-01-06 00:43:42.830282 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.830293 | orchestrator |
2026-01-06 00:43:42.830304 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.830315 | orchestrator | Tuesday 06 January 2026 00:43:34 +0000 (0:00:00.196) 0:01:00.299 *******
2026-01-06 00:43:42.830328 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.830341 | orchestrator |
2026-01-06 00:43:42.830355 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.830368 | orchestrator | Tuesday 06 January 2026 00:43:34 +0000 (0:00:00.247) 0:01:00.546 *******
2026-01-06 00:43:42.830381 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.830391 | orchestrator |
2026-01-06 00:43:42.830402 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.830413 | orchestrator | Tuesday 06 January 2026 00:43:35 +0000 (0:00:00.212) 0:01:00.759 *******
2026-01-06 00:43:42.830424 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.830474 | orchestrator |
2026-01-06 00:43:42.830488 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.830499 | orchestrator | Tuesday 06 January 2026 00:43:35 +0000 (0:00:00.210) 0:01:00.969 *******
2026-01-06 00:43:42.830529 | orchestrator | ok: [testbed-node-5] => (item=sda1)
2026-01-06 00:43:42.830541 | orchestrator | ok: [testbed-node-5] => (item=sda14)
2026-01-06 00:43:42.830552 | orchestrator | ok: [testbed-node-5] => (item=sda15)
2026-01-06 00:43:42.830563 | orchestrator | ok: [testbed-node-5] => (item=sda16)
2026-01-06 00:43:42.830574 | orchestrator |
2026-01-06 00:43:42.830584 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.830595 | orchestrator | Tuesday 06 January 2026 00:43:36 +0000 (0:00:00.732) 0:01:01.702 *******
2026-01-06 00:43:42.830606 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.830617 | orchestrator |
2026-01-06 00:43:42.830628 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.830639 | orchestrator | Tuesday 06 January 2026 00:43:36 +0000 (0:00:00.209) 0:01:01.911 *******
2026-01-06 00:43:42.830650 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.830661 | orchestrator |
2026-01-06 00:43:42.830672 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.830683 | orchestrator | Tuesday 06 January 2026 00:43:36 +0000 (0:00:00.193) 0:01:02.105 *******
2026-01-06 00:43:42.830693 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.830704 | orchestrator |
2026-01-06 00:43:42.830715 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-01-06 00:43:42.830726 | orchestrator | Tuesday 06 January 2026 00:43:36 +0000 (0:00:00.217) 0:01:02.322 *******
2026-01-06 00:43:42.830736 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.830747 | orchestrator |
2026-01-06 00:43:42.830758 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] *******************
2026-01-06 00:43:42.830769 | orchestrator | Tuesday 06 January 2026 00:43:36 +0000 (0:00:00.199) 0:01:02.521 *******
2026-01-06 00:43:42.830780 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.830791 | orchestrator |
2026-01-06 00:43:42.830801 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] *******************
2026-01-06 00:43:42.830812 | orchestrator | Tuesday 06 January 2026 00:43:37 +0000 (0:00:00.389) 0:01:02.911 *******
2026-01-06 00:43:42.830823 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '0ba15c51-2e8d-5c95-884b-d45401cb60d9'}})
2026-01-06 00:43:42.830843 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '588df21e-a0c0-57e7-8c43-2f77be274309'}})
2026-01-06 00:43:42.830854 | orchestrator |
2026-01-06 00:43:42.830865 | orchestrator | TASK [Create block VGs] ********************************************************
2026-01-06 00:43:42.830876 | orchestrator | Tuesday 06 January 2026 00:43:37 +0000 (0:00:00.212) 0:01:03.123 *******
2026-01-06 00:43:42.830888 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:42.830900 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:42.830911 | orchestrator |
2026-01-06 00:43:42.830922 | orchestrator | TASK [Print 'Create block VGs'] ************************************************
2026-01-06 00:43:42.830954 | orchestrator | Tuesday 06 January 2026 00:43:39 +0000 (0:00:01.958) 0:01:05.082 *******
2026-01-06 00:43:42.830966 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:42.830978 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:42.830989 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.831000 | orchestrator |
2026-01-06 00:43:42.831011 | orchestrator | TASK [Create block LVs] ********************************************************
2026-01-06 00:43:42.831022 | orchestrator | Tuesday 06 January 2026 00:43:39 +0000 (0:00:00.192) 0:01:05.274 *******
2026-01-06 00:43:42.831033 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:42.831044 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:42.831055 | orchestrator |
2026-01-06 00:43:42.831065 | orchestrator | TASK [Print 'Create block LVs'] ************************************************
2026-01-06 00:43:42.831076 | orchestrator | Tuesday 06 January 2026 00:43:41 +0000 (0:00:01.409) 0:01:06.683 *******
2026-01-06 00:43:42.831087 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:42.831098 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:42.831109 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.831120 | orchestrator |
2026-01-06 00:43:42.831131 | orchestrator | TASK [Create DB VGs] ***********************************************************
2026-01-06 00:43:42.831142 | orchestrator | Tuesday 06 January 2026 00:43:41 +0000 (0:00:00.157) 0:01:06.850 *******
2026-01-06 00:43:42.831153 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.831164 | orchestrator |
2026-01-06 00:43:42.831175 | orchestrator | TASK [Print 'Create DB VGs'] ***************************************************
2026-01-06 00:43:42.831186 | orchestrator | Tuesday 06 January 2026 00:43:41 +0000 (0:00:00.157) 0:01:07.007 *******
2026-01-06 00:43:42.831202 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:42.831213 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:42.831224 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.831235 | orchestrator |
2026-01-06 00:43:42.831246 | orchestrator | TASK [Create WAL VGs] **********************************************************
2026-01-06 00:43:42.831263 | orchestrator | Tuesday 06 January 2026 00:43:41 +0000 (0:00:00.183) 0:01:07.191 *******
2026-01-06 00:43:42.831274 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.831285 | orchestrator |
2026-01-06 00:43:42.831296 | orchestrator | TASK [Print 'Create WAL VGs'] **************************************************
2026-01-06 00:43:42.831307 | orchestrator | Tuesday 06 January 2026 00:43:41 +0000 (0:00:00.143) 0:01:07.334 *******
2026-01-06 00:43:42.831318 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:42.831329 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:42.831340 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.831351 | orchestrator |
2026-01-06 00:43:42.831362 | orchestrator | TASK [Create DB+WAL VGs] *******************************************************
2026-01-06 00:43:42.831373 | orchestrator | Tuesday 06 January 2026 00:43:41 +0000 (0:00:00.208) 0:01:07.542 *******
2026-01-06 00:43:42.831384 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.831395 | orchestrator |
2026-01-06 00:43:42.831406 | orchestrator | TASK [Print 'Create DB+WAL VGs'] ***********************************************
2026-01-06 00:43:42.831417 | orchestrator | Tuesday 06 January 2026 00:43:42 +0000 (0:00:00.175) 0:01:07.718 *******
2026-01-06 00:43:42.831428 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:42.831466 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:42.831478 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:42.831489 | orchestrator |
2026-01-06 00:43:42.831499 | orchestrator | TASK [Prepare variables for OSD count check] ***********************************
2026-01-06 00:43:42.831510 | orchestrator | Tuesday 06 January 2026 00:43:42 +0000 (0:00:00.177) 0:01:07.896 *******
2026-01-06 00:43:42.831521 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:43:42.831532 | orchestrator |
2026-01-06 00:43:42.831543 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] ****************
2026-01-06 00:43:42.831554 | orchestrator | Tuesday 06 January 2026 00:43:42 +0000 (0:00:00.397) 0:01:08.293 *******
2026-01-06 00:43:42.831573 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:49.649363 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:49.649524 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.649540 | orchestrator |
2026-01-06 00:43:49.649550 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] ***************
2026-01-06 00:43:49.649561 | orchestrator | Tuesday 06 January 2026 00:43:42 +0000 (0:00:00.171) 0:01:08.464 *******
2026-01-06 00:43:49.649571 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:49.649581 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:49.649590 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.649599 | orchestrator |
2026-01-06 00:43:49.649608 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************
2026-01-06 00:43:49.649617 | orchestrator | Tuesday 06 January 2026 00:43:43 +0000 (0:00:00.186) 0:01:08.651 *******
2026-01-06 00:43:49.649626 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:49.649635 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:49.649668 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.649678 | orchestrator |
2026-01-06 00:43:49.649686 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] *********************
2026-01-06 00:43:49.649695 | orchestrator | Tuesday 06 January 2026 00:43:43 +0000 (0:00:00.161) 0:01:08.813 *******
2026-01-06 00:43:49.649704 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.649713 | orchestrator |
2026-01-06 00:43:49.649722 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ********************
2026-01-06 00:43:49.649730 | orchestrator | Tuesday 06 January 2026 00:43:43 +0000 (0:00:00.152) 0:01:08.966 *******
2026-01-06 00:43:49.649739 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.649748 | orchestrator |
2026-01-06 00:43:49.649756 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] *****************
2026-01-06 00:43:49.649765 | orchestrator | Tuesday 06 January 2026 00:43:43 +0000 (0:00:00.149) 0:01:09.115 *******
2026-01-06 00:43:49.649774 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.649783 | orchestrator |
2026-01-06 00:43:49.649792 | orchestrator | TASK [Print number of OSDs wanted per DB VG] ***********************************
2026-01-06 00:43:49.649801 | orchestrator | Tuesday 06 January 2026 00:43:43 +0000 (0:00:00.137) 0:01:09.252 *******
2026-01-06 00:43:49.649810 | orchestrator | ok: [testbed-node-5] => {
2026-01-06 00:43:49.649819 | orchestrator |     "_num_osds_wanted_per_db_vg": {}
2026-01-06 00:43:49.649828 | orchestrator | }
2026-01-06 00:43:49.649837 | orchestrator |
2026-01-06 00:43:49.649846 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] **********************************
2026-01-06 00:43:49.649854 | orchestrator | Tuesday 06 January 2026 00:43:43 +0000 (0:00:00.182) 0:01:09.435 *******
2026-01-06 00:43:49.649863 | orchestrator | ok: [testbed-node-5] => {
2026-01-06 00:43:49.649872 | orchestrator |     "_num_osds_wanted_per_wal_vg": {}
2026-01-06 00:43:49.649881 | orchestrator | }
2026-01-06 00:43:49.649890 | orchestrator |
2026-01-06 00:43:49.649899 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] *******************************
2026-01-06 00:43:49.649907 | orchestrator | Tuesday 06 January 2026 00:43:43 +0000 (0:00:00.158) 0:01:09.593 *******
2026-01-06 00:43:49.649916 | orchestrator | ok: [testbed-node-5] => {
2026-01-06 00:43:49.649925 | orchestrator |     "_num_osds_wanted_per_db_wal_vg": {}
2026-01-06 00:43:49.649934 | orchestrator | }
2026-01-06 00:43:49.649943 | orchestrator |
2026-01-06 00:43:49.649951 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ********************
2026-01-06 00:43:49.649960 | orchestrator | Tuesday 06 January 2026 00:43:44 +0000 (0:00:00.151) 0:01:09.745 *******
2026-01-06 00:43:49.649969 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:43:49.649978 | orchestrator |
2026-01-06 00:43:49.649986 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] *******************
2026-01-06 00:43:49.649995 | orchestrator | Tuesday 06 January 2026 00:43:44 +0000 (0:00:00.651) 0:01:10.396 *******
2026-01-06 00:43:49.650004 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:43:49.650066 | orchestrator |
2026-01-06 00:43:49.650078 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] ****************
2026-01-06 00:43:49.650087 | orchestrator | Tuesday 06 January 2026 00:43:45 +0000 (0:00:00.541) 0:01:10.937 *******
2026-01-06 00:43:49.650095 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:43:49.650104 | orchestrator |
2026-01-06 00:43:49.650113 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] *************************
2026-01-06 00:43:49.650122 | orchestrator | Tuesday 06 January 2026 00:43:46 +0000 (0:00:00.914) 0:01:11.852 *******
2026-01-06 00:43:49.650130 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:43:49.650139 | orchestrator |
2026-01-06 00:43:49.650148 | orchestrator | TASK [Calculate VG sizes (without buffer)] *************************************
2026-01-06 00:43:49.650156 | orchestrator | Tuesday 06 January 2026 00:43:46 +0000 (0:00:00.154) 0:01:12.006 *******
2026-01-06 00:43:49.650165 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650174 | orchestrator |
2026-01-06 00:43:49.650182 | orchestrator | TASK [Calculate VG sizes (with buffer)] ****************************************
2026-01-06 00:43:49.650200 | orchestrator | Tuesday 06 January 2026 00:43:46 +0000 (0:00:00.126) 0:01:12.133 *******
2026-01-06 00:43:49.650209 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650218 | orchestrator |
2026-01-06 00:43:49.650227 | orchestrator | TASK [Print LVM VGs report data] ***********************************************
2026-01-06 00:43:49.650254 | orchestrator | Tuesday 06 January 2026 00:43:46 +0000 (0:00:00.140) 0:01:12.274 *******
2026-01-06 00:43:49.650264 | orchestrator | ok: [testbed-node-5] => {
2026-01-06 00:43:49.650273 | orchestrator |     "vgs_report": {
2026-01-06 00:43:49.650282 | orchestrator |         "vg": []
2026-01-06 00:43:49.650308 | orchestrator |     }
2026-01-06 00:43:49.650318 | orchestrator | }
2026-01-06 00:43:49.650327 | orchestrator |
2026-01-06 00:43:49.650336 | orchestrator | TASK [Print LVM VG sizes] ******************************************************
2026-01-06 00:43:49.650345 | orchestrator | Tuesday 06 January 2026 00:43:46 +0000 (0:00:00.146) 0:01:12.420 *******
2026-01-06 00:43:49.650353 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650362 | orchestrator |
2026-01-06 00:43:49.650370 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************
2026-01-06 00:43:49.650379 | orchestrator | Tuesday 06 January 2026 00:43:46 +0000 (0:00:00.159) 0:01:12.579 *******
2026-01-06 00:43:49.650388 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650396 | orchestrator |
2026-01-06 00:43:49.650405 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] ****************************
2026-01-06 00:43:49.650414 | orchestrator | Tuesday 06 January 2026 00:43:47 +0000 (0:00:00.153) 0:01:12.732 *******
2026-01-06 00:43:49.650422 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650450 | orchestrator |
2026-01-06 00:43:49.650459 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] *******************
2026-01-06 00:43:49.650468 | orchestrator | Tuesday 06 January 2026 00:43:47 +0000 (0:00:00.146) 0:01:12.879 *******
2026-01-06 00:43:49.650477 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650485 | orchestrator |
2026-01-06 00:43:49.650494 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] ***********************
2026-01-06 00:43:49.650503 | orchestrator | Tuesday 06 January 2026 00:43:47 +0000 (0:00:00.160) 0:01:13.040 *******
2026-01-06 00:43:49.650512 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650520 | orchestrator |
2026-01-06 00:43:49.650529 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] ***************************
2026-01-06 00:43:49.650538 | orchestrator | Tuesday 06 January 2026 00:43:47 +0000 (0:00:00.151) 0:01:13.191 *******
2026-01-06 00:43:49.650547 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650555 | orchestrator |
2026-01-06 00:43:49.650564 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] *****************
2026-01-06 00:43:49.650573 | orchestrator | Tuesday 06 January 2026 00:43:47 +0000 (0:00:00.148) 0:01:13.340 *******
2026-01-06 00:43:49.650582 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650590 | orchestrator |
2026-01-06 00:43:49.650599 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] ****************
2026-01-06 00:43:49.650608 | orchestrator | Tuesday 06 January 2026 00:43:47 +0000 (0:00:00.164) 0:01:13.505 *******
2026-01-06 00:43:49.650617 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650625 | orchestrator |
2026-01-06 00:43:49.650634 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ********************
2026-01-06 00:43:49.650643 | orchestrator | Tuesday 06 January 2026 00:43:48 +0000 (0:00:00.349) 0:01:13.855 *******
2026-01-06 00:43:49.650651 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650660 | orchestrator |
2026-01-06 00:43:49.650673 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] *****************
2026-01-06 00:43:49.650682 | orchestrator | Tuesday 06 January 2026 00:43:48 +0000 (0:00:00.147) 0:01:14.003 *******
2026-01-06 00:43:49.650691 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650700 | orchestrator |
2026-01-06 00:43:49.650709 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] *********************
2026-01-06 00:43:49.650723 | orchestrator | Tuesday 06 January 2026 00:43:48 +0000 (0:00:00.146) 0:01:14.149 *******
2026-01-06 00:43:49.650732 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650741 | orchestrator |
2026-01-06 00:43:49.650750 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] ***********
2026-01-06 00:43:49.650759 | orchestrator | Tuesday 06 January 2026 00:43:48 +0000 (0:00:00.146) 0:01:14.295 *******
2026-01-06 00:43:49.650768 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650777 | orchestrator |
2026-01-06 00:43:49.650786 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] *************************
2026-01-06 00:43:49.650794 | orchestrator | Tuesday 06 January 2026 00:43:48 +0000 (0:00:00.175) 0:01:14.471 *******
2026-01-06 00:43:49.650803 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650812 | orchestrator |
2026-01-06 00:43:49.650820 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] *********************
2026-01-06 00:43:49.650829 | orchestrator | Tuesday 06 January 2026 00:43:48 +0000 (0:00:00.142) 0:01:14.613 *******
2026-01-06 00:43:49.650838 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650847 | orchestrator |
2026-01-06 00:43:49.650855 | orchestrator | TASK [Create DB LVs for ceph_db_devices] ***************************************
2026-01-06 00:43:49.650864 | orchestrator | Tuesday 06 January 2026 00:43:49 +0000 (0:00:00.139) 0:01:14.753 *******
2026-01-06 00:43:49.650873 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:49.650882 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:49.650891 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650900 | orchestrator |
2026-01-06 00:43:49.650908 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] *******************************
2026-01-06 00:43:49.650917 | orchestrator | Tuesday 06 January 2026 00:43:49 +0000 (0:00:00.168) 0:01:14.921 *******
2026-01-06 00:43:49.650926 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:49.650935 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:49.650944 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:49.650952 | orchestrator |
2026-01-06 00:43:49.650961 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] *************************************
2026-01-06 00:43:49.650970 | orchestrator | Tuesday 06 January 2026 00:43:49 +0000 (0:00:00.193) 0:01:15.115 *******
2026-01-06 00:43:49.650985 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:52.921155 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:52.921272 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:52.921291 | orchestrator |
2026-01-06 00:43:52.921306 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] *****************************
2026-01-06 00:43:52.921319 | orchestrator | Tuesday 06 January 2026 00:43:49 +0000 (0:00:00.171) 0:01:15.287 *******
2026-01-06 00:43:52.921332 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:52.921345 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:52.921358 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:52.921370 | orchestrator |
2026-01-06 00:43:52.921383 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] **********************************
2026-01-06 00:43:52.921422 | orchestrator | Tuesday 06 January 2026 00:43:49 +0000 (0:00:00.161) 0:01:15.448 *******
2026-01-06 00:43:52.921493 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:52.921506 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:52.921518 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:52.921530 | orchestrator |
2026-01-06 00:43:52.921542 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] **************************
2026-01-06 00:43:52.921554 | orchestrator | Tuesday 06 January 2026 00:43:49 +0000 (0:00:00.162) 0:01:15.611 *******
2026-01-06 00:43:52.921566 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:52.921596 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:52.921610 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:52.921622 | orchestrator |
2026-01-06 00:43:52.921634 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] ***********************************
2026-01-06 00:43:52.921645 | orchestrator | Tuesday 06 January 2026 00:43:50 +0000 (0:00:00.449) 0:01:16.061 *******
2026-01-06 00:43:52.921657 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:52.921669 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:52.921682 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:52.921695 | orchestrator |
2026-01-06 00:43:52.921708 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] ***************************
2026-01-06 00:43:52.921720 | orchestrator | Tuesday 06 January 2026 00:43:50 +0000 (0:00:00.167) 0:01:16.228 *******
2026-01-06 00:43:52.921733 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:52.921745 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:52.921757 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:52.921769 | orchestrator |
2026-01-06 00:43:52.921781 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ********************************
2026-01-06 00:43:52.921793 | orchestrator | Tuesday 06 January 2026 00:43:50 +0000 (0:00:00.159) 0:01:16.388 *******
2026-01-06 00:43:52.921805 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:43:52.921818 | orchestrator |
2026-01-06 00:43:52.921832 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ********************************
2026-01-06 00:43:52.921844 | orchestrator | Tuesday 06 January 2026 00:43:51 +0000 (0:00:00.534) 0:01:16.922 *******
2026-01-06 00:43:52.921881 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:43:52.921905 | orchestrator |
2026-01-06 00:43:52.921917 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] ***********************
2026-01-06 00:43:52.921930 | orchestrator | Tuesday 06 January 2026 00:43:51 +0000 (0:00:00.530) 0:01:17.452 *******
2026-01-06 00:43:52.921943 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:43:52.921955 | orchestrator |
2026-01-06 00:43:52.921968 | orchestrator | TASK [Create list of VG/LV names] **********************************************
2026-01-06 00:43:52.921980 | orchestrator | Tuesday 06 January 2026 00:43:51 +0000 (0:00:00.151) 0:01:17.604 *******
2026-01-06 00:43:52.921993 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'vg_name': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:52.922007 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'vg_name': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:52.922085 | orchestrator |
2026-01-06 00:43:52.922099 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] **********************
2026-01-06 00:43:52.922111 | orchestrator | Tuesday 06 January 2026 00:43:52 +0000 (0:00:00.265) 0:01:17.869 *******
2026-01-06 00:43:52.922144 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:52.922156 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:52.922168 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:52.922180 | orchestrator |
2026-01-06 00:43:52.922192 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] *************************
2026-01-06 00:43:52.922204 | orchestrator | Tuesday 06 January 2026 00:43:52 +0000 (0:00:00.175) 0:01:18.045 *******
2026-01-06 00:43:52.922217 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:52.922229 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:52.922242 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:52.922254 | orchestrator |
2026-01-06 00:43:52.922266 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************
2026-01-06 00:43:52.922278 | orchestrator | Tuesday 06 January 2026 00:43:52 +0000 (0:00:00.165) 0:01:18.211 *******
2026-01-06 00:43:52.922290 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'})
2026-01-06 00:43:52.922302 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'})
2026-01-06 00:43:52.922314 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:43:52.922326 | orchestrator |
2026-01-06 00:43:52.922338 | orchestrator | TASK [Print LVM report data] ***************************************************
2026-01-06 00:43:52.922349 | orchestrator | Tuesday 06 January 2026 00:43:52 +0000 (0:00:00.154) 0:01:18.365 *******
2026-01-06 00:43:52.922361 | orchestrator | ok: [testbed-node-5] => {
2026-01-06 00:43:52.922372 | orchestrator |     "lvm_report": {
2026-01-06 00:43:52.922385 | orchestrator |         "lv": [
2026-01-06 00:43:52.922397 | orchestrator |             {
2026-01-06 00:43:52.922416 | orchestrator |                 "lv_name": "osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9",
2026-01-06 00:43:52.922458 | orchestrator |                 "vg_name": "ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9"
2026-01-06 00:43:52.922471 | orchestrator |             },
2026-01-06 00:43:52.922484 | orchestrator |             {
2026-01-06 00:43:52.922495 | orchestrator |                 "lv_name": "osd-block-588df21e-a0c0-57e7-8c43-2f77be274309",
2026-01-06 00:43:52.922508 | orchestrator |                 "vg_name": "ceph-588df21e-a0c0-57e7-8c43-2f77be274309"
2026-01-06 00:43:52.922520 | orchestrator |             }
2026-01-06 00:43:52.922532 | orchestrator |         ],
2026-01-06 00:43:52.922543 | orchestrator |         "pv": [
2026-01-06 00:43:52.922568 | orchestrator |             {
2026-01-06 00:43:52.922592 | orchestrator |                 "pv_name": "/dev/sdb",
2026-01-06 00:43:52.922603 | orchestrator |                 "vg_name": "ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9"
2026-01-06 00:43:52.922614 | orchestrator |             },
2026-01-06 00:43:52.922625 | orchestrator |             {
2026-01-06 00:43:52.922636 | orchestrator |                 "pv_name": "/dev/sdc",
2026-01-06 00:43:52.922647 | orchestrator |                 "vg_name": "ceph-588df21e-a0c0-57e7-8c43-2f77be274309"
2026-01-06 00:43:52.922658 | orchestrator |             }
2026-01-06 00:43:52.922669 | orchestrator |         ]
2026-01-06 00:43:52.922690 | orchestrator |     }
2026-01-06 00:43:52.922702 | orchestrator | }
2026-01-06 00:43:52.922713 | orchestrator |
2026-01-06 00:43:52.922724 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:43:52.922735 | orchestrator | testbed-node-3 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0
2026-01-06 00:43:52.922746 | orchestrator | testbed-node-4 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0
2026-01-06 00:43:52.922757 |
orchestrator | testbed-node-5 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2026-01-06 00:43:52.922769 | orchestrator | 2026-01-06 00:43:52.922781 | orchestrator | 2026-01-06 00:43:52.922792 | orchestrator | 2026-01-06 00:43:52.922803 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:43:52.922814 | orchestrator | Tuesday 06 January 2026 00:43:52 +0000 (0:00:00.162) 0:01:18.527 ******* 2026-01-06 00:43:52.922825 | orchestrator | =============================================================================== 2026-01-06 00:43:52.922836 | orchestrator | Create block VGs -------------------------------------------------------- 5.89s 2026-01-06 00:43:52.922847 | orchestrator | Create block LVs -------------------------------------------------------- 4.38s 2026-01-06 00:43:52.922859 | orchestrator | Gather DB+WAL VGs with total and available size in bytes ---------------- 1.99s 2026-01-06 00:43:52.922870 | orchestrator | Gather DB VGs with total and available size in bytes -------------------- 1.90s 2026-01-06 00:43:52.922881 | orchestrator | Get list of Ceph PVs with associated VGs -------------------------------- 1.71s 2026-01-06 00:43:52.922892 | orchestrator | Get list of Ceph LVs with associated VGs -------------------------------- 1.64s 2026-01-06 00:43:52.922903 | orchestrator | Gather WAL VGs with total and available size in bytes ------------------- 1.64s 2026-01-06 00:43:52.922914 | orchestrator | Add known partitions to the list of available block devices ------------- 1.55s 2026-01-06 00:43:52.922934 | orchestrator | Add known links to the list of available block devices ------------------ 1.32s 2026-01-06 00:43:53.468712 | orchestrator | Add known partitions to the list of available block devices ------------- 1.11s 2026-01-06 00:43:53.468860 | orchestrator | Print LVM report data --------------------------------------------------- 0.96s 2026-01-06 00:43:53.468886 | 
orchestrator | Print size needed for LVs on ceph_db_devices ---------------------------- 0.89s 2026-01-06 00:43:53.468907 | orchestrator | Add known partitions to the list of available block devices ------------- 0.89s 2026-01-06 00:43:53.468927 | orchestrator | Add known links to the list of available block devices ------------------ 0.87s 2026-01-06 00:43:53.468945 | orchestrator | Print 'Create DB VGs' --------------------------------------------------- 0.83s 2026-01-06 00:43:53.468963 | orchestrator | Create DB LVs for ceph_db_devices --------------------------------------- 0.83s 2026-01-06 00:43:53.468982 | orchestrator | Print number of OSDs wanted per DB+WAL VG ------------------------------- 0.83s 2026-01-06 00:43:53.469001 | orchestrator | Print 'Create WAL LVs for ceph_db_wal_devices' -------------------------- 0.83s 2026-01-06 00:43:53.469019 | orchestrator | Add known partitions to the list of available block devices ------------- 0.82s 2026-01-06 00:43:53.469038 | orchestrator | Fail if block LV defined in lvm_volumes is missing ---------------------- 0.82s 2026-01-06 00:44:06.043072 | orchestrator | 2026-01-06 00:44:06 | INFO  | Task df0a449e-59e8-4994-9512-3f60ac455b56 (facts) was prepared for execution. 2026-01-06 00:44:06.043220 | orchestrator | 2026-01-06 00:44:06 | INFO  | It takes a moment until task df0a449e-59e8-4994-9512-3f60ac455b56 (facts) has been started and output is visible here. 
2026-01-06 00:44:18.811379 | orchestrator | 2026-01-06 00:44:18.811535 | orchestrator | PLAY [Apply role facts] ******************************************************** 2026-01-06 00:44:18.811550 | orchestrator | 2026-01-06 00:44:18.811559 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2026-01-06 00:44:18.811568 | orchestrator | Tuesday 06 January 2026 00:44:10 +0000 (0:00:00.261) 0:00:00.261 ******* 2026-01-06 00:44:18.811601 | orchestrator | ok: [testbed-manager] 2026-01-06 00:44:18.811611 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:44:18.811619 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:44:18.811627 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:44:18.811635 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:44:18.811643 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:44:18.811651 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:44:18.811658 | orchestrator | 2026-01-06 00:44:18.811667 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2026-01-06 00:44:18.811676 | orchestrator | Tuesday 06 January 2026 00:44:11 +0000 (0:00:01.184) 0:00:01.446 ******* 2026-01-06 00:44:18.811684 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:44:18.811692 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:44:18.811701 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:44:18.811708 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:44:18.811716 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:44:18.811724 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:44:18.811732 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:44:18.811740 | orchestrator | 2026-01-06 00:44:18.811748 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2026-01-06 00:44:18.811756 | orchestrator | 2026-01-06 00:44:18.811764 | orchestrator | TASK [Gathers facts about hosts] 
*********************************************** 2026-01-06 00:44:18.811772 | orchestrator | Tuesday 06 January 2026 00:44:12 +0000 (0:00:01.341) 0:00:02.788 ******* 2026-01-06 00:44:18.811780 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:44:18.811788 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:44:18.811796 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:44:18.811804 | orchestrator | ok: [testbed-manager] 2026-01-06 00:44:18.811811 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:44:18.811819 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:44:18.811827 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:44:18.811835 | orchestrator | 2026-01-06 00:44:18.811843 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2026-01-06 00:44:18.811851 | orchestrator | 2026-01-06 00:44:18.811859 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2026-01-06 00:44:18.811867 | orchestrator | Tuesday 06 January 2026 00:44:17 +0000 (0:00:05.071) 0:00:07.859 ******* 2026-01-06 00:44:18.811875 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:44:18.811883 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:44:18.811891 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:44:18.811899 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:44:18.811907 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:44:18.811915 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:44:18.811924 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:44:18.811933 | orchestrator | 2026-01-06 00:44:18.811942 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:44:18.811952 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:44:18.811963 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 
ignored=0 2026-01-06 00:44:18.811972 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:44:18.811981 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:44:18.811990 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:44:18.811999 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:44:18.812015 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:44:18.812025 | orchestrator | 2026-01-06 00:44:18.812034 | orchestrator | 2026-01-06 00:44:18.812043 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:44:18.812053 | orchestrator | Tuesday 06 January 2026 00:44:18 +0000 (0:00:00.527) 0:00:08.386 ******* 2026-01-06 00:44:18.812062 | orchestrator | =============================================================================== 2026-01-06 00:44:18.812071 | orchestrator | Gathers facts about hosts ----------------------------------------------- 5.07s 2026-01-06 00:44:18.812081 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.34s 2026-01-06 00:44:18.812091 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.18s 2026-01-06 00:44:18.812100 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.53s 2026-01-06 00:44:31.299984 | orchestrator | 2026-01-06 00:44:31 | INFO  | Task a12429fd-445c-4630-abce-6f6b13ac7222 (frr) was prepared for execution. 2026-01-06 00:44:31.300115 | orchestrator | 2026-01-06 00:44:31 | INFO  | It takes a moment until task a12429fd-445c-4630-abce-6f6b13ac7222 (frr) has been started and output is visible here. 
2026-01-06 00:44:58.774634 | orchestrator | 2026-01-06 00:44:58.774765 | orchestrator | PLAY [Apply role frr] ********************************************************** 2026-01-06 00:44:58.774783 | orchestrator | 2026-01-06 00:44:58.774795 | orchestrator | TASK [osism.services.frr : Include distribution specific install tasks] ******** 2026-01-06 00:44:58.774831 | orchestrator | Tuesday 06 January 2026 00:44:35 +0000 (0:00:00.224) 0:00:00.224 ******* 2026-01-06 00:44:58.774844 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/frr/tasks/install-Debian-family.yml for testbed-manager 2026-01-06 00:44:58.774856 | orchestrator | 2026-01-06 00:44:58.774868 | orchestrator | TASK [osism.services.frr : Pin frr package version] **************************** 2026-01-06 00:44:58.774879 | orchestrator | Tuesday 06 January 2026 00:44:35 +0000 (0:00:00.220) 0:00:00.444 ******* 2026-01-06 00:44:58.774890 | orchestrator | changed: [testbed-manager] 2026-01-06 00:44:58.774902 | orchestrator | 2026-01-06 00:44:58.774913 | orchestrator | TASK [osism.services.frr : Install frr package] ******************************** 2026-01-06 00:44:58.774932 | orchestrator | Tuesday 06 January 2026 00:44:36 +0000 (0:00:01.184) 0:00:01.629 ******* 2026-01-06 00:44:58.774944 | orchestrator | changed: [testbed-manager] 2026-01-06 00:44:58.774955 | orchestrator | 2026-01-06 00:44:58.774966 | orchestrator | TASK [osism.services.frr : Copy file: /etc/frr/vtysh.conf] ********************* 2026-01-06 00:44:58.774977 | orchestrator | Tuesday 06 January 2026 00:44:47 +0000 (0:00:10.494) 0:00:12.123 ******* 2026-01-06 00:44:58.774988 | orchestrator | ok: [testbed-manager] 2026-01-06 00:44:58.775000 | orchestrator | 2026-01-06 00:44:58.775011 | orchestrator | TASK [osism.services.frr : Copy file: /etc/frr/daemons] ************************ 2026-01-06 00:44:58.775022 | orchestrator | Tuesday 06 January 2026 00:44:48 +0000 (0:00:01.125) 0:00:13.249 ******* 2026-01-06 
00:44:58.775033 | orchestrator | changed: [testbed-manager] 2026-01-06 00:44:58.775044 | orchestrator | 2026-01-06 00:44:58.775055 | orchestrator | TASK [osism.services.frr : Set _frr_uplinks fact] ****************************** 2026-01-06 00:44:58.775066 | orchestrator | Tuesday 06 January 2026 00:44:49 +0000 (0:00:00.980) 0:00:14.229 ******* 2026-01-06 00:44:58.775077 | orchestrator | ok: [testbed-manager] 2026-01-06 00:44:58.775088 | orchestrator | 2026-01-06 00:44:58.775099 | orchestrator | TASK [osism.services.frr : Check for frr.conf file in the configuration repository] *** 2026-01-06 00:44:58.775111 | orchestrator | Tuesday 06 January 2026 00:44:50 +0000 (0:00:01.234) 0:00:15.464 ******* 2026-01-06 00:44:58.775121 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:44:58.775133 | orchestrator | 2026-01-06 00:44:58.775144 | orchestrator | TASK [osism.services.frr : Copy frr.conf file from the configuration repository] *** 2026-01-06 00:44:58.775158 | orchestrator | Tuesday 06 January 2026 00:44:50 +0000 (0:00:00.127) 0:00:15.591 ******* 2026-01-06 00:44:58.775197 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:44:58.775211 | orchestrator | 2026-01-06 00:44:58.775223 | orchestrator | TASK [osism.services.frr : Copy default frr.conf file of type k3s_cilium] ****** 2026-01-06 00:44:58.775235 | orchestrator | Tuesday 06 January 2026 00:44:51 +0000 (0:00:00.186) 0:00:15.778 ******* 2026-01-06 00:44:58.775248 | orchestrator | changed: [testbed-manager] 2026-01-06 00:44:58.775261 | orchestrator | 2026-01-06 00:44:58.775273 | orchestrator | TASK [osism.services.frr : Set sysctl parameters] ****************************** 2026-01-06 00:44:58.775286 | orchestrator | Tuesday 06 January 2026 00:44:52 +0000 (0:00:01.011) 0:00:16.790 ******* 2026-01-06 00:44:58.775299 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.ip_forward', 'value': 1}) 2026-01-06 00:44:58.775311 | orchestrator | changed: [testbed-manager] => (item={'name': 
'net.ipv4.conf.all.send_redirects', 'value': 0}) 2026-01-06 00:44:58.775325 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.accept_redirects', 'value': 0}) 2026-01-06 00:44:58.775339 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.fib_multipath_hash_policy', 'value': 1}) 2026-01-06 00:44:58.775351 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.default.ignore_routes_with_linkdown', 'value': 1}) 2026-01-06 00:44:58.775365 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.rp_filter', 'value': 2}) 2026-01-06 00:44:58.775488 | orchestrator | 2026-01-06 00:44:58.775502 | orchestrator | TASK [osism.services.frr : Manage frr service] ********************************* 2026-01-06 00:44:58.775516 | orchestrator | Tuesday 06 January 2026 00:44:55 +0000 (0:00:03.304) 0:00:20.094 ******* 2026-01-06 00:44:58.775528 | orchestrator | ok: [testbed-manager] 2026-01-06 00:44:58.775539 | orchestrator | 2026-01-06 00:44:58.775550 | orchestrator | RUNNING HANDLER [osism.services.frr : Restart frr service] ********************* 2026-01-06 00:44:58.775561 | orchestrator | Tuesday 06 January 2026 00:44:56 +0000 (0:00:01.573) 0:00:21.668 ******* 2026-01-06 00:44:58.775572 | orchestrator | changed: [testbed-manager] 2026-01-06 00:44:58.775583 | orchestrator | 2026-01-06 00:44:58.775594 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:44:58.775605 | orchestrator | testbed-manager : ok=10  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-01-06 00:44:58.775616 | orchestrator | 2026-01-06 00:44:58.775627 | orchestrator | 2026-01-06 00:44:58.775638 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:44:58.775649 | orchestrator | Tuesday 06 January 2026 00:44:58 +0000 (0:00:01.485) 0:00:23.153 ******* 2026-01-06 00:44:58.775660 | 
orchestrator | =============================================================================== 2026-01-06 00:44:58.775671 | orchestrator | osism.services.frr : Install frr package ------------------------------- 10.49s 2026-01-06 00:44:58.775681 | orchestrator | osism.services.frr : Set sysctl parameters ------------------------------ 3.30s 2026-01-06 00:44:58.775692 | orchestrator | osism.services.frr : Manage frr service --------------------------------- 1.57s 2026-01-06 00:44:58.775703 | orchestrator | osism.services.frr : Restart frr service -------------------------------- 1.49s 2026-01-06 00:44:58.775714 | orchestrator | osism.services.frr : Set _frr_uplinks fact ------------------------------ 1.23s 2026-01-06 00:44:58.775745 | orchestrator | osism.services.frr : Pin frr package version ---------------------------- 1.18s 2026-01-06 00:44:58.775756 | orchestrator | osism.services.frr : Copy file: /etc/frr/vtysh.conf --------------------- 1.13s 2026-01-06 00:44:58.775767 | orchestrator | osism.services.frr : Copy default frr.conf file of type k3s_cilium ------ 1.01s 2026-01-06 00:44:58.775778 | orchestrator | osism.services.frr : Copy file: /etc/frr/daemons ------------------------ 0.98s 2026-01-06 00:44:58.775789 | orchestrator | osism.services.frr : Include distribution specific install tasks -------- 0.22s 2026-01-06 00:44:58.775800 | orchestrator | osism.services.frr : Copy frr.conf file from the configuration repository --- 0.19s 2026-01-06 00:44:58.775811 | orchestrator | osism.services.frr : Check for frr.conf file in the configuration repository --- 0.13s 2026-01-06 00:44:59.094220 | orchestrator | 2026-01-06 00:44:59.098226 | orchestrator | --> DEPLOY IN A NUTSHELL -- START -- Tue Jan 6 00:44:59 UTC 2026 2026-01-06 00:44:59.098304 | orchestrator | 2026-01-06 00:45:01.152946 | orchestrator | 2026-01-06 00:45:01 | INFO  | Collection nutshell is prepared for execution 2026-01-06 00:45:01.153068 | orchestrator | 2026-01-06 00:45:01 | INFO  | A [0] - 
dotfiles 2026-01-06 00:45:11.172084 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [0] - homer 2026-01-06 00:45:11.172208 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [0] - netdata 2026-01-06 00:45:11.172226 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [0] - openstackclient 2026-01-06 00:45:11.172239 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [0] - phpmyadmin 2026-01-06 00:45:11.172251 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [0] - common 2026-01-06 00:45:11.176102 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [1] -- loadbalancer 2026-01-06 00:45:11.176162 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [2] --- opensearch 2026-01-06 00:45:11.176183 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [2] --- mariadb-ng 2026-01-06 00:45:11.176201 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [3] ---- horizon 2026-01-06 00:45:11.176231 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [3] ---- keystone 2026-01-06 00:45:11.176776 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [4] ----- neutron 2026-01-06 00:45:11.176843 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [5] ------ wait-for-nova 2026-01-06 00:45:11.176851 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [6] ------- octavia 2026-01-06 00:45:11.178484 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [4] ----- barbican 2026-01-06 00:45:11.178520 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [4] ----- designate 2026-01-06 00:45:11.178922 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [4] ----- ironic 2026-01-06 00:45:11.179139 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [4] ----- placement 2026-01-06 00:45:11.179159 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [4] ----- magnum 2026-01-06 00:45:11.179472 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [1] -- openvswitch 2026-01-06 00:45:11.179491 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [2] --- ovn 2026-01-06 00:45:11.179917 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [1] -- memcached 2026-01-06 
00:45:11.179942 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [1] -- redis 2026-01-06 00:45:11.180122 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [1] -- rabbitmq-ng 2026-01-06 00:45:11.180386 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [0] - kubernetes 2026-01-06 00:45:11.183012 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [1] -- kubeconfig 2026-01-06 00:45:11.183036 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [1] -- copy-kubeconfig 2026-01-06 00:45:11.183291 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [0] - ceph 2026-01-06 00:45:11.185593 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [1] -- ceph-pools 2026-01-06 00:45:11.185603 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [2] --- copy-ceph-keys 2026-01-06 00:45:11.185609 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [3] ---- cephclient 2026-01-06 00:45:11.185750 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [4] ----- ceph-bootstrap-dashboard 2026-01-06 00:45:11.185758 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [4] ----- wait-for-keystone 2026-01-06 00:45:11.185801 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [5] ------ kolla-ceph-rgw 2026-01-06 00:45:11.186103 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [5] ------ glance 2026-01-06 00:45:11.186188 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [5] ------ cinder 2026-01-06 00:45:11.186200 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [5] ------ nova 2026-01-06 00:45:11.186518 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [4] ----- prometheus 2026-01-06 00:45:11.186612 | orchestrator | 2026-01-06 00:45:11 | INFO  | A [5] ------ grafana 2026-01-06 00:45:11.411430 | orchestrator | 2026-01-06 00:45:11 | INFO  | All tasks of the collection nutshell are prepared for execution 2026-01-06 00:45:11.411561 | orchestrator | 2026-01-06 00:45:11 | INFO  | Tasks are running in the background 2026-01-06 00:45:14.961782 | orchestrator | 2026-01-06 00:45:14 | INFO  | No task IDs specified, wait for all currently running 
tasks 2026-01-06 00:45:17.095636 | orchestrator | 2026-01-06 00:45:17 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:45:17.096126 | orchestrator | 2026-01-06 00:45:17 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:45:17.096737 | orchestrator | 2026-01-06 00:45:17 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:45:17.097450 | orchestrator | 2026-01-06 00:45:17 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:45:17.102657 | orchestrator | 2026-01-06 00:45:17 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:45:17.103090 | orchestrator | 2026-01-06 00:45:17 | INFO  | Task 652091f9-d3fd-4d03-88d9-b88a5a71e975 is in state STARTED 2026-01-06 00:45:17.103892 | orchestrator | 2026-01-06 00:45:17 | INFO  | Task 5e508601-1500-42f7-b04e-f4baca45c9e6 is in state STARTED 2026-01-06 00:45:17.103993 | orchestrator | 2026-01-06 00:45:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:45:20.170089 | orchestrator | 2026-01-06 00:45:20 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:45:20.170200 | orchestrator | 2026-01-06 00:45:20 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:45:20.170216 | orchestrator | 2026-01-06 00:45:20 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:45:20.170229 | orchestrator | 2026-01-06 00:45:20 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:45:20.170240 | orchestrator | 2026-01-06 00:45:20 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:45:20.170252 | orchestrator | 2026-01-06 00:45:20 | INFO  | Task 652091f9-d3fd-4d03-88d9-b88a5a71e975 is in state STARTED 2026-01-06 00:45:20.170263 | orchestrator | 2026-01-06 00:45:20 | INFO  | Task 
5e508601-1500-42f7-b04e-f4baca45c9e6 is in state STARTED 2026-01-06 00:45:20.170274 | orchestrator | 2026-01-06 00:45:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:45:38.848339 | orchestrator | 2026-01-06 00:45:38 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:45:38.848530 | orchestrator | 2026-01-06 00:45:38 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:45:38.848546 | orchestrator | 2026-01-06 00:45:38 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:45:38.848558 | orchestrator | 2026-01-06 00:45:38 | INFO  | Task 
96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:45:38.848569 | orchestrator | 2026-01-06 00:45:38 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:45:38.848580 | orchestrator | 2026-01-06 00:45:38 | INFO  | Task 652091f9-d3fd-4d03-88d9-b88a5a71e975 is in state STARTED 2026-01-06 00:45:38.848591 | orchestrator | 2026-01-06 00:45:38 | INFO  | Task 5e508601-1500-42f7-b04e-f4baca45c9e6 is in state STARTED 2026-01-06 00:45:38.848602 | orchestrator | 2026-01-06 00:45:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:45:41.895213 | orchestrator | 2026-01-06 00:45:41 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:45:41.895692 | orchestrator | 2026-01-06 00:45:41 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:45:41.899419 | orchestrator | 2026-01-06 00:45:41 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:45:41.899503 | orchestrator | 2026-01-06 00:45:41 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:45:41.899516 | orchestrator | 2026-01-06 00:45:41 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:45:41.902275 | orchestrator | 2026-01-06 00:45:41 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:45:41.906528 | orchestrator | 2026-01-06 00:45:41 | INFO  | Task 652091f9-d3fd-4d03-88d9-b88a5a71e975 is in state STARTED 2026-01-06 00:45:41.907528 | orchestrator | 2026-01-06 00:45:41 | INFO  | Task 5e508601-1500-42f7-b04e-f4baca45c9e6 is in state SUCCESS 2026-01-06 00:45:41.907579 | orchestrator | 2026-01-06 00:45:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:45:41.907817 | orchestrator | 2026-01-06 00:45:41.907836 | orchestrator | PLAY [Apply role geerlingguy.dotfiles] ***************************************** 2026-01-06 00:45:41.907846 | orchestrator | 
2026-01-06 00:45:41.907855 | orchestrator | TASK [geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally.] **** 2026-01-06 00:45:41.907865 | orchestrator | Tuesday 06 January 2026 00:45:25 +0000 (0:00:01.056) 0:00:01.056 ******* 2026-01-06 00:45:41.907874 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:45:41.907886 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:45:41.907895 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:45:41.907904 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:45:41.907913 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:45:41.907922 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:45:41.907958 | orchestrator | changed: [testbed-manager] 2026-01-06 00:45:41.907968 | orchestrator | 2026-01-06 00:45:41.907976 | orchestrator | TASK [geerlingguy.dotfiles : Ensure all configured dotfiles are links.] ******** 2026-01-06 00:45:41.907985 | orchestrator | Tuesday 06 January 2026 00:45:30 +0000 (0:00:04.371) 0:00:05.428 ******* 2026-01-06 00:45:41.907994 | orchestrator | ok: [testbed-node-2] => (item=.tmux.conf) 2026-01-06 00:45:41.908004 | orchestrator | ok: [testbed-node-4] => (item=.tmux.conf) 2026-01-06 00:45:41.908013 | orchestrator | ok: [testbed-node-1] => (item=.tmux.conf) 2026-01-06 00:45:41.908021 | orchestrator | ok: [testbed-node-0] => (item=.tmux.conf) 2026-01-06 00:45:41.908030 | orchestrator | ok: [testbed-node-3] => (item=.tmux.conf) 2026-01-06 00:45:41.908039 | orchestrator | ok: [testbed-node-5] => (item=.tmux.conf) 2026-01-06 00:45:41.908047 | orchestrator | ok: [testbed-manager] => (item=.tmux.conf) 2026-01-06 00:45:41.908056 | orchestrator | 2026-01-06 00:45:41.908064 | orchestrator | TASK [geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked.] 
*** 2026-01-06 00:45:41.908073 | orchestrator | Tuesday 06 January 2026 00:45:32 +0000 (0:00:01.893) 0:00:07.321 ******* 2026-01-06 00:45:41.908085 | orchestrator | ok: [testbed-manager] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-01-06 00:45:30.855619', 'end': '2026-01-06 00:45:31.323109', 'delta': '0:00:00.467490', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-01-06 00:45:41.908105 | orchestrator | ok: [testbed-node-0] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-01-06 00:45:30.990925', 'end': '2026-01-06 00:45:30.999706', 'delta': '0:00:00.008781', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-01-06 00:45:41.908411 | orchestrator | ok: [testbed-node-1] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access 
'/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-01-06 00:45:30.944899', 'end': '2026-01-06 00:45:30.953164', 'delta': '0:00:00.008265', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-01-06 00:45:41.908448 | orchestrator | ok: [testbed-node-2] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-01-06 00:45:30.927204', 'end': '2026-01-06 00:45:30.932078', 'delta': '0:00:00.004874', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-01-06 00:45:41.908475 | orchestrator | ok: [testbed-node-3] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-01-06 00:45:30.997987', 'end': '2026-01-06 00:45:31.007853', 'delta': '0:00:00.009866', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': 
{'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-01-06 00:45:41.908491 | orchestrator | ok: [testbed-node-4] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-01-06 00:45:30.927220', 'end': '2026-01-06 00:45:30.938944', 'delta': '0:00:00.011724', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-01-06 00:45:41.908502 | orchestrator | ok: [testbed-node-5] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-01-06 00:45:31.174134', 'end': '2026-01-06 00:45:31.183634', 'delta': '0:00:00.009500', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': 
["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-01-06 00:45:41.908513 | orchestrator | 2026-01-06 00:45:41.908524 | orchestrator | TASK [geerlingguy.dotfiles : Ensure parent folders of link dotfiles exist.] **** 2026-01-06 00:45:41.908535 | orchestrator | Tuesday 06 January 2026 00:45:33 +0000 (0:00:01.924) 0:00:09.246 ******* 2026-01-06 00:45:41.908546 | orchestrator | ok: [testbed-node-0] => (item=.tmux.conf) 2026-01-06 00:45:41.908556 | orchestrator | ok: [testbed-node-2] => (item=.tmux.conf) 2026-01-06 00:45:41.908566 | orchestrator | ok: [testbed-node-1] => (item=.tmux.conf) 2026-01-06 00:45:41.908576 | orchestrator | ok: [testbed-manager] => (item=.tmux.conf) 2026-01-06 00:45:41.908586 | orchestrator | ok: [testbed-node-3] => (item=.tmux.conf) 2026-01-06 00:45:41.908596 | orchestrator | ok: [testbed-node-4] => (item=.tmux.conf) 2026-01-06 00:45:41.908606 | orchestrator | ok: [testbed-node-5] => (item=.tmux.conf) 2026-01-06 00:45:41.908616 | orchestrator | 2026-01-06 00:45:41.908626 | orchestrator | TASK [geerlingguy.dotfiles : Link dotfiles into home folder.] 
****************** 2026-01-06 00:45:41.908637 | orchestrator | Tuesday 06 January 2026 00:45:36 +0000 (0:00:02.148) 0:00:11.395 ******* 2026-01-06 00:45:41.908658 | orchestrator | changed: [testbed-manager] => (item=.tmux.conf) 2026-01-06 00:45:41.908669 | orchestrator | changed: [testbed-node-0] => (item=.tmux.conf) 2026-01-06 00:45:41.908679 | orchestrator | changed: [testbed-node-1] => (item=.tmux.conf) 2026-01-06 00:45:41.908689 | orchestrator | changed: [testbed-node-2] => (item=.tmux.conf) 2026-01-06 00:45:41.908699 | orchestrator | changed: [testbed-node-3] => (item=.tmux.conf) 2026-01-06 00:45:41.908710 | orchestrator | changed: [testbed-node-4] => (item=.tmux.conf) 2026-01-06 00:45:41.908721 | orchestrator | changed: [testbed-node-5] => (item=.tmux.conf) 2026-01-06 00:45:41.908731 | orchestrator | 2026-01-06 00:45:41.908741 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:45:41.908759 | orchestrator | testbed-manager : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:45:41.908772 | orchestrator | testbed-node-0 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:45:41.908781 | orchestrator | testbed-node-1 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:45:41.908789 | orchestrator | testbed-node-2 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:45:41.908799 | orchestrator | testbed-node-3 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:45:41.908807 | orchestrator | testbed-node-4 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:45:41.908816 | orchestrator | testbed-node-5 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:45:41.908825 | orchestrator | 2026-01-06 00:45:41.908834 | orchestrator | 2026-01-06 00:45:41.908842 | orchestrator | TASKS 
RECAP ******************************************************************** 2026-01-06 00:45:41.908851 | orchestrator | Tuesday 06 January 2026 00:45:39 +0000 (0:00:03.439) 0:00:14.834 ******* 2026-01-06 00:45:41.908860 | orchestrator | =============================================================================== 2026-01-06 00:45:41.908869 | orchestrator | geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally. ---- 4.37s 2026-01-06 00:45:41.908877 | orchestrator | geerlingguy.dotfiles : Link dotfiles into home folder. ------------------ 3.44s 2026-01-06 00:45:41.908886 | orchestrator | geerlingguy.dotfiles : Ensure parent folders of link dotfiles exist. ---- 2.15s 2026-01-06 00:45:41.908899 | orchestrator | geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked. --- 1.92s 2026-01-06 00:45:41.908909 | orchestrator | geerlingguy.dotfiles : Ensure all configured dotfiles are links. -------- 1.89s 2026-01-06 00:45:44.986807 | orchestrator | 2026-01-06 00:45:44 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:45:44.986920 | orchestrator | 2026-01-06 00:45:44 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:45:44.987339 | orchestrator | 2026-01-06 00:45:44 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:45:44.989633 | orchestrator | 2026-01-06 00:45:44 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:45:44.991432 | orchestrator | 2026-01-06 00:45:44 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:45:44.993825 | orchestrator | 2026-01-06 00:45:44 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:45:44.994154 | orchestrator | 2026-01-06 00:45:44 | INFO  | Task 652091f9-d3fd-4d03-88d9-b88a5a71e975 is in state STARTED 2026-01-06 00:45:44.994181 | orchestrator | 2026-01-06 00:45:44 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 00:45:48.054320 | orchestrator | 2026-01-06 00:45:48 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:45:48.054503 | orchestrator | 2026-01-06 00:45:48 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:45:48.054522 | orchestrator | 2026-01-06 00:45:48 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:45:48.054535 | orchestrator | 2026-01-06 00:45:48 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:45:48.054546 | orchestrator | 2026-01-06 00:45:48 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:45:48.054557 | orchestrator | 2026-01-06 00:45:48 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:45:48.054568 | orchestrator | 2026-01-06 00:45:48 | INFO  | Task 652091f9-d3fd-4d03-88d9-b88a5a71e975 is in state STARTED 2026-01-06 00:45:48.054579 | orchestrator | 2026-01-06 00:45:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:45:51.095337 | orchestrator | 2026-01-06 00:45:51 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:45:51.095521 | orchestrator | 2026-01-06 00:45:51 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:45:51.097573 | orchestrator | 2026-01-06 00:45:51 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:45:51.097962 | orchestrator | 2026-01-06 00:45:51 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:45:51.099322 | orchestrator | 2026-01-06 00:45:51 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:45:51.100581 | orchestrator | 2026-01-06 00:45:51 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:45:51.102139 | orchestrator | 2026-01-06 00:45:51 | INFO  | Task 
652091f9-d3fd-4d03-88d9-b88a5a71e975 is in state STARTED 2026-01-06 00:45:51.102168 | orchestrator | 2026-01-06 00:45:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:45:54.158933 | orchestrator | 2026-01-06 00:45:54 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:45:54.159033 | orchestrator | 2026-01-06 00:45:54 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:45:54.180773 | orchestrator | 2026-01-06 00:45:54 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:45:54.180880 | orchestrator | 2026-01-06 00:45:54 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:45:54.180897 | orchestrator | 2026-01-06 00:45:54 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:45:54.180913 | orchestrator | 2026-01-06 00:45:54 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:45:54.180927 | orchestrator | 2026-01-06 00:45:54 | INFO  | Task 652091f9-d3fd-4d03-88d9-b88a5a71e975 is in state STARTED 2026-01-06 00:45:54.184338 | orchestrator | 2026-01-06 00:45:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:45:57.289281 | orchestrator | 2026-01-06 00:45:57 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:45:57.290194 | orchestrator | 2026-01-06 00:45:57 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:45:57.293091 | orchestrator | 2026-01-06 00:45:57 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:45:57.295288 | orchestrator | 2026-01-06 00:45:57 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:45:57.296591 | orchestrator | 2026-01-06 00:45:57 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:45:57.300557 | orchestrator | 2026-01-06 00:45:57 | INFO  | Task 
6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:45:57.307478 | orchestrator | 2026-01-06 00:45:57 | INFO  | Task 652091f9-d3fd-4d03-88d9-b88a5a71e975 is in state STARTED 2026-01-06 00:45:57.309266 | orchestrator | 2026-01-06 00:45:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:46:00.524982 | orchestrator | 2026-01-06 00:46:00 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:46:00.525167 | orchestrator | 2026-01-06 00:46:00 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:46:00.525201 | orchestrator | 2026-01-06 00:46:00 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:46:00.525222 | orchestrator | 2026-01-06 00:46:00 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:46:00.525243 | orchestrator | 2026-01-06 00:46:00 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:46:00.525263 | orchestrator | 2026-01-06 00:46:00 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:46:00.525282 | orchestrator | 2026-01-06 00:46:00 | INFO  | Task 652091f9-d3fd-4d03-88d9-b88a5a71e975 is in state STARTED 2026-01-06 00:46:00.525301 | orchestrator | 2026-01-06 00:46:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:46:03.827590 | orchestrator | 2026-01-06 00:46:03 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:46:03.827733 | orchestrator | 2026-01-06 00:46:03 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:46:03.827762 | orchestrator | 2026-01-06 00:46:03 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:46:03.827784 | orchestrator | 2026-01-06 00:46:03 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:46:03.827804 | orchestrator | 2026-01-06 00:46:03 | INFO  | Task 
8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:46:03.827823 | orchestrator | 2026-01-06 00:46:03 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:46:03.827842 | orchestrator | 2026-01-06 00:46:03 | INFO  | Task 652091f9-d3fd-4d03-88d9-b88a5a71e975 is in state SUCCESS 2026-01-06 00:46:03.827862 | orchestrator | 2026-01-06 00:46:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:46:06.757655 | orchestrator | 2026-01-06 00:46:06 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:46:06.757768 | orchestrator | 2026-01-06 00:46:06 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:46:06.757784 | orchestrator | 2026-01-06 00:46:06 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:46:06.757797 | orchestrator | 2026-01-06 00:46:06 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:46:06.759023 | orchestrator | 2026-01-06 00:46:06 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:46:06.762322 | orchestrator | 2026-01-06 00:46:06 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:46:06.762375 | orchestrator | 2026-01-06 00:46:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:46:09.826262 | orchestrator | 2026-01-06 00:46:09 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:46:09.826547 | orchestrator | 2026-01-06 00:46:09 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:46:09.827832 | orchestrator | 2026-01-06 00:46:09 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:46:09.828759 | orchestrator | 2026-01-06 00:46:09 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:46:09.831462 | orchestrator | 2026-01-06 00:46:09 | INFO  | Task 
8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:46:09.833886 | orchestrator | 2026-01-06 00:46:09 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:46:09.833977 | orchestrator | 2026-01-06 00:46:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:46:12.900008 | orchestrator | 2026-01-06 00:46:12 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:46:12.901891 | orchestrator | 2026-01-06 00:46:12 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state STARTED 2026-01-06 00:46:12.902453 | orchestrator | 2026-01-06 00:46:12 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:46:12.903478 | orchestrator | 2026-01-06 00:46:12 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:46:12.909127 | orchestrator | 2026-01-06 00:46:12 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:46:12.909190 | orchestrator | 2026-01-06 00:46:12 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:46:12.909204 | orchestrator | 2026-01-06 00:46:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:46:15.965911 | orchestrator | 2026-01-06 00:46:15 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:46:15.966109 | orchestrator | 2026-01-06 00:46:15 | INFO  | Task accfa1b4-591a-47e2-9bc8-7b4c77b55df8 is in state SUCCESS 2026-01-06 00:46:15.966562 | orchestrator | 2026-01-06 00:46:15 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:46:15.968586 | orchestrator | 2026-01-06 00:46:15 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:46:15.972774 | orchestrator | 2026-01-06 00:46:15 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:46:15.972862 | orchestrator | 2026-01-06 00:46:15 | INFO  | Task 
6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:46:15.972878 | orchestrator | 2026-01-06 00:46:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:46:19.023390 | orchestrator | 2026-01-06 00:46:19 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:46:19.025143 | orchestrator | 2026-01-06 00:46:19 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:46:19.027750 | orchestrator | 2026-01-06 00:46:19 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:46:19.029387 | orchestrator | 2026-01-06 00:46:19 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:46:19.031980 | orchestrator | 2026-01-06 00:46:19 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:46:19.032877 | orchestrator | 2026-01-06 00:46:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:46:22.084576 | orchestrator | 2026-01-06 00:46:22 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:46:22.086966 | orchestrator | 2026-01-06 00:46:22 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:46:22.088877 | orchestrator | 2026-01-06 00:46:22 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:46:22.091974 | orchestrator | 2026-01-06 00:46:22 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:46:22.092620 | orchestrator | 2026-01-06 00:46:22 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:46:22.094748 | orchestrator | 2026-01-06 00:46:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:46:25.156515 | orchestrator | 2026-01-06 00:46:25 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:46:25.156642 | orchestrator | 2026-01-06 00:46:25 | INFO  | Task 
a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:46:25.157058 | orchestrator | 2026-01-06 00:46:25 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:46:25.162312 | orchestrator | 2026-01-06 00:46:25 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:46:25.162416 | orchestrator | 2026-01-06 00:46:25 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:46:25.162430 | orchestrator | 2026-01-06 00:46:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:46:28.224395 | orchestrator | 2026-01-06 00:46:28 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:46:28.224498 | orchestrator | 2026-01-06 00:46:28 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:46:28.224531 | orchestrator | 2026-01-06 00:46:28 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:46:28.227400 | orchestrator | 2026-01-06 00:46:28 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:46:28.227419 | orchestrator | 2026-01-06 00:46:28 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED 2026-01-06 00:46:28.227429 | orchestrator | 2026-01-06 00:46:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:46:31.335877 | orchestrator | 2026-01-06 00:46:31 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:46:31.335959 | orchestrator | 2026-01-06 00:46:31 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:46:31.339033 | orchestrator | 2026-01-06 00:46:31 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:46:31.348964 | orchestrator | 2026-01-06 00:46:31 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED 2026-01-06 00:46:31.363261 | orchestrator | 2026-01-06 00:46:31 | INFO  | Task 
6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED
2026-01-06 00:46:31.363420 | orchestrator | 2026-01-06 00:46:31 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:46:35.208072 | orchestrator | 2026-01-06 00:46:35 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:46:35.211810 | orchestrator | 2026-01-06 00:46:35 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:46:35.211904 | orchestrator | 2026-01-06 00:46:35 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:46:35.212516 | orchestrator | 2026-01-06 00:46:35 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED
2026-01-06 00:46:35.218228 | orchestrator | 2026-01-06 00:46:35 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED
2026-01-06 00:46:35.218303 | orchestrator | 2026-01-06 00:46:35 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:46:38.276857 | orchestrator | 2026-01-06 00:46:38 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:46:38.280836 | orchestrator | 2026-01-06 00:46:38 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:46:38.283142 | orchestrator | 2026-01-06 00:46:38 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:46:38.284972 | orchestrator | 2026-01-06 00:46:38 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED
2026-01-06 00:46:38.287168 | orchestrator | 2026-01-06 00:46:38 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED
2026-01-06 00:46:38.287200 | orchestrator | 2026-01-06 00:46:38 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:46:41.393606 | orchestrator | 2026-01-06 00:46:41 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:46:41.393705 | orchestrator | 2026-01-06 00:46:41 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:46:41.394109 | orchestrator | 2026-01-06 00:46:41 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:46:41.394740 | orchestrator | 2026-01-06 00:46:41 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED
2026-01-06 00:46:41.395553 | orchestrator | 2026-01-06 00:46:41 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED
2026-01-06 00:46:41.395584 | orchestrator | 2026-01-06 00:46:41 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:46:44.443113 | orchestrator | 2026-01-06 00:46:44 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:46:44.444031 | orchestrator | 2026-01-06 00:46:44 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:46:44.446858 | orchestrator | 2026-01-06 00:46:44 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:46:44.448362 | orchestrator | 2026-01-06 00:46:44 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED
2026-01-06 00:46:44.449688 | orchestrator | 2026-01-06 00:46:44 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED
2026-01-06 00:46:44.449723 | orchestrator | 2026-01-06 00:46:44 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:46:47.486379 | orchestrator | 2026-01-06 00:46:47 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:46:47.490004 | orchestrator | 2026-01-06 00:46:47 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:46:47.490167 | orchestrator | 2026-01-06 00:46:47 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:46:47.490949 | orchestrator | 2026-01-06 00:46:47 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state STARTED
2026-01-06 00:46:47.493612 | orchestrator | 2026-01-06 00:46:47 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED
2026-01-06 00:46:47.493668 | orchestrator | 2026-01-06 00:46:47 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:46:50.558494 | orchestrator | 2026-01-06 00:46:50 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:46:50.558584 | orchestrator | 2026-01-06 00:46:50 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:46:50.558590 | orchestrator | 2026-01-06 00:46:50 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:46:50.559086 | orchestrator | 2026-01-06 00:46:50 | INFO  | Task 8e2e6c35-7aa6-4348-b794-09f236ff683b is in state SUCCESS
2026-01-06 00:46:50.560150 | orchestrator |
2026-01-06 00:46:50.560217 | orchestrator |
2026-01-06 00:46:50.560225 | orchestrator | PLAY [Apply role homer] ********************************************************
2026-01-06 00:46:50.560230 | orchestrator |
2026-01-06 00:46:50.560235 | orchestrator | TASK [osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards] ***
2026-01-06 00:46:50.560242 | orchestrator | Tuesday 06 January 2026 00:45:24 +0000 (0:00:00.598) 0:00:00.598 *******
2026-01-06 00:46:50.560248 | orchestrator | ok: [testbed-manager] => {
2026-01-06 00:46:50.560257 | orchestrator |  "msg": "The support for the homer_url_kibana has been removed. Please use the homer_url_opensearch_dashboards parameter."
2026-01-06 00:46:50.560265 | orchestrator | }
2026-01-06 00:46:50.560271 | orchestrator |
2026-01-06 00:46:50.560277 | orchestrator | TASK [osism.services.homer : Create traefik external network] ******************
2026-01-06 00:46:50.560283 | orchestrator | Tuesday 06 January 2026 00:45:24 +0000 (0:00:00.459) 0:00:01.058 *******
2026-01-06 00:46:50.560289 | orchestrator | ok: [testbed-manager]
2026-01-06 00:46:50.560296 | orchestrator |
2026-01-06 00:46:50.560302 | orchestrator | TASK [osism.services.homer : Create required directories] **********************
2026-01-06 00:46:50.560341 | orchestrator | Tuesday 06 January 2026 00:45:26 +0000 (0:00:01.596) 0:00:02.654 *******
2026-01-06 00:46:50.560348 | orchestrator | changed: [testbed-manager] => (item=/opt/homer/configuration)
2026-01-06 00:46:50.560355 | orchestrator | ok: [testbed-manager] => (item=/opt/homer)
2026-01-06 00:46:50.560361 | orchestrator |
2026-01-06 00:46:50.560367 | orchestrator | TASK [osism.services.homer : Copy config.yml configuration file] ***************
2026-01-06 00:46:50.560372 | orchestrator | Tuesday 06 January 2026 00:45:29 +0000 (0:00:02.834) 0:00:05.488 *******
2026-01-06 00:46:50.560378 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:50.560385 | orchestrator |
2026-01-06 00:46:50.560391 | orchestrator | TASK [osism.services.homer : Copy docker-compose.yml file] *********************
2026-01-06 00:46:50.560397 | orchestrator | Tuesday 06 January 2026 00:45:32 +0000 (0:00:03.446) 0:00:08.935 *******
2026-01-06 00:46:50.560403 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:50.560409 | orchestrator |
2026-01-06 00:46:50.560415 | orchestrator | TASK [osism.services.homer : Manage homer service] *****************************
2026-01-06 00:46:50.560421 | orchestrator | Tuesday 06 January 2026 00:45:34 +0000 (0:00:01.720) 0:00:10.655 *******
2026-01-06 00:46:50.560428 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage homer service (10 retries left).
2026-01-06 00:46:50.560435 | orchestrator | ok: [testbed-manager]
2026-01-06 00:46:50.560440 | orchestrator |
2026-01-06 00:46:50.560444 | orchestrator | RUNNING HANDLER [osism.services.homer : Restart homer service] *****************
2026-01-06 00:46:50.560448 | orchestrator | Tuesday 06 January 2026 00:45:59 +0000 (0:00:25.320) 0:00:35.976 *******
2026-01-06 00:46:50.560452 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:50.560456 | orchestrator |
2026-01-06 00:46:50.560460 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:46:50.560465 | orchestrator | testbed-manager : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:46:50.560470 | orchestrator |
2026-01-06 00:46:50.560475 | orchestrator |
2026-01-06 00:46:50.560479 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:46:50.560483 | orchestrator | Tuesday 06 January 2026 00:46:02 +0000 (0:00:02.600) 0:00:38.576 *******
2026-01-06 00:46:50.560487 | orchestrator | ===============================================================================
2026-01-06 00:46:50.560490 | orchestrator | osism.services.homer : Manage homer service ---------------------------- 25.32s
2026-01-06 00:46:50.560494 | orchestrator | osism.services.homer : Copy config.yml configuration file --------------- 3.45s
2026-01-06 00:46:50.560498 | orchestrator | osism.services.homer : Create required directories ---------------------- 2.83s
2026-01-06 00:46:50.560502 | orchestrator | osism.services.homer : Restart homer service ---------------------------- 2.60s
2026-01-06 00:46:50.560505 | orchestrator | osism.services.homer : Copy docker-compose.yml file --------------------- 1.72s
2026-01-06 00:46:50.560509 | orchestrator | osism.services.homer : Create traefik external network ------------------ 1.60s
2026-01-06 00:46:50.560520 | orchestrator | osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards --- 0.46s
2026-01-06 00:46:50.560524 | orchestrator |
2026-01-06 00:46:50.560527 | orchestrator |
2026-01-06 00:46:50.560531 | orchestrator | PLAY [Apply role openstackclient] **********************************************
2026-01-06 00:46:50.560535 | orchestrator |
2026-01-06 00:46:50.560539 | orchestrator | TASK [osism.services.openstackclient : Include tasks] **************************
2026-01-06 00:46:50.560554 | orchestrator | Tuesday 06 January 2026 00:45:26 +0000 (0:00:00.708) 0:00:00.708 *******
2026-01-06 00:46:50.560558 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/openstackclient/tasks/container-Debian-family.yml for testbed-manager
2026-01-06 00:46:50.560563 | orchestrator |
2026-01-06 00:46:50.560567 | orchestrator | TASK [osism.services.openstackclient : Create required directories] ************
2026-01-06 00:46:50.560571 | orchestrator | Tuesday 06 January 2026 00:45:26 +0000 (0:00:00.601) 0:00:01.309 *******
2026-01-06 00:46:50.560575 | orchestrator | changed: [testbed-manager] => (item=/opt/configuration/environments/openstack)
2026-01-06 00:46:50.560578 | orchestrator | changed: [testbed-manager] => (item=/opt/openstackclient/data)
2026-01-06 00:46:50.560582 | orchestrator | ok: [testbed-manager] => (item=/opt/openstackclient)
2026-01-06 00:46:50.560586 | orchestrator |
2026-01-06 00:46:50.560590 | orchestrator | TASK [osism.services.openstackclient : Copy docker-compose.yml file] ***********
2026-01-06 00:46:50.560594 | orchestrator | Tuesday 06 January 2026 00:45:29 +0000 (0:00:02.246) 0:00:03.556 *******
2026-01-06 00:46:50.560597 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:50.560601 | orchestrator |
2026-01-06 00:46:50.560605 | orchestrator | TASK [osism.services.openstackclient : Manage openstackclient service] *********
2026-01-06 00:46:50.560609 | orchestrator | Tuesday 06 January 2026 00:45:32 +0000 (0:00:03.759) 0:00:07.315 *******
2026-01-06 00:46:50.560623 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage openstackclient service (10 retries left).
2026-01-06 00:46:50.560627 | orchestrator | ok: [testbed-manager]
2026-01-06 00:46:50.560631 | orchestrator |
2026-01-06 00:46:50.560635 | orchestrator | TASK [osism.services.openstackclient : Copy openstack wrapper script] **********
2026-01-06 00:46:50.560639 | orchestrator | Tuesday 06 January 2026 00:46:05 +0000 (0:00:33.062) 0:00:40.377 *******
2026-01-06 00:46:50.560643 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:50.560646 | orchestrator |
2026-01-06 00:46:50.560650 | orchestrator | TASK [osism.services.openstackclient : Remove ospurge wrapper script] **********
2026-01-06 00:46:50.560654 | orchestrator | Tuesday 06 January 2026 00:46:07 +0000 (0:00:01.460) 0:00:41.837 *******
2026-01-06 00:46:50.560658 | orchestrator | ok: [testbed-manager]
2026-01-06 00:46:50.560661 | orchestrator |
2026-01-06 00:46:50.560665 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Restart openstackclient service] ***
2026-01-06 00:46:50.560669 | orchestrator | Tuesday 06 January 2026 00:46:08 +0000 (0:00:01.232) 0:00:43.070 *******
2026-01-06 00:46:50.560673 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:50.560677 | orchestrator |
2026-01-06 00:46:50.560680 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Ensure that all containers are up] ***
2026-01-06 00:46:50.560684 | orchestrator | Tuesday 06 January 2026 00:46:10 +0000 (0:00:01.856) 0:00:44.927 *******
2026-01-06 00:46:50.560688 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:50.560692 | orchestrator |
2026-01-06 00:46:50.560695 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Wait for an healthy service] ***
2026-01-06 00:46:50.560699 | orchestrator | Tuesday 06 January 2026 00:46:11 +0000 (0:00:01.047) 0:00:45.974 *******
2026-01-06 00:46:50.560703 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:50.560707 | orchestrator |
2026-01-06 00:46:50.560711 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Copy bash completion script] ***
2026-01-06 00:46:50.560714 | orchestrator | Tuesday 06 January 2026 00:46:12 +0000 (0:00:00.680) 0:00:46.655 *******
2026-01-06 00:46:50.560718 | orchestrator | ok: [testbed-manager]
2026-01-06 00:46:50.560722 | orchestrator |
2026-01-06 00:46:50.560726 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:46:50.560734 | orchestrator | testbed-manager : ok=10  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:46:50.560737 | orchestrator |
2026-01-06 00:46:50.560742 | orchestrator |
2026-01-06 00:46:50.560747 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:46:50.560751 | orchestrator | Tuesday 06 January 2026 00:46:12 +0000 (0:00:00.363) 0:00:47.019 *******
2026-01-06 00:46:50.560756 | orchestrator | ===============================================================================
2026-01-06 00:46:50.560760 | orchestrator | osism.services.openstackclient : Manage openstackclient service -------- 33.06s
2026-01-06 00:46:50.560765 | orchestrator | osism.services.openstackclient : Copy docker-compose.yml file ----------- 3.76s
2026-01-06 00:46:50.560769 | orchestrator | osism.services.openstackclient : Create required directories ------------ 2.25s
2026-01-06 00:46:50.560774 | orchestrator | osism.services.openstackclient : Restart openstackclient service -------- 1.86s
2026-01-06 00:46:50.560778 | orchestrator | osism.services.openstackclient : Copy openstack wrapper script ---------- 1.46s
2026-01-06 00:46:50.560783 | orchestrator | osism.services.openstackclient : Remove ospurge wrapper script ---------- 1.23s
2026-01-06 00:46:50.560787 | orchestrator | osism.services.openstackclient : Ensure that all containers are up ------ 1.05s
2026-01-06 00:46:50.560791 | orchestrator | osism.services.openstackclient : Wait for an healthy service ------------ 0.68s
2026-01-06 00:46:50.560796 | orchestrator | osism.services.openstackclient : Include tasks -------------------------- 0.60s
2026-01-06 00:46:50.560800 | orchestrator | osism.services.openstackclient : Copy bash completion script ------------ 0.36s
2026-01-06 00:46:50.560804 | orchestrator |
2026-01-06 00:46:50.560809 | orchestrator |
2026-01-06 00:46:50.560813 | orchestrator | PLAY [Apply role phpmyadmin] ***************************************************
2026-01-06 00:46:50.560818 | orchestrator |
2026-01-06 00:46:50.560822 | orchestrator | TASK [osism.services.phpmyadmin : Create traefik external network] *************
2026-01-06 00:46:50.560826 | orchestrator | Tuesday 06 January 2026 00:45:44 +0000 (0:00:00.254) 0:00:00.254 *******
2026-01-06 00:46:50.560831 | orchestrator | ok: [testbed-manager]
2026-01-06 00:46:50.560835 | orchestrator |
2026-01-06 00:46:50.560840 | orchestrator | TASK [osism.services.phpmyadmin : Create required directories] *****************
2026-01-06 00:46:50.560844 | orchestrator | Tuesday 06 January 2026 00:45:45 +0000 (0:00:00.914) 0:00:01.168 *******
2026-01-06 00:46:50.560849 | orchestrator | changed: [testbed-manager] => (item=/opt/phpmyadmin)
2026-01-06 00:46:50.560853 | orchestrator |
2026-01-06 00:46:50.560861 | orchestrator | TASK [osism.services.phpmyadmin : Copy docker-compose.yml file] ****************
2026-01-06 00:46:50.560865 | orchestrator | Tuesday 06 January 2026 00:45:46 +0000 (0:00:01.086) 0:00:02.255 *******
2026-01-06 00:46:50.560870 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:50.560874 | orchestrator |
2026-01-06 00:46:50.560878 | orchestrator | TASK [osism.services.phpmyadmin : Manage phpmyadmin service] *******************
2026-01-06 00:46:50.560883 | orchestrator | Tuesday 06 January 2026 00:45:48 +0000 (0:00:01.541) 0:00:03.796 *******
2026-01-06 00:46:50.560887 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage phpmyadmin service (10 retries left).
2026-01-06 00:46:50.560892 | orchestrator | ok: [testbed-manager]
2026-01-06 00:46:50.560896 | orchestrator |
2026-01-06 00:46:50.560900 | orchestrator | RUNNING HANDLER [osism.services.phpmyadmin : Restart phpmyadmin service] *******
2026-01-06 00:46:50.560905 | orchestrator | Tuesday 06 January 2026 00:46:39 +0000 (0:00:51.471) 0:00:55.267 *******
2026-01-06 00:46:50.560909 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:50.560914 | orchestrator |
2026-01-06 00:46:50.560918 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:46:50.560922 | orchestrator | testbed-manager : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:46:50.560927 | orchestrator |
2026-01-06 00:46:50.560931 | orchestrator |
2026-01-06 00:46:50.560936 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:46:50.560946 | orchestrator | Tuesday 06 January 2026 00:46:47 +0000 (0:00:07.922) 0:01:03.190 *******
2026-01-06 00:46:50.560951 | orchestrator | ===============================================================================
2026-01-06 00:46:50.560955 | orchestrator | osism.services.phpmyadmin : Manage phpmyadmin service ------------------ 51.47s
2026-01-06 00:46:50.560960 | orchestrator | osism.services.phpmyadmin : Restart phpmyadmin service ------------------ 7.92s
2026-01-06 00:46:50.560964 | orchestrator | osism.services.phpmyadmin : Copy docker-compose.yml file ---------------- 1.54s
2026-01-06 00:46:50.560969 | orchestrator | osism.services.phpmyadmin : Create required directories ----------------- 1.09s
2026-01-06 00:46:50.560973 | orchestrator | osism.services.phpmyadmin : Create traefik external network ------------- 0.91s
2026-01-06 00:46:50.562144 | orchestrator | 2026-01-06 00:46:50 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state STARTED
2026-01-06 00:46:50.562171 | orchestrator | 2026-01-06 00:46:50 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:46:53.613586 | orchestrator | 2026-01-06 00:46:53 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:46:53.614514 | orchestrator | 2026-01-06 00:46:53 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:46:53.616638 | orchestrator | 2026-01-06 00:46:53 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:46:53.619645 | orchestrator |
2026-01-06 00:46:53.619695 | orchestrator |
2026-01-06 00:46:53.619703 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-01-06 00:46:53.619712 | orchestrator |
2026-01-06 00:46:53.619718 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-01-06 00:46:53.619726 | orchestrator | Tuesday 06 January 2026 00:45:27 +0000 (0:00:00.675) 0:00:00.675 *******
2026-01-06 00:46:53.619733 | orchestrator | changed: [testbed-manager] => (item=enable_netdata_True)
2026-01-06 00:46:53.619741 | orchestrator | changed: [testbed-node-0] => (item=enable_netdata_True)
2026-01-06 00:46:53.619748 | orchestrator | changed: [testbed-node-2] => (item=enable_netdata_True)
2026-01-06 00:46:53.619754 | orchestrator | changed: [testbed-node-1] => (item=enable_netdata_True)
2026-01-06 00:46:53.619761 | orchestrator | changed: [testbed-node-3] => (item=enable_netdata_True)
2026-01-06 00:46:53.619768 | orchestrator | changed: [testbed-node-4] => (item=enable_netdata_True)
2026-01-06 00:46:53.619775 | orchestrator | changed: [testbed-node-5] => (item=enable_netdata_True)
2026-01-06 00:46:53.619781 | orchestrator |
2026-01-06 00:46:53.619788 | orchestrator | PLAY [Apply role netdata] ******************************************************
2026-01-06 00:46:53.619794 | orchestrator |
2026-01-06 00:46:53.619801 | orchestrator | TASK [osism.services.netdata : Include distribution specific install tasks] ****
2026-01-06 00:46:53.619807 | orchestrator | Tuesday 06 January 2026 00:45:27 +0000 (0:00:00.854) 0:00:01.529 *******
2026-01-06 00:46:53.619832 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:46:53.619842 | orchestrator |
2026-01-06 00:46:53.619849 | orchestrator | TASK [osism.services.netdata : Remove old architecture-dependent repository] ***
2026-01-06 00:46:53.619856 | orchestrator | Tuesday 06 January 2026 00:45:29 +0000 (0:00:01.143) 0:00:02.673 *******
2026-01-06 00:46:53.619862 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:46:53.619870 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:46:53.619877 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:46:53.619883 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:46:53.619890 | orchestrator | ok: [testbed-manager]
2026-01-06 00:46:53.619896 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:46:53.619903 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:46:53.619909 | orchestrator |
2026-01-06 00:46:53.619915 | orchestrator | TASK [osism.services.netdata : Install apt-transport-https package] ************
2026-01-06 00:46:53.619949 | orchestrator | Tuesday 06 January 2026 00:45:30 +0000 (0:00:01.978) 0:00:04.651 *******
2026-01-06 00:46:53.619955 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:46:53.619962 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:46:53.619969 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:46:53.619975 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:46:53.619981 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:46:53.619989 | orchestrator | ok: [testbed-manager]
2026-01-06 00:46:53.620006 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:46:53.620016 | orchestrator |
2026-01-06 00:46:53.620025 | orchestrator | TASK [osism.services.netdata : Add repository gpg key] *************************
2026-01-06 00:46:53.620035 | orchestrator | Tuesday 06 January 2026 00:45:34 +0000 (0:00:03.424) 0:00:08.076 *******
2026-01-06 00:46:53.620043 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:53.620053 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:46:53.620062 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:46:53.620071 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:46:53.620080 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:46:53.620088 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:46:53.620094 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:46:53.620100 | orchestrator |
2026-01-06 00:46:53.620107 | orchestrator | TASK [osism.services.netdata : Add repository] *********************************
2026-01-06 00:46:53.620113 | orchestrator | Tuesday 06 January 2026 00:45:37 +0000 (0:00:02.638) 0:00:10.714 *******
2026-01-06 00:46:53.620119 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:46:53.620125 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:46:53.620131 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:46:53.620137 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:46:53.620144 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:46:53.620149 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:46:53.620155 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:53.620161 | orchestrator |
2026-01-06 00:46:53.620167 | orchestrator | TASK [osism.services.netdata : Install package netdata] ************************
2026-01-06 00:46:53.620173 | orchestrator | Tuesday 06 January 2026 00:45:49 +0000 (0:00:11.978) 0:00:22.693 *******
2026-01-06 00:46:53.620179 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:46:53.620184 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:46:53.620190 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:46:53.620196 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:46:53.620202 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:46:53.620208 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:46:53.620214 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:53.620220 | orchestrator |
2026-01-06 00:46:53.620227 | orchestrator | TASK [osism.services.netdata : Include config tasks] ***************************
2026-01-06 00:46:53.620234 | orchestrator | Tuesday 06 January 2026 00:46:29 +0000 (0:00:40.466) 0:01:03.160 *******
2026-01-06 00:46:53.620242 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/config.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:46:53.620251 | orchestrator |
2026-01-06 00:46:53.620259 | orchestrator | TASK [osism.services.netdata : Copy configuration files] ***********************
2026-01-06 00:46:53.620266 | orchestrator | Tuesday 06 January 2026 00:46:31 +0000 (0:00:01.502) 0:01:04.663 *******
2026-01-06 00:46:53.620273 | orchestrator | changed: [testbed-node-3] => (item=netdata.conf)
2026-01-06 00:46:53.620281 | orchestrator | changed: [testbed-node-0] => (item=netdata.conf)
2026-01-06 00:46:53.620288 | orchestrator | changed: [testbed-node-1] => (item=netdata.conf)
2026-01-06 00:46:53.620296 | orchestrator | changed: [testbed-manager] => (item=netdata.conf)
2026-01-06 00:46:53.620345 | orchestrator | changed: [testbed-node-5] => (item=netdata.conf)
2026-01-06 00:46:53.620353 | orchestrator | changed: [testbed-node-4] => (item=netdata.conf)
2026-01-06 00:46:53.620360 | orchestrator | changed: [testbed-node-2] => (item=netdata.conf)
2026-01-06 00:46:53.620367 | orchestrator | changed: [testbed-node-3] => (item=stream.conf)
2026-01-06 00:46:53.620382 | orchestrator | changed: [testbed-manager] => (item=stream.conf)
2026-01-06 00:46:53.620389 | orchestrator | changed: [testbed-node-1] => (item=stream.conf)
2026-01-06 00:46:53.620395 | orchestrator | changed: [testbed-node-5] => (item=stream.conf)
2026-01-06 00:46:53.620402 | orchestrator | changed: [testbed-node-4] => (item=stream.conf)
2026-01-06 00:46:53.620409 | orchestrator | changed: [testbed-node-2] => (item=stream.conf)
2026-01-06 00:46:53.620416 | orchestrator | changed: [testbed-node-0] => (item=stream.conf)
2026-01-06 00:46:53.620423 | orchestrator |
2026-01-06 00:46:53.620429 | orchestrator | TASK [osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status] ***
2026-01-06 00:46:53.620438 | orchestrator | Tuesday 06 January 2026 00:46:37 +0000 (0:00:06.246) 0:01:10.909 *******
2026-01-06 00:46:53.620445 | orchestrator | ok: [testbed-manager]
2026-01-06 00:46:53.620451 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:46:53.620457 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:46:53.620462 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:46:53.620468 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:46:53.620473 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:46:53.620479 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:46:53.620485 | orchestrator |
2026-01-06 00:46:53.620490 | orchestrator | TASK [osism.services.netdata : Opt out from anonymous statistics] **************
2026-01-06 00:46:53.620496 | orchestrator | Tuesday 06 January 2026 00:46:38 +0000 (0:00:01.500) 0:01:12.409 *******
2026-01-06 00:46:53.620503 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:53.620509 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:46:53.620515 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:46:53.620520 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:46:53.620526 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:46:53.620532 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:46:53.620538 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:46:53.620544 | orchestrator |
2026-01-06 00:46:53.620550 | orchestrator | TASK [osism.services.netdata : Add netdata user to docker group] ***************
2026-01-06 00:46:53.620556 | orchestrator | Tuesday 06 January 2026 00:46:41 +0000 (0:00:02.478) 0:01:14.887 *******
2026-01-06 00:46:53.620562 | orchestrator | ok: [testbed-manager]
2026-01-06 00:46:53.620567 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:46:53.620573 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:46:53.620579 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:46:53.620585 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:46:53.620591 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:46:53.620596 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:46:53.620602 | orchestrator |
2026-01-06 00:46:53.620608 | orchestrator | TASK [osism.services.netdata : Manage service netdata] *************************
2026-01-06 00:46:53.620614 | orchestrator | Tuesday 06 January 2026 00:46:42 +0000 (0:00:01.724) 0:01:16.612 *******
2026-01-06 00:46:53.620621 | orchestrator | ok: [testbed-manager]
2026-01-06 00:46:53.620632 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:46:53.620638 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:46:53.620644 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:46:53.620650 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:46:53.620655 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:46:53.620661 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:46:53.620667 | orchestrator |
2026-01-06 00:46:53.620672 | orchestrator | TASK [osism.services.netdata : Include host type specific tasks] ***************
2026-01-06 00:46:53.620678 | orchestrator | Tuesday 06 January 2026 00:46:44 +0000 (0:00:01.761) 0:01:18.374 *******
2026-01-06 00:46:53.620684 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/server.yml for testbed-manager
2026-01-06 00:46:53.620693 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/client.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:46:53.620700 | orchestrator |
2026-01-06 00:46:53.620705 | orchestrator | TASK [osism.services.netdata : Set sysctl vm.max_map_count parameter] **********
2026-01-06 00:46:53.620718 | orchestrator | Tuesday 06 January 2026 00:46:46 +0000 (0:00:01.314) 0:01:19.688 *******
2026-01-06 00:46:53.620724 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:53.620729 | orchestrator |
2026-01-06 00:46:53.620735 | orchestrator | RUNNING HANDLER [osism.services.netdata : Restart service netdata] *************
2026-01-06 00:46:53.620740 | orchestrator | Tuesday 06 January 2026 00:46:49 +0000 (0:00:03.455) 0:01:23.143 *******
2026-01-06 00:46:53.620746 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:46:53.620752 | orchestrator | changed: [testbed-manager]
2026-01-06 00:46:53.620757 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:46:53.620764 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:46:53.620770 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:46:53.620776 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:46:53.620782 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:46:53.620787 | orchestrator |
2026-01-06 00:46:53.620793 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:46:53.620799 | orchestrator | testbed-manager : ok=16  changed=8  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:46:53.620807 | orchestrator | testbed-node-0 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:46:53.620813 | orchestrator | testbed-node-1 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:46:53.620818 | orchestrator | testbed-node-2 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:46:53.620833 | orchestrator | testbed-node-3 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:46:53.620839 | orchestrator | testbed-node-4 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:46:53.620845 | orchestrator | testbed-node-5 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:46:53.620851 | orchestrator |
2026-01-06 00:46:53.620857 | orchestrator |
2026-01-06 00:46:53.620863 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:46:53.620870 | orchestrator | Tuesday 06 January 2026 00:46:52 +0000 (0:00:03.309) 0:01:26.453 *******
2026-01-06 00:46:53.620875 | orchestrator | ===============================================================================
2026-01-06 00:46:53.620881 | orchestrator | osism.services.netdata : Install package netdata ----------------------- 40.47s
2026-01-06 00:46:53.620887 | orchestrator | osism.services.netdata : Add repository -------------------------------- 11.98s
2026-01-06 00:46:53.620893 | orchestrator | osism.services.netdata : Copy configuration files ----------------------- 6.25s
2026-01-06 00:46:53.620899 | orchestrator | osism.services.netdata : Set sysctl vm.max_map_count parameter ---------- 3.46s
2026-01-06 00:46:53.620905 | orchestrator | osism.services.netdata : Install apt-transport-https package ------------ 3.42s
2026-01-06 00:46:53.620911 | orchestrator | osism.services.netdata : Restart service netdata ------------------------ 3.31s
2026-01-06 00:46:53.620917 | orchestrator | osism.services.netdata : Add repository gpg key ------------------------- 2.64s
2026-01-06 00:46:53.620922 | orchestrator | osism.services.netdata : Opt out from anonymous statistics -------------- 2.48s
2026-01-06 00:46:53.620928 | orchestrator | osism.services.netdata : Remove old architecture-dependent repository --- 1.98s
2026-01-06 00:46:53.620934 | orchestrator | osism.services.netdata : Manage service netdata ------------------------- 1.76s
2026-01-06 00:46:53.620940 | orchestrator | osism.services.netdata : Add netdata user to docker group --------------- 1.72s
2026-01-06 00:46:53.620946 | orchestrator | osism.services.netdata : Include config tasks --------------------------- 1.50s
2026-01-06 00:46:53.620953 | orchestrator | osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status --- 1.50s
2026-01-06 00:46:53.620965 | orchestrator | osism.services.netdata : Include host type specific tasks --------------- 1.31s
2026-01-06 00:46:53.620972 | orchestrator | osism.services.netdata : Include distribution specific install tasks ---- 1.14s
2026-01-06 00:46:53.620978 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.85s
2026-01-06 00:46:53.621057 | orchestrator | 2026-01-06 00:46:53 | INFO  | Task 6cd5d733-068d-4882-a317-4a4287cffce7 is in state SUCCESS
2026-01-06 00:46:53.621066 | orchestrator | 2026-01-06 00:46:53 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:46:56.685256 | orchestrator | 2026-01-06 00:46:56 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:46:56.689721 | orchestrator | 2026-01-06 00:46:56 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:46:56.692282 | orchestrator | 2026-01-06 00:46:56 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:46:56.692401 | orchestrator | 2026-01-06 00:46:56 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:46:59.743979 | orchestrator | 2026-01-06 00:46:59 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:46:59.747296 | orchestrator | 2026-01-06 00:46:59 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:46:59.750603 | orchestrator | 2026-01-06 00:46:59 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:46:59.750675 | orchestrator | 2026-01-06 00:46:59 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:47:02.808766 | orchestrator | 2026-01-06 00:47:02 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:47:02.813491 | orchestrator | 2026-01-06 00:47:02 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:47:02.816767 | orchestrator | 2026-01-06 00:47:02 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:47:02.818088 | orchestrator | 2026-01-06 00:47:02 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:47:05.881499 | orchestrator | 2026-01-06 00:47:05 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:47:05.882483 | orchestrator | 2026-01-06 00:47:05 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:47:05.883737 | orchestrator | 2026-01-06 00:47:05 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:47:05.883794 | orchestrator | 2026-01-06 00:47:05 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:47:08.928126 | orchestrator | 2026-01-06 00:47:08 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:47:08.930436 | orchestrator | 2026-01-06 00:47:08 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:47:08.933112 | orchestrator | 2026-01-06 00:47:08 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:47:08.933140 | orchestrator | 2026-01-06 00:47:08 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:47:11.977006 | orchestrator | 2026-01-06 00:47:11 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:47:11.979378 | orchestrator | 2026-01-06 00:47:11 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:47:11.983030 | orchestrator | 2026-01-06 00:47:11 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:47:11.983071 | orchestrator | 2026-01-06 00:47:11 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:47:15.045516 | orchestrator | 2026-01-06 00:47:15 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:47:15.046819 | orchestrator | 2026-01-06 00:47:15 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:47:15.047610 | orchestrator | 2026-01-06 00:47:15 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:47:15.047634 | orchestrator | 2026-01-06 00:47:15 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:47:18.106832 | orchestrator | 2026-01-06 00:47:18 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:47:18.109724 | orchestrator | 2026-01-06 00:47:18 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:47:18.111972 | orchestrator | 2026-01-06 00:47:18 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:47:18.112035 | orchestrator | 2026-01-06 00:47:18 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:47:21.175485 | orchestrator | 2026-01-06 00:47:21 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED
2026-01-06 00:47:21.178561 | orchestrator | 2026-01-06 00:47:21 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED
2026-01-06 00:47:21.180342 | orchestrator | 2026-01-06 00:47:21 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:47:21.180756 | orchestrator | 2026-01-06 00:47:21 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:47:24.242896 | orchestrator | 2026-01-06 00:47:24 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state
STARTED 2026-01-06 00:47:24.243947 | orchestrator | 2026-01-06 00:47:24 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:47:24.247933 | orchestrator | 2026-01-06 00:47:24 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:47:24.248749 | orchestrator | 2026-01-06 00:47:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:47:27.293907 | orchestrator | 2026-01-06 00:47:27 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:47:27.294696 | orchestrator | 2026-01-06 00:47:27 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:47:27.297319 | orchestrator | 2026-01-06 00:47:27 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:47:27.297378 | orchestrator | 2026-01-06 00:47:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:47:30.346805 | orchestrator | 2026-01-06 00:47:30 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:47:30.349908 | orchestrator | 2026-01-06 00:47:30 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:47:30.352615 | orchestrator | 2026-01-06 00:47:30 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:47:30.352678 | orchestrator | 2026-01-06 00:47:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:47:33.396879 | orchestrator | 2026-01-06 00:47:33 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:47:33.399831 | orchestrator | 2026-01-06 00:47:33 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:47:33.404053 | orchestrator | 2026-01-06 00:47:33 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:47:33.404150 | orchestrator | 2026-01-06 00:47:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:47:36.447328 | orchestrator | 
2026-01-06 00:47:36 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:47:36.448190 | orchestrator | 2026-01-06 00:47:36 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:47:36.450448 | orchestrator | 2026-01-06 00:47:36 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:47:36.450494 | orchestrator | 2026-01-06 00:47:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:47:39.501914 | orchestrator | 2026-01-06 00:47:39 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:47:39.506353 | orchestrator | 2026-01-06 00:47:39 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:47:39.507006 | orchestrator | 2026-01-06 00:47:39 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:47:39.507058 | orchestrator | 2026-01-06 00:47:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:47:42.554865 | orchestrator | 2026-01-06 00:47:42 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:47:42.554951 | orchestrator | 2026-01-06 00:47:42 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:47:42.556183 | orchestrator | 2026-01-06 00:47:42 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:47:42.556262 | orchestrator | 2026-01-06 00:47:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:47:45.602453 | orchestrator | 2026-01-06 00:47:45 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:47:45.604166 | orchestrator | 2026-01-06 00:47:45 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:47:45.606172 | orchestrator | 2026-01-06 00:47:45 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:47:45.606235 | orchestrator | 2026-01-06 00:47:45 | INFO  | 
Wait 1 second(s) until the next check 2026-01-06 00:47:48.648382 | orchestrator | 2026-01-06 00:47:48 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:47:48.651230 | orchestrator | 2026-01-06 00:47:48 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:47:48.652762 | orchestrator | 2026-01-06 00:47:48 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:47:48.652784 | orchestrator | 2026-01-06 00:47:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:47:51.692692 | orchestrator | 2026-01-06 00:47:51 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:47:51.692786 | orchestrator | 2026-01-06 00:47:51 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:47:51.694455 | orchestrator | 2026-01-06 00:47:51 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:47:51.694501 | orchestrator | 2026-01-06 00:47:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:47:54.734555 | orchestrator | 2026-01-06 00:47:54 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:47:54.736962 | orchestrator | 2026-01-06 00:47:54 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:47:54.738722 | orchestrator | 2026-01-06 00:47:54 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:47:54.738813 | orchestrator | 2026-01-06 00:47:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:47:57.783622 | orchestrator | 2026-01-06 00:47:57 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:47:57.783725 | orchestrator | 2026-01-06 00:47:57 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state STARTED 2026-01-06 00:47:57.786702 | orchestrator | 2026-01-06 00:47:57 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state 
STARTED 2026-01-06 00:47:57.786765 | orchestrator | 2026-01-06 00:47:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:00.837528 | orchestrator | 2026-01-06 00:48:00.837641 | orchestrator | 2026-01-06 00:48:00 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:00.837660 | orchestrator | 2026-01-06 00:48:00 | INFO  | Task a0f41d5d-c574-49dc-a7fa-56bb2fa41a42 is in state SUCCESS 2026-01-06 00:48:00.841400 | orchestrator | 2026-01-06 00:48:00.841492 | orchestrator | PLAY [Apply role common] ******************************************************* 2026-01-06 00:48:00.841508 | orchestrator | 2026-01-06 00:48:00.841520 | orchestrator | TASK [common : include_tasks] ************************************************** 2026-01-06 00:48:00.841533 | orchestrator | Tuesday 06 January 2026 00:45:17 +0000 (0:00:00.372) 0:00:00.372 ******* 2026-01-06 00:48:00.841545 | orchestrator | included: /ansible/roles/common/tasks/deploy.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:48:00.841557 | orchestrator | 2026-01-06 00:48:00.841569 | orchestrator | TASK [common : Ensuring config directories exist] ****************************** 2026-01-06 00:48:00.841580 | orchestrator | Tuesday 06 January 2026 00:45:18 +0000 (0:00:01.296) 0:00:01.669 ******* 2026-01-06 00:48:00.841591 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'cron'}, 'cron']) 2026-01-06 00:48:00.841601 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-01-06 00:48:00.841612 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'cron'}, 'cron']) 2026-01-06 00:48:00.841623 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'cron'}, 'cron']) 2026-01-06 00:48:00.841634 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'cron'}, 'cron']) 2026-01-06 00:48:00.841645 | 
orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'cron'}, 'cron']) 2026-01-06 00:48:00.841655 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'cron'}, 'cron']) 2026-01-06 00:48:00.841666 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-01-06 00:48:00.841678 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'cron'}, 'cron']) 2026-01-06 00:48:00.841689 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-01-06 00:48:00.841700 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-01-06 00:48:00.841710 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-01-06 00:48:00.841721 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-01-06 00:48:00.841732 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-01-06 00:48:00.841743 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-01-06 00:48:00.841754 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-01-06 00:48:00.841765 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-01-06 00:48:00.841776 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-01-06 00:48:00.841786 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-01-06 00:48:00.841806 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-01-06 00:48:00.841817 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-01-06 
00:48:00.841828 | orchestrator | 2026-01-06 00:48:00.841860 | orchestrator | TASK [common : include_tasks] ************************************************** 2026-01-06 00:48:00.841871 | orchestrator | Tuesday 06 January 2026 00:45:22 +0000 (0:00:03.819) 0:00:05.489 ******* 2026-01-06 00:48:00.841882 | orchestrator | included: /ansible/roles/common/tasks/copy-certs.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:48:00.841894 | orchestrator | 2026-01-06 00:48:00.841906 | orchestrator | TASK [service-cert-copy : common | Copying over extra CA certificates] ********* 2026-01-06 00:48:00.841917 | orchestrator | Tuesday 06 January 2026 00:45:23 +0000 (0:00:01.367) 0:00:06.857 ******* 2026-01-06 00:48:00.841935 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.841954 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.841992 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': 
{'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.842007 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.842094 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.842108 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.842129 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842150 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.842164 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842194 | orchestrator | changed: 
[testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842208 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842221 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842234 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': 
{'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842259 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842320 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842333 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842345 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842365 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842377 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842388 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842400 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 
'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.842411 | orchestrator | 2026-01-06 00:48:00.842422 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS certificate] *** 2026-01-06 00:48:00.842440 | orchestrator | Tuesday 06 January 2026 00:45:29 +0000 (0:00:05.654) 0:00:12.511 ******* 2026-01-06 00:48:00.842452 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.842469 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.842481 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 
'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842493 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.842522 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842534 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842546 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:48:00.842558 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842576 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842588 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842604 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.842616 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842628 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:48:00.842640 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842651 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:48:00.842663 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:48:00.842681 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.842693 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842705 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.842724 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842736 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 
'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.842748 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:48:00.842759 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842778 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842797 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 
'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842809 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:48:00.842821 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842832 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:48:00.842844 | orchestrator | 2026-01-06 00:48:00.842855 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS key] ****** 2026-01-06 00:48:00.842866 | orchestrator | Tuesday 06 January 2026 00:45:32 +0000 (0:00:03.505) 0:00:16.017 ******* 2026-01-06 00:48:00.842884 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.842896 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': 
['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.842913 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842925 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842936 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:48:00.842948 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 
00:48:00.842959 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842977 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.842988 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:48:00.843000 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.843022 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 
'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.843034 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.843050 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.843063 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.843074 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:48:00.843085 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.843097 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:48:00.843108 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.843126 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.843144 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 
'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.843156 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.843167 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:48:00.843179 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.843190 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:48:00.843206 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-01-06 00:48:00.843218 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:48:00.843229 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:48:00.843241 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:48:00.843252 | orchestrator |
2026-01-06 00:48:00.843284 | orchestrator | TASK [common : Ensure /var/log/journal exists on EL10 systems] *****************
2026-01-06 00:48:00.843296 | orchestrator | Tuesday 06 January 2026 00:45:37 +0000 (0:00:05.035) 0:00:21.052 *******
2026-01-06 00:48:00.843308 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:48:00.843325 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:48:00.843336 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:48:00.843348 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:48:00.843359 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:48:00.843376 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:48:00.843388 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:48:00.843399 | orchestrator |
2026-01-06 00:48:00.843410 | orchestrator | TASK [common : Copying over /run subdirectories conf] **************************
2026-01-06 00:48:00.843421 | orchestrator | Tuesday 06 January 2026 00:45:39 +0000 (0:00:01.131) 0:00:22.184 *******
2026-01-06 00:48:00.843432 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:48:00.843443 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:48:00.843454 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:48:00.843465 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:48:00.843476 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:48:00.843487 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:48:00.843498 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:48:00.843509 | orchestrator |
2026-01-06 00:48:00.843520 | orchestrator | TASK [common : Restart systemd-tmpfiles] ***************************************
2026-01-06 00:48:00.843531 | orchestrator | Tuesday 06 January 2026 00:45:40 +0000 (0:00:01.428) 0:00:23.613 *******
2026-01-06 00:48:00.843543 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:48:00.843554 | orchestrator | skipping: [testbed-manager]
2026-01-06 00:48:00.843565 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:48:00.843576 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:48:00.843587 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:48:00.843598 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:48:00.843609 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:48:00.843620 | orchestrator |
2026-01-06 00:48:00.843631 | orchestrator | TASK [common : Copying over kolla.target] **************************************
2026-01-06 00:48:00.843642 | orchestrator | Tuesday 06 January 2026 00:45:42 +0000 (0:00:01.516) 0:00:25.129 *******
2026-01-06 00:48:00.843653 | orchestrator | changed:
[testbed-manager] 2026-01-06 00:48:00.843664 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:48:00.843675 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:48:00.843686 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:48:00.843697 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:48:00.843709 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:48:00.843719 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:48:00.843730 | orchestrator | 2026-01-06 00:48:00.843742 | orchestrator | TASK [common : Copying over config.json files for services] ******************** 2026-01-06 00:48:00.843753 | orchestrator | Tuesday 06 January 2026 00:45:45 +0000 (0:00:03.266) 0:00:28.395 ******* 2026-01-06 00:48:00.843764 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.843780 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.843793 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': 
True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.843811 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.843822 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.843841 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', 
'/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.843853 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.843864 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.843880 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 
2026-01-06 00:48:00.843893 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.843910 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.843922 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.843940 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 
'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.843952 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.843963 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.843975 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.843987 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 
'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.844009 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.844021 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.844033 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.844049 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 
'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:48:00.844061 | orchestrator |
2026-01-06 00:48:00.844072 | orchestrator | TASK [common : Find custom fluentd input config files] *************************
2026-01-06 00:48:00.844083 | orchestrator | Tuesday 06 January 2026 00:45:51 +0000 (0:00:05.914) 0:00:34.310 *******
2026-01-06 00:48:00.844095 | orchestrator | [WARNING]: Skipped
2026-01-06 00:48:00.844108 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/input' path due
2026-01-06 00:48:00.844119 | orchestrator | to this access issue:
2026-01-06 00:48:00.844130 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/input' is not a
2026-01-06 00:48:00.844141 | orchestrator | directory
2026-01-06 00:48:00.844153 | orchestrator | ok: [testbed-manager -> localhost]
2026-01-06 00:48:00.844164 | orchestrator |
2026-01-06 00:48:00.844175 | orchestrator | TASK [common : Find custom fluentd filter config files] ************************
2026-01-06 00:48:00.844186 | orchestrator | Tuesday 06 January 2026 00:45:52 +0000 (0:00:00.849) 0:00:35.159 *******
2026-01-06 00:48:00.844197 | orchestrator | [WARNING]: Skipped
2026-01-06 00:48:00.844208 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/filter' path due
2026-01-06 00:48:00.844219 | orchestrator | to this access issue:
2026-01-06 00:48:00.844230 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/filter' is not a
2026-01-06 00:48:00.844241 | orchestrator | directory
2026-01-06 00:48:00.844252 | orchestrator | ok: [testbed-manager -> localhost]
2026-01-06 00:48:00.844281 | orchestrator |
2026-01-06 00:48:00.844293 | orchestrator | TASK [common : Find custom fluentd format config files] ************************
2026-01-06 00:48:00.844304 | orchestrator | Tuesday 06 January 2026 00:45:52 +0000 (0:00:00.807) 0:00:35.966 *******
2026-01-06 00:48:00.844315 | orchestrator | [WARNING]: Skipped
2026-01-06 00:48:00.844327 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/format' path due
2026-01-06 00:48:00.844338 | orchestrator | to this access issue:
2026-01-06 00:48:00.844349 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/format' is not a
2026-01-06 00:48:00.844368 | orchestrator | directory
2026-01-06 00:48:00.844380 | orchestrator | ok: [testbed-manager -> localhost]
2026-01-06 00:48:00.844391 | orchestrator |
2026-01-06 00:48:00.844402 | orchestrator | TASK [common : Find custom fluentd output config files] ************************
2026-01-06 00:48:00.844413 | orchestrator | Tuesday 06 January 2026 00:45:53 +0000 (0:00:00.836) 0:00:36.802 *******
2026-01-06 00:48:00.844424 | orchestrator | [WARNING]: Skipped
2026-01-06 00:48:00.844435 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/output' path due
2026-01-06 00:48:00.844446 | orchestrator | to this access issue:
2026-01-06 00:48:00.844457 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/output' is not a
2026-01-06 00:48:00.844469 | orchestrator | directory
2026-01-06 00:48:00.844480 | orchestrator | ok: [testbed-manager -> localhost]
2026-01-06 00:48:00.844491 | orchestrator |
2026-01-06 00:48:00.844502 | orchestrator | TASK [common : Copying over fluentd.conf] **************************************
2026-01-06 00:48:00.844513 | orchestrator | Tuesday 06 January 2026 00:45:55 +0000 (0:00:01.506) 0:00:38.309 *******
2026-01-06 00:48:00.844524 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:48:00.844535 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:48:00.844546 | orchestrator | changed: [testbed-manager]
2026-01-06 00:48:00.844557 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:48:00.844568 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:48:00.844580 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:48:00.844595 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:48:00.844606 | orchestrator |
2026-01-06 00:48:00.844618 | orchestrator | TASK [common : Copying over cron logrotate config file] ************************
2026-01-06 00:48:00.844629 | orchestrator | Tuesday 06 January 2026 00:46:02 +0000 (0:00:06.946) 0:00:45.256 *******
2026-01-06 00:48:00.844640 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2026-01-06 00:48:00.844651 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2026-01-06 00:48:00.844662 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2026-01-06 00:48:00.844674 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2026-01-06 00:48:00.844685 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2026-01-06 00:48:00.844696 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2026-01-06 00:48:00.844707 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2026-01-06 00:48:00.844718 | orchestrator |
2026-01-06 00:48:00.844729 | orchestrator | TASK [common : Ensure RabbitMQ Erlang cookie exists] ***************************
2026-01-06 00:48:00.844740 | orchestrator | Tuesday 06 January 2026 00:46:06 +0000 (0:00:04.267) 0:00:49.524 *******
2026-01-06 00:48:00.844751 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:48:00.844762 | orchestrator | changed: [testbed-node-1]
2026-01-06
00:48:00.844773 | orchestrator | changed: [testbed-manager] 2026-01-06 00:48:00.844785 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:48:00.844796 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:48:00.844807 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:48:00.844818 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:48:00.844829 | orchestrator | 2026-01-06 00:48:00.844840 | orchestrator | TASK [common : Ensuring config directories have correct owner and permission] *** 2026-01-06 00:48:00.844851 | orchestrator | Tuesday 06 January 2026 00:46:09 +0000 (0:00:02.966) 0:00:52.490 ******* 2026-01-06 00:48:00.844873 | orchestrator | ok: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.844893 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.844905 | orchestrator | ok: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 
'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.844917 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.844929 | orchestrator | ok: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.844941 | orchestrator | ok: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.844959 | orchestrator | ok: [testbed-node-1] => (item={'key': 
'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.844971 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.845001 | orchestrator | ok: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.845015 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 
'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.845026 | orchestrator | ok: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845038 | orchestrator | ok: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.845054 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.845066 | orchestrator | 
ok: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845078 | orchestrator | ok: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845089 | orchestrator | ok: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.845134 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.845147 | orchestrator | ok: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.845159 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.845171 | orchestrator | ok: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845187 | orchestrator | ok: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': 
['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845199 | orchestrator | 2026-01-06 00:48:00.845210 | orchestrator | TASK [common : Copy rabbitmq-env.conf to kolla toolbox] ************************ 2026-01-06 00:48:00.845221 | orchestrator | Tuesday 06 January 2026 00:46:12 +0000 (0:00:03.324) 0:00:55.815 ******* 2026-01-06 00:48:00.845233 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-01-06 00:48:00.845244 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-01-06 00:48:00.845255 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-01-06 00:48:00.845283 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-01-06 00:48:00.845294 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-01-06 00:48:00.845306 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-01-06 00:48:00.845316 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-01-06 00:48:00.845327 | orchestrator | 2026-01-06 00:48:00.845339 | orchestrator | TASK [common : Copy rabbitmq erl_inetrc to kolla toolbox] ********************** 2026-01-06 00:48:00.845356 | orchestrator | Tuesday 06 January 2026 00:46:15 +0000 (0:00:02.610) 0:00:58.425 ******* 2026-01-06 00:48:00.845368 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-01-06 00:48:00.845379 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-01-06 00:48:00.845390 | orchestrator | changed: 
[testbed-node-2] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-01-06 00:48:00.845401 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-01-06 00:48:00.845412 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-01-06 00:48:00.845423 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-01-06 00:48:00.845435 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-01-06 00:48:00.845446 | orchestrator | 2026-01-06 00:48:00.845463 | orchestrator | TASK [service-check-containers : common | Check containers] ******************** 2026-01-06 00:48:00.845474 | orchestrator | Tuesday 06 January 2026 00:46:18 +0000 (0:00:03.196) 0:01:01.621 ******* 2026-01-06 00:48:00.845485 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.845497 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.845509 | 
orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.845521 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.845537 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.845549 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-01-06 00:48:00.845567 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845594 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845606 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845617 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845629 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845645 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 
2026-01-06 00:48:00.845663 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845675 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845692 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845704 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 
'dimensions': {}}}) 2026-01-06 00:48:00.845716 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845727 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845739 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845755 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845773 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:48:00.845784 | orchestrator | 2026-01-06 00:48:00.845796 | orchestrator | TASK [service-check-containers : common | Notify handlers to restart containers] *** 2026-01-06 00:48:00.845807 | orchestrator | Tuesday 06 January 2026 00:46:21 +0000 (0:00:03.382) 0:01:05.003 ******* 2026-01-06 00:48:00.845818 | orchestrator | changed: [testbed-manager] => { 2026-01-06 00:48:00.845829 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:00.845841 | orchestrator | } 2026-01-06 00:48:00.845852 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 00:48:00.845863 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:00.845874 | orchestrator | } 2026-01-06 00:48:00.845885 | orchestrator | changed: [testbed-node-1] => { 2026-01-06 00:48:00.845896 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:00.845907 | orchestrator | } 2026-01-06 00:48:00.845918 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 00:48:00.845928 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:00.845940 | orchestrator | } 2026-01-06 00:48:00.845951 | orchestrator | changed: [testbed-node-3] => { 2026-01-06 00:48:00.845962 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:00.845973 | orchestrator | } 2026-01-06 00:48:00.845983 | orchestrator | changed: [testbed-node-4] => { 2026-01-06 00:48:00.845994 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:00.846006 | orchestrator | } 2026-01-06 00:48:00.846061 | 
orchestrator | changed: [testbed-node-5] => { 2026-01-06 00:48:00.846077 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:00.846089 | orchestrator | } 2026-01-06 00:48:00.846100 | orchestrator | 2026-01-06 00:48:00.846112 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-01-06 00:48:00.846123 | orchestrator | Tuesday 06 January 2026 00:46:22 +0000 (0:00:01.061) 0:01:06.065 ******* 2026-01-06 00:48:00.846142 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:00.846484 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846515 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron',
'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846567 | orchestrator | skipping: [testbed-manager] 2026-01-06 00:48:00.846583 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.846610 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846623 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846634 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:48:00.846646 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.846658 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846689 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846701 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:48:00.846713 | orchestrator | skipping: [testbed-node-2] => 
(item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.846732 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846744 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846756 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:48:00.846772 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': 
['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.846785 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846797 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846809 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:48:00.846829 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 
00:48:00.846841 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846860 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846872 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:48:00.846884 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2025.1', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-01-06 00:48:00.846902 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2025.1', 'environment': 
{'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846915 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2025.1', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:48:00.846927 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:48:00.846939 | orchestrator | 2026-01-06 00:48:00.846951 | orchestrator | TASK [common : Creating log volume] ******************************************** 2026-01-06 00:48:00.846963 | orchestrator | Tuesday 06 January 2026 00:46:24 +0000 (0:00:01.674) 0:01:07.739 ******* 2026-01-06 00:48:00.846975 | orchestrator | changed: [testbed-manager] 2026-01-06 00:48:00.846986 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:48:00.846997 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:48:00.847008 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:48:00.847020 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:48:00.847031 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:48:00.847042 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:48:00.847052 | orchestrator | 2026-01-06 00:48:00.847064 | orchestrator | TASK [common : Link kolla_logs volume to /var/log/kolla] *********************** 2026-01-06 00:48:00.847075 | orchestrator | Tuesday 06 January 2026 00:46:26 +0000 (0:00:01.719) 0:01:09.459 ******* 2026-01-06 00:48:00.847087 | 
orchestrator | changed: [testbed-manager] 2026-01-06 00:48:00.847098 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:48:00.847109 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:48:00.847120 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:48:00.847131 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:48:00.847142 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:48:00.847153 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:48:00.847164 | orchestrator | 2026-01-06 00:48:00.847175 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-01-06 00:48:00.847186 | orchestrator | Tuesday 06 January 2026 00:46:27 +0000 (0:00:01.520) 0:01:10.980 ******* 2026-01-06 00:48:00.847197 | orchestrator | 2026-01-06 00:48:00.847208 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-01-06 00:48:00.847220 | orchestrator | Tuesday 06 January 2026 00:46:27 +0000 (0:00:00.080) 0:01:11.060 ******* 2026-01-06 00:48:00.847238 | orchestrator | 2026-01-06 00:48:00.847249 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-01-06 00:48:00.847260 | orchestrator | Tuesday 06 January 2026 00:46:28 +0000 (0:00:00.071) 0:01:11.132 ******* 2026-01-06 00:48:00.847302 | orchestrator | 2026-01-06 00:48:00.847321 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-01-06 00:48:00.847333 | orchestrator | Tuesday 06 January 2026 00:46:28 +0000 (0:00:00.067) 0:01:11.200 ******* 2026-01-06 00:48:00.847345 | orchestrator | 2026-01-06 00:48:00.847356 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-01-06 00:48:00.847367 | orchestrator | Tuesday 06 January 2026 00:46:28 +0000 (0:00:00.293) 0:01:11.493 ******* 2026-01-06 00:48:00.847379 | orchestrator | 2026-01-06 00:48:00.847391 | orchestrator | TASK [common : 
Flush handlers] ************************************************* 2026-01-06 00:48:00.847401 | orchestrator | Tuesday 06 January 2026 00:46:28 +0000 (0:00:00.077) 0:01:11.570 ******* 2026-01-06 00:48:00.847413 | orchestrator | 2026-01-06 00:48:00.847425 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-01-06 00:48:00.847437 | orchestrator | Tuesday 06 January 2026 00:46:28 +0000 (0:00:00.064) 0:01:11.635 ******* 2026-01-06 00:48:00.847447 | orchestrator | 2026-01-06 00:48:00.847458 | orchestrator | RUNNING HANDLER [common : Restart fluentd container] *************************** 2026-01-06 00:48:00.847470 | orchestrator | Tuesday 06 January 2026 00:46:28 +0000 (0:00:00.085) 0:01:11.721 ******* 2026-01-06 00:48:00.847481 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:48:00.847492 | orchestrator | changed: [testbed-manager] 2026-01-06 00:48:00.847503 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:48:00.847515 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:48:00.847527 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:48:00.847538 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:48:00.847549 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:48:00.847560 | orchestrator | 2026-01-06 00:48:00.847571 | orchestrator | RUNNING HANDLER [common : Restart kolla-toolbox container] ********************* 2026-01-06 00:48:00.847582 | orchestrator | Tuesday 06 January 2026 00:47:07 +0000 (0:00:39.237) 0:01:50.958 ******* 2026-01-06 00:48:00.847593 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:48:00.847605 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:48:00.847616 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:48:00.847627 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:48:00.847641 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:48:00.847660 | orchestrator | changed: [testbed-manager] 2026-01-06 00:48:00.847679 | orchestrator | 
changed: [testbed-node-4] 2026-01-06 00:48:00.847696 | orchestrator | 2026-01-06 00:48:00.847708 | orchestrator | RUNNING HANDLER [common : Initializing toolbox container using normal user] **** 2026-01-06 00:48:00.847719 | orchestrator | Tuesday 06 January 2026 00:47:48 +0000 (0:00:40.222) 0:02:31.181 ******* 2026-01-06 00:48:00.847730 | orchestrator | ok: [testbed-manager] 2026-01-06 00:48:00.847742 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:48:00.847754 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:48:00.847764 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:48:00.847775 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:48:00.847786 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:48:00.847797 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:48:00.847808 | orchestrator | 2026-01-06 00:48:00.847819 | orchestrator | RUNNING HANDLER [common : Restart cron container] ****************************** 2026-01-06 00:48:00.847830 | orchestrator | Tuesday 06 January 2026 00:47:50 +0000 (0:00:02.047) 0:02:33.228 ******* 2026-01-06 00:48:00.847842 | orchestrator | changed: [testbed-manager] 2026-01-06 00:48:00.847853 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:48:00.847864 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:48:00.847876 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:48:00.847887 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:48:00.847899 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:48:00.847910 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:48:00.847933 | orchestrator | 2026-01-06 00:48:00.847944 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:48:00.847956 | orchestrator | testbed-manager : ok=24  changed=16  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2026-01-06 00:48:00.847980 | orchestrator | testbed-node-0 : ok=20  changed=16  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2026-01-06 
00:48:00.847993 | orchestrator | testbed-node-1 : ok=20  changed=16  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2026-01-06 00:48:00.848011 | orchestrator | testbed-node-2 : ok=20  changed=16  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2026-01-06 00:48:00.848032 | orchestrator | testbed-node-3 : ok=20  changed=16  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2026-01-06 00:48:00.848054 | orchestrator | testbed-node-4 : ok=20  changed=16  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2026-01-06 00:48:00.848071 | orchestrator | testbed-node-5 : ok=20  changed=16  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2026-01-06 00:48:00.848090 | orchestrator | 2026-01-06 00:48:00.848107 | orchestrator | 2026-01-06 00:48:00.848125 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:48:00.848145 | orchestrator | Tuesday 06 January 2026 00:47:59 +0000 (0:00:09.707) 0:02:42.936 ******* 2026-01-06 00:48:00.848164 | orchestrator | =============================================================================== 2026-01-06 00:48:00.848184 | orchestrator | common : Restart kolla-toolbox container ------------------------------- 40.22s 2026-01-06 00:48:00.848203 | orchestrator | common : Restart fluentd container ------------------------------------- 39.24s 2026-01-06 00:48:00.848222 | orchestrator | common : Restart cron container ----------------------------------------- 9.71s 2026-01-06 00:48:00.848241 | orchestrator | common : Copying over fluentd.conf -------------------------------------- 6.95s 2026-01-06 00:48:00.848290 | orchestrator | common : Copying over config.json files for services -------------------- 5.91s 2026-01-06 00:48:00.848309 | orchestrator | service-cert-copy : common | Copying over extra CA certificates --------- 5.65s 2026-01-06 00:48:00.848320 | orchestrator | service-cert-copy : common | Copying over backend internal TLS key ------ 5.04s 2026-01-06 
00:48:00.848332 | orchestrator | common : Copying over cron logrotate config file ------------------------ 4.27s 2026-01-06 00:48:00.848343 | orchestrator | common : Ensuring config directories exist ------------------------------ 3.82s 2026-01-06 00:48:00.848355 | orchestrator | service-cert-copy : common | Copying over backend internal TLS certificate --- 3.51s 2026-01-06 00:48:00.848366 | orchestrator | service-check-containers : common | Check containers -------------------- 3.38s 2026-01-06 00:48:00.848378 | orchestrator | common : Ensuring config directories have correct owner and permission --- 3.32s 2026-01-06 00:48:00.848389 | orchestrator | common : Copying over kolla.target -------------------------------------- 3.27s 2026-01-06 00:48:00.848400 | orchestrator | common : Copy rabbitmq erl_inetrc to kolla toolbox ---------------------- 3.20s 2026-01-06 00:48:00.848412 | orchestrator | common : Ensure RabbitMQ Erlang cookie exists --------------------------- 2.97s 2026-01-06 00:48:00.848424 | orchestrator | common : Copy rabbitmq-env.conf to kolla toolbox ------------------------ 2.61s 2026-01-06 00:48:00.848435 | orchestrator | common : Initializing toolbox container using normal user --------------- 2.05s 2026-01-06 00:48:00.848446 | orchestrator | common : Creating log volume -------------------------------------------- 1.72s 2026-01-06 00:48:00.848458 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.67s 2026-01-06 00:48:00.848469 | orchestrator | common : Link kolla_logs volume to /var/log/kolla ----------------------- 1.52s 2026-01-06 00:48:03.886371 | orchestrator | 2026-01-06 00:48:03 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:03.886572 | orchestrator | 2026-01-06 00:48:03 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:03.887407 | orchestrator | 2026-01-06 00:48:03 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is 
in state STARTED 2026-01-06 00:48:03.888009 | orchestrator | 2026-01-06 00:48:03 | INFO  | Task 70cd25c7-522b-4f22-b496-052c0191ca57 is in state STARTED 2026-01-06 00:48:03.888766 | orchestrator | 2026-01-06 00:48:03 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:03.889703 | orchestrator | 2026-01-06 00:48:03 | INFO  | Task 1a089f43-b2c5-4016-967c-1266729ea127 is in state STARTED 2026-01-06 00:48:03.889898 | orchestrator | 2026-01-06 00:48:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:22.240866 | orchestrator | 2026-01-06 00:48:22 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:22.240947 | orchestrator | 2026-01-06 00:48:22 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:22.241102 | orchestrator | 2026-01-06 00:48:22 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:22.241705 | orchestrator | 2026-01-06 00:48:22 | INFO  | Task 70cd25c7-522b-4f22-b496-052c0191ca57 is in state STARTED 2026-01-06 00:48:22.243386 | orchestrator | 2026-01-06 00:48:22 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:22.245574 | orchestrator | 2026-01-06 00:48:22 | INFO  | Task 1a089f43-b2c5-4016-967c-1266729ea127 is in state STARTED 
2026-01-06 00:48:22.245600 | orchestrator | 2026-01-06 00:48:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:25.293468 | orchestrator | 2026-01-06 00:48:25 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:25.293932 | orchestrator | 2026-01-06 00:48:25 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:25.297011 | orchestrator | 2026-01-06 00:48:25 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:25.298688 | orchestrator | 2026-01-06 00:48:25 | INFO  | Task 70cd25c7-522b-4f22-b496-052c0191ca57 is in state STARTED 2026-01-06 00:48:25.301583 | orchestrator | 2026-01-06 00:48:25 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:25.302195 | orchestrator | 2026-01-06 00:48:25 | INFO  | Task 1a089f43-b2c5-4016-967c-1266729ea127 is in state STARTED 2026-01-06 00:48:25.302331 | orchestrator | 2026-01-06 00:48:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:28.346823 | orchestrator | 2026-01-06 00:48:28 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:28.349548 | orchestrator | 2026-01-06 00:48:28 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:28.353563 | orchestrator | 2026-01-06 00:48:28 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:28.354235 | orchestrator | 2026-01-06 00:48:28 | INFO  | Task 70cd25c7-522b-4f22-b496-052c0191ca57 is in state SUCCESS 2026-01-06 00:48:28.354622 | orchestrator | 2026-01-06 00:48:28.354653 | orchestrator | 2026-01-06 00:48:28.354663 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-01-06 00:48:28.354672 | orchestrator | 2026-01-06 00:48:28.354681 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-01-06 00:48:28.354690 | 
orchestrator | Tuesday 06 January 2026 00:48:07 +0000 (0:00:00.256) 0:00:00.256 ******* 2026-01-06 00:48:28.354699 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:48:28.354709 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:48:28.354718 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:48:28.354727 | orchestrator | 2026-01-06 00:48:28.354736 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-01-06 00:48:28.354745 | orchestrator | Tuesday 06 January 2026 00:48:08 +0000 (0:00:00.603) 0:00:00.860 ******* 2026-01-06 00:48:28.354754 | orchestrator | ok: [testbed-node-0] => (item=enable_memcached_True) 2026-01-06 00:48:28.354762 | orchestrator | ok: [testbed-node-1] => (item=enable_memcached_True) 2026-01-06 00:48:28.354771 | orchestrator | ok: [testbed-node-2] => (item=enable_memcached_True) 2026-01-06 00:48:28.354779 | orchestrator | 2026-01-06 00:48:28.354788 | orchestrator | PLAY [Apply role memcached] **************************************************** 2026-01-06 00:48:28.354796 | orchestrator | 2026-01-06 00:48:28.354805 | orchestrator | TASK [memcached : include_tasks] *********************************************** 2026-01-06 00:48:28.354814 | orchestrator | Tuesday 06 January 2026 00:48:09 +0000 (0:00:01.031) 0:00:01.891 ******* 2026-01-06 00:48:28.354822 | orchestrator | included: /ansible/roles/memcached/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:48:28.354832 | orchestrator | 2026-01-06 00:48:28.354840 | orchestrator | TASK [memcached : Ensuring config directories exist] *************************** 2026-01-06 00:48:28.354849 | orchestrator | Tuesday 06 January 2026 00:48:10 +0000 (0:00:00.996) 0:00:02.888 ******* 2026-01-06 00:48:28.354858 | orchestrator | changed: [testbed-node-2] => (item=memcached) 2026-01-06 00:48:28.354866 | orchestrator | changed: [testbed-node-0] => (item=memcached) 2026-01-06 00:48:28.354875 | orchestrator | changed: [testbed-node-1] 
=> (item=memcached) 2026-01-06 00:48:28.354883 | orchestrator | 2026-01-06 00:48:28.354892 | orchestrator | TASK [memcached : Copying over config.json files for services] ***************** 2026-01-06 00:48:28.354900 | orchestrator | Tuesday 06 January 2026 00:48:11 +0000 (0:00:01.255) 0:00:04.143 ******* 2026-01-06 00:48:28.354909 | orchestrator | changed: [testbed-node-1] => (item=memcached) 2026-01-06 00:48:28.354918 | orchestrator | changed: [testbed-node-2] => (item=memcached) 2026-01-06 00:48:28.354926 | orchestrator | changed: [testbed-node-0] => (item=memcached) 2026-01-06 00:48:28.354935 | orchestrator | 2026-01-06 00:48:28.354943 | orchestrator | TASK [service-check-containers : memcached | Check containers] ***************** 2026-01-06 00:48:28.354952 | orchestrator | Tuesday 06 January 2026 00:48:14 +0000 (0:00:02.445) 0:00:06.589 ******* 2026-01-06 00:48:28.354985 | orchestrator | changed: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2025.1', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2026-01-06 00:48:28.355016 | orchestrator | changed: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2025.1', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2026-01-06 00:48:28.355038 | orchestrator | changed: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2025.1', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2026-01-06 00:48:28.355048 | orchestrator | 2026-01-06 00:48:28.355057 | orchestrator | TASK [service-check-containers : memcached | Notify handlers to restart containers] *** 2026-01-06 00:48:28.355066 | orchestrator | Tuesday 06 January 2026 00:48:15 +0000 (0:00:01.576) 0:00:08.165 ******* 2026-01-06 00:48:28.355075 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 00:48:28.355083 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:28.355092 | orchestrator | } 2026-01-06 00:48:28.355101 | orchestrator | changed: [testbed-node-1] => { 2026-01-06 00:48:28.355110 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:28.355118 | orchestrator | } 2026-01-06 00:48:28.355127 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 
00:48:28.355136 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:28.355144 | orchestrator | } 2026-01-06 00:48:28.355153 | orchestrator | 2026-01-06 00:48:28.355161 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-01-06 00:48:28.355170 | orchestrator | Tuesday 06 January 2026 00:48:16 +0000 (0:00:00.744) 0:00:08.910 ******* 2026-01-06 00:48:28.355179 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2025.1', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2026-01-06 00:48:28.355188 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:48:28.355203 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2025.1', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 
'active_passive': True}}}})  2026-01-06 00:48:28.355218 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:48:28.355243 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2025.1', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2026-01-06 00:48:28.355289 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:48:28.355303 | orchestrator | 2026-01-06 00:48:28.355313 | orchestrator | RUNNING HANDLER [memcached : Restart memcached container] ********************** 2026-01-06 00:48:28.355323 | orchestrator | Tuesday 06 January 2026 00:48:18 +0000 (0:00:01.714) 0:00:10.625 ******* 2026-01-06 00:48:28.355333 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:48:28.355343 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:48:28.355353 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:48:28.355363 | orchestrator | 2026-01-06 00:48:28.355374 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:48:28.355385 | orchestrator | testbed-node-0 : ok=8  changed=5  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:48:28.355397 | orchestrator | testbed-node-1 : ok=8  changed=5  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:48:28.355407 | orchestrator | testbed-node-2 : ok=8  changed=5  unreachable=0 failed=0 skipped=1  
rescued=0 ignored=0 2026-01-06 00:48:28.355417 | orchestrator | 2026-01-06 00:48:28.355427 | orchestrator | 2026-01-06 00:48:28.355437 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:48:28.355447 | orchestrator | Tuesday 06 January 2026 00:48:26 +0000 (0:00:08.076) 0:00:18.702 ******* 2026-01-06 00:48:28.355464 | orchestrator | =============================================================================== 2026-01-06 00:48:28.355474 | orchestrator | memcached : Restart memcached container --------------------------------- 8.08s 2026-01-06 00:48:28.355485 | orchestrator | memcached : Copying over config.json files for services ----------------- 2.45s 2026-01-06 00:48:28.355495 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.71s 2026-01-06 00:48:28.355505 | orchestrator | service-check-containers : memcached | Check containers ----------------- 1.58s 2026-01-06 00:48:28.355514 | orchestrator | memcached : Ensuring config directories exist --------------------------- 1.26s 2026-01-06 00:48:28.355524 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.03s 2026-01-06 00:48:28.355534 | orchestrator | memcached : include_tasks ----------------------------------------------- 1.00s 2026-01-06 00:48:28.355545 | orchestrator | service-check-containers : memcached | Notify handlers to restart containers --- 0.74s 2026-01-06 00:48:28.355555 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.60s 2026-01-06 00:48:28.358238 | orchestrator | 2026-01-06 00:48:28 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:28.359002 | orchestrator | 2026-01-06 00:48:28 | INFO  | Task 1a089f43-b2c5-4016-967c-1266729ea127 is in state STARTED 2026-01-06 00:48:28.359049 | orchestrator | 2026-01-06 00:48:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 
00:48:31.388186 | orchestrator | 2026-01-06 00:48:31 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:31.388463 | orchestrator | 2026-01-06 00:48:31 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:31.389485 | orchestrator | 2026-01-06 00:48:31 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:31.390400 | orchestrator | 2026-01-06 00:48:31 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:31.390985 | orchestrator | 2026-01-06 00:48:31 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:48:31.392016 | orchestrator | 2026-01-06 00:48:31 | INFO  | Task 1a089f43-b2c5-4016-967c-1266729ea127 is in state STARTED 2026-01-06 00:48:31.392057 | orchestrator | 2026-01-06 00:48:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:34.460585 | orchestrator | 2026-01-06 00:48:34 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:34.460725 | orchestrator | 2026-01-06 00:48:34 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:34.460742 | orchestrator | 2026-01-06 00:48:34 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:34.460754 | orchestrator | 2026-01-06 00:48:34 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:34.460765 | orchestrator | 2026-01-06 00:48:34 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:48:34.460777 | orchestrator | 2026-01-06 00:48:34 | INFO  | Task 1a089f43-b2c5-4016-967c-1266729ea127 is in state STARTED 2026-01-06 00:48:34.460789 | orchestrator | 2026-01-06 00:48:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:37.491708 | orchestrator | 2026-01-06 00:48:37.491808 | orchestrator | 2026-01-06 00:48:37.491821 | orchestrator | PLAY [Group hosts 
based on configuration] ************************************** 2026-01-06 00:48:37.491829 | orchestrator | 2026-01-06 00:48:37.491837 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-01-06 00:48:37.491845 | orchestrator | Tuesday 06 January 2026 00:48:06 +0000 (0:00:00.266) 0:00:00.266 ******* 2026-01-06 00:48:37.491852 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:48:37.491860 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:48:37.491866 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:48:37.491871 | orchestrator | 2026-01-06 00:48:37.491875 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-01-06 00:48:37.491879 | orchestrator | Tuesday 06 January 2026 00:48:07 +0000 (0:00:00.333) 0:00:00.599 ******* 2026-01-06 00:48:37.491884 | orchestrator | ok: [testbed-node-0] => (item=enable_redis_True) 2026-01-06 00:48:37.491888 | orchestrator | ok: [testbed-node-1] => (item=enable_redis_True) 2026-01-06 00:48:37.491892 | orchestrator | ok: [testbed-node-2] => (item=enable_redis_True) 2026-01-06 00:48:37.491896 | orchestrator | 2026-01-06 00:48:37.491900 | orchestrator | PLAY [Apply role redis] ******************************************************** 2026-01-06 00:48:37.491904 | orchestrator | 2026-01-06 00:48:37.491908 | orchestrator | TASK [redis : include_tasks] *************************************************** 2026-01-06 00:48:37.491912 | orchestrator | Tuesday 06 January 2026 00:48:07 +0000 (0:00:00.445) 0:00:01.045 ******* 2026-01-06 00:48:37.491916 | orchestrator | included: /ansible/roles/redis/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:48:37.491921 | orchestrator | 2026-01-06 00:48:37.491929 | orchestrator | TASK [redis : Ensuring config directories exist] ******************************* 2026-01-06 00:48:37.491933 | orchestrator | Tuesday 06 January 2026 00:48:08 +0000 (0:00:00.944) 0:00:01.989 ******* 
2026-01-06 00:48:37.491939 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.491969 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.491974 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.491993 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': 
'/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492012 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492016 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492020 | orchestrator | 2026-01-06 
00:48:37.492024 | orchestrator | TASK [redis : Copying over default config.json files] ************************** 2026-01-06 00:48:37.492028 | orchestrator | Tuesday 06 January 2026 00:48:10 +0000 (0:00:01.686) 0:00:03.676 ******* 2026-01-06 00:48:37.492032 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492043 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492049 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-01-06 
00:48:37.492055 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492065 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492076 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492083 | orchestrator | 2026-01-06 00:48:37.492089 | orchestrator | TASK [redis : Copying over redis config files] ********************************* 2026-01-06 00:48:37.492095 | orchestrator | Tuesday 06 January 2026 00:48:14 +0000 (0:00:03.847) 0:00:07.523 ******* 2026-01-06 00:48:37.492108 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492113 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492117 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492121 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492128 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492138 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 
'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492142 | orchestrator | 2026-01-06 00:48:37.492146 | orchestrator | TASK [service-check-containers : redis | Check containers] ********************* 2026-01-06 00:48:37.492149 | orchestrator | Tuesday 06 January 2026 00:48:17 +0000 (0:00:03.386) 0:00:10.910 ******* 2026-01-06 00:48:37.492158 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492162 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492166 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 
'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492170 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492177 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492185 | orchestrator | changed: [testbed-node-1] => (item={'key': 
'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-01-06 00:48:37.492189 | orchestrator | 2026-01-06 00:48:37.492193 | orchestrator | TASK [service-check-containers : redis | Notify handlers to restart containers] *** 2026-01-06 00:48:37.492202 | orchestrator | Tuesday 06 January 2026 00:48:19 +0000 (0:00:02.035) 0:00:12.946 ******* 2026-01-06 00:48:37.492206 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 00:48:37.492211 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:37.492216 | orchestrator | } 2026-01-06 00:48:37.492220 | orchestrator | changed: [testbed-node-1] => { 2026-01-06 00:48:37.492225 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:37.492229 | orchestrator | } 2026-01-06 00:48:37.492234 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 00:48:37.492239 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:48:37.492296 | orchestrator | } 2026-01-06 00:48:37.492301 | orchestrator | 2026-01-06 00:48:37.492305 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-01-06 00:48:37.492310 | orchestrator | Tuesday 06 January 2026 00:48:19 +0000 (0:00:00.414) 0:00:13.360 ******* 2026-01-06 00:48:37.492315 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 
'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})  2026-01-06 00:48:37.492319 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})  2026-01-06 00:48:37.492324 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:48:37.492329 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})  2026-01-06 00:48:37.492334 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': 
'/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})  2026-01-06 00:48:37.492339 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:48:37.492349 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2025.1', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})  2026-01-06 00:48:37.492363 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2025.1', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})  2026-01-06 00:48:37.492368 | orchestrator | skipping: [testbed-node-2] 2026-01-06 
00:48:37.492372 | orchestrator | 2026-01-06 00:48:37.492377 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2026-01-06 00:48:37.492381 | orchestrator | Tuesday 06 January 2026 00:48:21 +0000 (0:00:01.436) 0:00:14.797 ******* 2026-01-06 00:48:37.492385 | orchestrator | 2026-01-06 00:48:37.492390 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2026-01-06 00:48:37.492395 | orchestrator | Tuesday 06 January 2026 00:48:21 +0000 (0:00:00.057) 0:00:14.854 ******* 2026-01-06 00:48:37.492399 | orchestrator | 2026-01-06 00:48:37.492404 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2026-01-06 00:48:37.492408 | orchestrator | Tuesday 06 January 2026 00:48:21 +0000 (0:00:00.053) 0:00:14.908 ******* 2026-01-06 00:48:37.492413 | orchestrator | 2026-01-06 00:48:37.492417 | orchestrator | RUNNING HANDLER [redis : Restart redis container] ****************************** 2026-01-06 00:48:37.492422 | orchestrator | Tuesday 06 January 2026 00:48:21 +0000 (0:00:00.057) 0:00:14.966 ******* 2026-01-06 00:48:37.492427 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:48:37.492431 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:48:37.492436 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:48:37.492440 | orchestrator | 2026-01-06 00:48:37.492445 | orchestrator | RUNNING HANDLER [redis : Restart redis-sentinel container] ********************* 2026-01-06 00:48:37.492449 | orchestrator | Tuesday 06 January 2026 00:48:25 +0000 (0:00:04.093) 0:00:19.059 ******* 2026-01-06 00:48:37.492454 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:48:37.492458 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:48:37.492463 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:48:37.492468 | orchestrator | 2026-01-06 00:48:37.492472 | orchestrator | PLAY RECAP 
********************************************************************* 2026-01-06 00:48:37.492478 | orchestrator | testbed-node-0 : ok=10  changed=7  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:48:37.492484 | orchestrator | testbed-node-1 : ok=10  changed=7  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:48:37.492488 | orchestrator | testbed-node-2 : ok=10  changed=7  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 00:48:37.492493 | orchestrator | 2026-01-06 00:48:37.492498 | orchestrator | 2026-01-06 00:48:37.492502 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:48:37.492507 | orchestrator | Tuesday 06 January 2026 00:48:36 +0000 (0:00:10.642) 0:00:29.702 ******* 2026-01-06 00:48:37.492511 | orchestrator | =============================================================================== 2026-01-06 00:48:37.492516 | orchestrator | redis : Restart redis-sentinel container ------------------------------- 10.64s 2026-01-06 00:48:37.492521 | orchestrator | redis : Restart redis container ----------------------------------------- 4.09s 2026-01-06 00:48:37.492525 | orchestrator | redis : Copying over default config.json files -------------------------- 3.85s 2026-01-06 00:48:37.492530 | orchestrator | redis : Copying over redis config files --------------------------------- 3.39s 2026-01-06 00:48:37.492538 | orchestrator | service-check-containers : redis | Check containers --------------------- 2.04s 2026-01-06 00:48:37.492543 | orchestrator | redis : Ensuring config directories exist ------------------------------- 1.69s 2026-01-06 00:48:37.492548 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.44s 2026-01-06 00:48:37.492552 | orchestrator | redis : include_tasks --------------------------------------------------- 0.94s 2026-01-06 00:48:37.492556 | orchestrator | Group hosts based on enabled services 
----------------------------------- 0.45s 2026-01-06 00:48:37.492561 | orchestrator | service-check-containers : redis | Notify handlers to restart containers --- 0.41s 2026-01-06 00:48:37.492566 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.33s 2026-01-06 00:48:37.492570 | orchestrator | redis : Flush handlers -------------------------------------------------- 0.17s 2026-01-06 00:48:37.492578 | orchestrator | 2026-01-06 00:48:37 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:37.492584 | orchestrator | 2026-01-06 00:48:37 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:37.492589 | orchestrator | 2026-01-06 00:48:37 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:37.492593 | orchestrator | 2026-01-06 00:48:37 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:37.492597 | orchestrator | 2026-01-06 00:48:37 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:48:37.492603 | orchestrator | 2026-01-06 00:48:37 | INFO  | Task 1a089f43-b2c5-4016-967c-1266729ea127 is in state SUCCESS 2026-01-06 00:48:37.492607 | orchestrator | 2026-01-06 00:48:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:40.501415 | orchestrator | 2026-01-06 00:48:40 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:40.501539 | orchestrator | 2026-01-06 00:48:40 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:40.502089 | orchestrator | 2026-01-06 00:48:40 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:40.504533 | orchestrator | 2026-01-06 00:48:40 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:40.504929 | orchestrator | 2026-01-06 00:48:40 | INFO  | Task 
23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:48:40.504955 | orchestrator | 2026-01-06 00:48:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:43.536854 | orchestrator | 2026-01-06 00:48:43 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:43.537771 | orchestrator | 2026-01-06 00:48:43 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:43.539555 | orchestrator | 2026-01-06 00:48:43 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:43.540785 | orchestrator | 2026-01-06 00:48:43 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:43.541957 | orchestrator | 2026-01-06 00:48:43 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:48:43.542060 | orchestrator | 2026-01-06 00:48:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:46.702674 | orchestrator | 2026-01-06 00:48:46 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:46.702780 | orchestrator | 2026-01-06 00:48:46 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:46.702794 | orchestrator | 2026-01-06 00:48:46 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:46.703657 | orchestrator | 2026-01-06 00:48:46 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:46.705065 | orchestrator | 2026-01-06 00:48:46 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:48:46.705127 | orchestrator | 2026-01-06 00:48:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:49.780000 | orchestrator | 2026-01-06 00:48:49 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:49.780096 | orchestrator | 2026-01-06 00:48:49 | INFO  | Task 
c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:49.780373 | orchestrator | 2026-01-06 00:48:49 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:49.781366 | orchestrator | 2026-01-06 00:48:49 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:49.781921 | orchestrator | 2026-01-06 00:48:49 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:48:49.782084 | orchestrator | 2026-01-06 00:48:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:52.811689 | orchestrator | 2026-01-06 00:48:52 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:52.813285 | orchestrator | 2026-01-06 00:48:52 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:52.815971 | orchestrator | 2026-01-06 00:48:52 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:52.816399 | orchestrator | 2026-01-06 00:48:52 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:52.818121 | orchestrator | 2026-01-06 00:48:52 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:48:52.818402 | orchestrator | 2026-01-06 00:48:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:55.869575 | orchestrator | 2026-01-06 00:48:55 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:55.870362 | orchestrator | 2026-01-06 00:48:55 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:55.871749 | orchestrator | 2026-01-06 00:48:55 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:55.873098 | orchestrator | 2026-01-06 00:48:55 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:55.874461 | orchestrator | 2026-01-06 00:48:55 | INFO  | Task 
23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:48:55.874500 | orchestrator | 2026-01-06 00:48:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:48:58.979020 | orchestrator | 2026-01-06 00:48:58 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:48:58.979346 | orchestrator | 2026-01-06 00:48:58 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:48:58.981037 | orchestrator | 2026-01-06 00:48:58 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:48:58.982786 | orchestrator | 2026-01-06 00:48:58 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:48:58.984352 | orchestrator | 2026-01-06 00:48:58 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:48:58.984449 | orchestrator | 2026-01-06 00:48:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:02.026104 | orchestrator | 2026-01-06 00:49:02 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:02.026902 | orchestrator | 2026-01-06 00:49:02 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:02.028782 | orchestrator | 2026-01-06 00:49:02 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:02.030376 | orchestrator | 2026-01-06 00:49:02 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:49:02.032473 | orchestrator | 2026-01-06 00:49:02 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:02.032568 | orchestrator | 2026-01-06 00:49:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:05.079898 | orchestrator | 2026-01-06 00:49:05 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:05.079985 | orchestrator | 2026-01-06 00:49:05 | INFO  | Task 
c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:05.079993 | orchestrator | 2026-01-06 00:49:05 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:05.079997 | orchestrator | 2026-01-06 00:49:05 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:49:05.080001 | orchestrator | 2026-01-06 00:49:05 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:05.080006 | orchestrator | 2026-01-06 00:49:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:08.113031 | orchestrator | 2026-01-06 00:49:08 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:08.113767 | orchestrator | 2026-01-06 00:49:08 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:08.114309 | orchestrator | 2026-01-06 00:49:08 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:08.119122 | orchestrator | 2026-01-06 00:49:08 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:49:08.120508 | orchestrator | 2026-01-06 00:49:08 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:08.120807 | orchestrator | 2026-01-06 00:49:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:11.167863 | orchestrator | 2026-01-06 00:49:11 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:11.168391 | orchestrator | 2026-01-06 00:49:11 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:11.169531 | orchestrator | 2026-01-06 00:49:11 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:11.170405 | orchestrator | 2026-01-06 00:49:11 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:49:11.171630 | orchestrator | 2026-01-06 00:49:11 | INFO  | Task 
23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:11.171696 | orchestrator | 2026-01-06 00:49:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:14.246713 | orchestrator | 2026-01-06 00:49:14 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:14.247196 | orchestrator | 2026-01-06 00:49:14 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:14.248358 | orchestrator | 2026-01-06 00:49:14 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:14.249317 | orchestrator | 2026-01-06 00:49:14 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:49:14.250399 | orchestrator | 2026-01-06 00:49:14 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:14.250429 | orchestrator | 2026-01-06 00:49:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:17.375807 | orchestrator | 2026-01-06 00:49:17 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:17.397302 | orchestrator | 2026-01-06 00:49:17 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:17.397401 | orchestrator | 2026-01-06 00:49:17 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:17.397415 | orchestrator | 2026-01-06 00:49:17 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:49:17.398879 | orchestrator | 2026-01-06 00:49:17 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:17.398925 | orchestrator | 2026-01-06 00:49:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:20.436144 | orchestrator | 2026-01-06 00:49:20 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:20.436285 | orchestrator | 2026-01-06 00:49:20 | INFO  | Task 
c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:20.437446 | orchestrator | 2026-01-06 00:49:20 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:20.439380 | orchestrator | 2026-01-06 00:49:20 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state STARTED 2026-01-06 00:49:20.440045 | orchestrator | 2026-01-06 00:49:20 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:20.440077 | orchestrator | 2026-01-06 00:49:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:23.539717 | orchestrator | 2026-01-06 00:49:23 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:23.539831 | orchestrator | 2026-01-06 00:49:23 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:23.540434 | orchestrator | 2026-01-06 00:49:23 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:49:23.541236 | orchestrator | 2026-01-06 00:49:23 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:23.542831 | orchestrator | 2026-01-06 00:49:23 | INFO  | Task 4bbe6a45-5c20-4723-bf16-a84ad8567e65 is in state SUCCESS 2026-01-06 00:49:23.544532 | orchestrator | 2026-01-06 00:49:23.544657 | orchestrator | 2026-01-06 00:49:23.544670 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-01-06 00:49:23.544680 | orchestrator | 2026-01-06 00:49:23.544689 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-01-06 00:49:23.544698 | orchestrator | Tuesday 06 January 2026 00:48:08 +0000 (0:00:00.424) 0:00:00.424 ******* 2026-01-06 00:49:23.544708 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:49:23.544717 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:49:23.544726 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:49:23.544734 | orchestrator | ok: 
[testbed-node-3] 2026-01-06 00:49:23.544742 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:49:23.544750 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:49:23.544758 | orchestrator | 2026-01-06 00:49:23.544766 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-01-06 00:49:23.544774 | orchestrator | Tuesday 06 January 2026 00:48:10 +0000 (0:00:01.508) 0:00:01.932 ******* 2026-01-06 00:49:23.544783 | orchestrator | ok: [testbed-node-0] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2026-01-06 00:49:23.544791 | orchestrator | ok: [testbed-node-1] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2026-01-06 00:49:23.544799 | orchestrator | ok: [testbed-node-2] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2026-01-06 00:49:23.544807 | orchestrator | ok: [testbed-node-3] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2026-01-06 00:49:23.544815 | orchestrator | ok: [testbed-node-4] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2026-01-06 00:49:23.544872 | orchestrator | ok: [testbed-node-5] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2026-01-06 00:49:23.544882 | orchestrator | 2026-01-06 00:49:23.544890 | orchestrator | PLAY [Apply role openvswitch] ************************************************** 2026-01-06 00:49:23.544898 | orchestrator | 2026-01-06 00:49:23.544920 | orchestrator | TASK [openvswitch : include_tasks] ********************************************* 2026-01-06 00:49:23.544929 | orchestrator | Tuesday 06 January 2026 00:48:11 +0000 (0:00:01.748) 0:00:03.681 ******* 2026-01-06 00:49:23.544938 | orchestrator | included: /ansible/roles/openvswitch/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:49:23.544948 | orchestrator | 2026-01-06 00:49:23.544956 | orchestrator | TASK [module-load : Load modules] ********************************************** 2026-01-06 
00:49:23.544964 | orchestrator | Tuesday 06 January 2026 00:48:13 +0000 (0:00:01.702) 0:00:05.384 ******* 2026-01-06 00:49:23.544972 | orchestrator | changed: [testbed-node-2] => (item=openvswitch) 2026-01-06 00:49:23.544980 | orchestrator | changed: [testbed-node-0] => (item=openvswitch) 2026-01-06 00:49:23.544988 | orchestrator | changed: [testbed-node-1] => (item=openvswitch) 2026-01-06 00:49:23.544996 | orchestrator | changed: [testbed-node-3] => (item=openvswitch) 2026-01-06 00:49:23.545004 | orchestrator | changed: [testbed-node-4] => (item=openvswitch) 2026-01-06 00:49:23.545012 | orchestrator | changed: [testbed-node-5] => (item=openvswitch) 2026-01-06 00:49:23.545020 | orchestrator | 2026-01-06 00:49:23.545027 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************ 2026-01-06 00:49:23.545035 | orchestrator | Tuesday 06 January 2026 00:48:15 +0000 (0:00:02.233) 0:00:07.617 ******* 2026-01-06 00:49:23.545043 | orchestrator | changed: [testbed-node-0] => (item=openvswitch) 2026-01-06 00:49:23.545052 | orchestrator | changed: [testbed-node-2] => (item=openvswitch) 2026-01-06 00:49:23.545060 | orchestrator | changed: [testbed-node-3] => (item=openvswitch) 2026-01-06 00:49:23.545067 | orchestrator | changed: [testbed-node-4] => (item=openvswitch) 2026-01-06 00:49:23.545075 | orchestrator | changed: [testbed-node-1] => (item=openvswitch) 2026-01-06 00:49:23.545083 | orchestrator | changed: [testbed-node-5] => (item=openvswitch) 2026-01-06 00:49:23.545091 | orchestrator | 2026-01-06 00:49:23.545098 | orchestrator | TASK [module-load : Drop module persistence] *********************************** 2026-01-06 00:49:23.545106 | orchestrator | Tuesday 06 January 2026 00:48:17 +0000 (0:00:02.047) 0:00:09.664 ******* 2026-01-06 00:49:23.545115 | orchestrator | skipping: [testbed-node-0] => (item=openvswitch)  2026-01-06 00:49:23.545125 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:23.545136 | orchestrator | 
skipping: [testbed-node-1] => (item=openvswitch)  2026-01-06 00:49:23.545146 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:23.545155 | orchestrator | skipping: [testbed-node-2] => (item=openvswitch)  2026-01-06 00:49:23.545164 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:23.545173 | orchestrator | skipping: [testbed-node-3] => (item=openvswitch)  2026-01-06 00:49:23.545183 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:23.545192 | orchestrator | skipping: [testbed-node-4] => (item=openvswitch)  2026-01-06 00:49:23.545202 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:23.545232 | orchestrator | skipping: [testbed-node-5] => (item=openvswitch)  2026-01-06 00:49:23.545241 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:23.545250 | orchestrator | 2026-01-06 00:49:23.545260 | orchestrator | TASK [openvswitch : Create /run/openvswitch directory on host] ***************** 2026-01-06 00:49:23.545269 | orchestrator | Tuesday 06 January 2026 00:48:19 +0000 (0:00:01.507) 0:00:11.172 ******* 2026-01-06 00:49:23.545278 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:23.545287 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:23.545296 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:23.545306 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:23.545315 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:23.545332 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:23.545341 | orchestrator | 2026-01-06 00:49:23.545350 | orchestrator | TASK [openvswitch : Ensuring config directories exist] ************************* 2026-01-06 00:49:23.545360 | orchestrator | Tuesday 06 January 2026 00:48:20 +0000 (0:00:00.971) 0:00:12.143 ******* 2026-01-06 00:49:23.545385 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 
'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-01-06 00:49:23.545401 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-01-06 00:49:23.545415 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 
'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-01-06 00:49:23.545425 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-01-06 00:49:23.545435 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-01-06 00:49:23.545445 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-01-06 00:49:23.545466 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-01-06 00:49:23.545481 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-01-06 00:49:23.545490 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 
'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545498 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545506 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545525 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545533 | orchestrator |
2026-01-06 00:49:23.545542 | orchestrator | TASK [openvswitch : Copying over config.json files for services] ***************
2026-01-06 00:49:23.545550 | orchestrator | Tuesday 06 January 2026 00:48:22 +0000 (0:00:01.956) 0:00:14.100 *******
2026-01-06 00:49:23.545558 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.545571 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.545580 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.545588 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.545605 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.545620 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.545628 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545641 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545649 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545657 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545671 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545686 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545694 | orchestrator |
2026-01-06 00:49:23.545702 | orchestrator | TASK [openvswitch : Copying over ovs-vsctl wrapper] ****************************
2026-01-06 00:49:23.545710 | orchestrator | Tuesday 06 January 2026 00:48:25 +0000 (0:00:03.336) 0:00:17.436 *******
2026-01-06 00:49:23.545719 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:23.545727 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:49:23.545735 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:49:23.545742 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:49:23.545750 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:49:23.545758 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:49:23.545766 | orchestrator |
2026-01-06 00:49:23.545774 | orchestrator | TASK [service-check-containers : openvswitch | Check containers] ***************
2026-01-06 00:49:23.545783 | orchestrator | Tuesday 06 January 2026 00:48:27 +0000 (0:00:01.825) 0:00:19.262 *******
2026-01-06 00:49:23.545795 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.545804 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.545817 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.545826 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.545840 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545848 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545860 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545869 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.545882 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.545890 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545904 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545912 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.545920 | orchestrator |
2026-01-06 00:49:23.545929 | orchestrator | TASK [service-check-containers : openvswitch | Notify handlers to restart containers] ***
2026-01-06 00:49:23.545937 | orchestrator | Tuesday 06 January 2026 00:48:30 +0000 (0:00:03.509) 0:00:22.771 *******
2026-01-06 00:49:23.545945 | orchestrator | changed: [testbed-node-0] => {
2026-01-06 00:49:23.545953 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 00:49:23.545962 | orchestrator | }
2026-01-06 00:49:23.545970 | orchestrator | changed: [testbed-node-1] => {
2026-01-06 00:49:23.545978 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 00:49:23.545986 | orchestrator | }
2026-01-06 00:49:23.545994 | orchestrator | changed: [testbed-node-2] => {
2026-01-06 00:49:23.546002 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 00:49:23.546010 | orchestrator | }
2026-01-06 00:49:23.546062 | orchestrator | changed: [testbed-node-3] => {
2026-01-06 00:49:23.546081 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 00:49:23.546090 | orchestrator | }
2026-01-06 00:49:23.546100 | orchestrator | changed: [testbed-node-4] => {
2026-01-06 00:49:23.546108 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 00:49:23.546116 | orchestrator | }
2026-01-06 00:49:23.546124 | orchestrator | changed: [testbed-node-5] => {
2026-01-06 00:49:23.546132 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 00:49:23.546140 | orchestrator | }
2026-01-06 00:49:23.546148 | orchestrator |
2026-01-06 00:49:23.546156 | orchestrator | TASK [service-check-containers : Include tasks] ********************************
2026-01-06 00:49:23.546165 | orchestrator | Tuesday 06 January 2026 00:48:32 +0000 (0:00:01.059) 0:00:23.831 *******
2026-01-06 00:49:23.546769 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.546792 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.546811 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.546821 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.546829 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:49:23.546837 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:23.546846 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.546864 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.546879 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.546887 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.546903 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.546912 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:49:23.546920 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:49:23.546929 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.546960 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:49:23.546969 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2026-01-06 00:49:23.546977 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2025.1', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2026-01-06 00:49:23.546985 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:49:23.546993 | orchestrator |
2026-01-06 00:49:23.547002 | orchestrator | TASK [openvswitch : Flush Handlers] ********************************************
2026-01-06 00:49:23.547011 | orchestrator | Tuesday 06 January 2026 00:48:34 +0000 (0:00:02.591) 0:00:26.422 *******
2026-01-06 00:49:23.547019 | orchestrator |
2026-01-06 00:49:23.547027 | orchestrator | TASK [openvswitch : Flush Handlers] ********************************************
2026-01-06 00:49:23.547035 | orchestrator | Tuesday 06 January 2026 00:48:34 +0000 (0:00:00.158) 0:00:26.581 *******
2026-01-06 00:49:23.547043 | orchestrator |
2026-01-06 00:49:23.547055 | orchestrator | TASK [openvswitch : Flush Handlers] ********************************************
2026-01-06 00:49:23.547063 | orchestrator | Tuesday 06 January 2026 00:48:34 +0000 (0:00:00.158) 0:00:26.739 *******
2026-01-06 00:49:23.547071 | orchestrator |
2026-01-06 00:49:23.547080 | orchestrator | TASK [openvswitch : Flush Handlers] ********************************************
2026-01-06 00:49:23.547088 | orchestrator | Tuesday 06 January 2026 00:48:35 +0000 (0:00:00.154) 0:00:26.894 *******
2026-01-06 00:49:23.547096 | orchestrator |
2026-01-06 00:49:23.547104 | orchestrator | TASK [openvswitch : Flush Handlers] ********************************************
2026-01-06 00:49:23.547112 | orchestrator | Tuesday 06 January 2026 00:48:35 +0000 (0:00:00.324) 0:00:27.219 *******
2026-01-06 00:49:23.547119 | orchestrator |
2026-01-06 00:49:23.547127 | orchestrator | TASK [openvswitch : Flush Handlers] ********************************************
2026-01-06 00:49:23.547136 | orchestrator | Tuesday 06 January 2026 00:48:35 +0000 (0:00:00.138) 0:00:27.357 *******
2026-01-06 00:49:23.547143 | orchestrator |
2026-01-06 00:49:23.547151 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-db-server container] ********
2026-01-06 00:49:23.547159 | orchestrator | Tuesday 06 January 2026 00:48:35 +0000 (0:00:00.133) 0:00:27.491 *******
2026-01-06 00:49:23.547168 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:23.547176 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:23.547184 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:23.547192 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:49:23.547200 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:49:23.547236 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:49:23.547245 | orchestrator |
2026-01-06 00:49:23.547253 | orchestrator | RUNNING HANDLER [openvswitch : Waiting for openvswitch_db service to be ready] ***
2026-01-06 00:49:23.547267 | orchestrator | Tuesday 06 January 2026 00:48:44 +0000 (0:00:08.831) 0:00:36.322 *******
2026-01-06 00:49:23.547284 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:23.547293 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:23.547301 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:23.547308 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:49:23.547316 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:49:23.547325 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:49:23.547333 | orchestrator |
2026-01-06 00:49:23.547340 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-vswitchd container] *********
2026-01-06 00:49:23.547349 | orchestrator | Tuesday 06 January 2026 00:48:47 +0000 (0:00:02.792) 0:00:39.115 *******
2026-01-06 00:49:23.547357 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:23.547365 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:23.547373 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:49:23.547381 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:23.547389 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:49:23.547397 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:49:23.547405 | orchestrator |
2026-01-06 00:49:23.547413 | orchestrator | TASK [openvswitch : Set system-id, hostname and hw-offload] ********************
2026-01-06 00:49:23.547421 | orchestrator | Tuesday 06 January 2026 00:48:56 +0000 (0:00:09.646) 0:00:48.762 *******
2026-01-06 00:49:23.547429 | orchestrator | changed: [testbed-node-2] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-2'})
2026-01-06 00:49:23.547437 | orchestrator | changed: [testbed-node-4] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-4'})
2026-01-06 00:49:23.547445 | orchestrator | changed: [testbed-node-0] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-0'})
2026-01-06 00:49:23.547454 | orchestrator | changed: [testbed-node-3] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-3'})
2026-01-06 00:49:23.547462 | orchestrator | changed: [testbed-node-1] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-1'})
2026-01-06 00:49:23.547470 | orchestrator | changed: [testbed-node-5] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-5'})
2026-01-06 00:49:23.547478 | orchestrator | changed: [testbed-node-2] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-2'})
2026-01-06 00:49:23.547487 | orchestrator | changed: [testbed-node-4] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-4'})
2026-01-06 00:49:23.547495 | orchestrator | changed: [testbed-node-3] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-3'})
2026-01-06 00:49:23.547503 | orchestrator | changed: [testbed-node-0] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-0'})
2026-01-06 00:49:23.547511 | orchestrator | changed: [testbed-node-5] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-5'})
2026-01-06 00:49:23.547518 | orchestrator | changed: [testbed-node-1] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-1'})
2026-01-06 00:49:23.547526 | orchestrator | ok: [testbed-node-2] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'})
2026-01-06 00:49:23.547534 | orchestrator | ok: [testbed-node-3] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'})
2026-01-06 00:49:23.547542 | orchestrator | ok: [testbed-node-4] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'})
2026-01-06 00:49:23.547550 | orchestrator | ok: [testbed-node-0] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'})
2026-01-06 00:49:23.547558 | orchestrator | ok: [testbed-node-5] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'})
2026-01-06 00:49:23.547566 | orchestrator | ok: [testbed-node-1] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'})
2026-01-06 00:49:23.547573 | orchestrator |
2026-01-06 00:49:23.547586 | orchestrator | TASK [openvswitch : Ensuring OVS bridge is properly setup] *********************
2026-01-06 00:49:23.547601 | orchestrator | Tuesday 06 January 2026 00:49:06 +0000 (0:00:09.224) 0:00:57.986 *******
2026-01-06 00:49:23.547609 |
orchestrator | skipping: [testbed-node-3] => (item=br-ex)  2026-01-06 00:49:23.547617 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:23.547625 | orchestrator | skipping: [testbed-node-4] => (item=br-ex)  2026-01-06 00:49:23.547633 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:23.547641 | orchestrator | skipping: [testbed-node-5] => (item=br-ex)  2026-01-06 00:49:23.547649 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:23.547657 | orchestrator | changed: [testbed-node-0] => (item=br-ex) 2026-01-06 00:49:23.547664 | orchestrator | changed: [testbed-node-2] => (item=br-ex) 2026-01-06 00:49:23.547672 | orchestrator | changed: [testbed-node-1] => (item=br-ex) 2026-01-06 00:49:23.547680 | orchestrator | 2026-01-06 00:49:23.547688 | orchestrator | TASK [openvswitch : Ensuring OVS ports are properly setup] ********************* 2026-01-06 00:49:23.547696 | orchestrator | Tuesday 06 January 2026 00:49:09 +0000 (0:00:02.841) 0:01:00.827 ******* 2026-01-06 00:49:23.547704 | orchestrator | skipping: [testbed-node-3] => (item=['br-ex', 'vxlan0'])  2026-01-06 00:49:23.547712 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:23.547720 | orchestrator | skipping: [testbed-node-4] => (item=['br-ex', 'vxlan0'])  2026-01-06 00:49:23.547727 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:23.547735 | orchestrator | skipping: [testbed-node-5] => (item=['br-ex', 'vxlan0'])  2026-01-06 00:49:23.547743 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:23.547751 | orchestrator | changed: [testbed-node-0] => (item=['br-ex', 'vxlan0']) 2026-01-06 00:49:23.547765 | orchestrator | changed: [testbed-node-1] => (item=['br-ex', 'vxlan0']) 2026-01-06 00:49:23.547773 | orchestrator | changed: [testbed-node-2] => (item=['br-ex', 'vxlan0']) 2026-01-06 00:49:23.547781 | orchestrator | 2026-01-06 00:49:23.547789 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-vswitchd container] ********* 2026-01-06 
00:49:23.547797 | orchestrator | Tuesday 06 January 2026 00:49:13 +0000 (0:00:04.212) 0:01:05.039 ******* 2026-01-06 00:49:23.547805 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:49:23.547813 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:49:23.547821 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:49:23.547829 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:49:23.547837 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:49:23.547845 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:49:23.547852 | orchestrator | 2026-01-06 00:49:23.547861 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:49:23.547869 | orchestrator | testbed-node-0 : ok=16  changed=12  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2026-01-06 00:49:23.547879 | orchestrator | testbed-node-1 : ok=16  changed=12  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2026-01-06 00:49:23.547887 | orchestrator | testbed-node-2 : ok=16  changed=12  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2026-01-06 00:49:23.547895 | orchestrator | testbed-node-3 : ok=14  changed=10  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2026-01-06 00:49:23.547903 | orchestrator | testbed-node-4 : ok=14  changed=10  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2026-01-06 00:49:23.547911 | orchestrator | testbed-node-5 : ok=14  changed=10  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2026-01-06 00:49:23.547919 | orchestrator | 2026-01-06 00:49:23.547927 | orchestrator | 2026-01-06 00:49:23.547935 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:49:23.547943 | orchestrator | Tuesday 06 January 2026 00:49:21 +0000 (0:00:07.896) 0:01:12.936 ******* 2026-01-06 00:49:23.547958 | orchestrator | =============================================================================== 2026-01-06 00:49:23.547966 | orchestrator | 
openvswitch : Restart openvswitch-vswitchd container ------------------- 17.54s 2026-01-06 00:49:23.547974 | orchestrator | openvswitch : Set system-id, hostname and hw-offload -------------------- 9.22s 2026-01-06 00:49:23.547982 | orchestrator | openvswitch : Restart openvswitch-db-server container ------------------- 8.83s 2026-01-06 00:49:23.547990 | orchestrator | openvswitch : Ensuring OVS ports are properly setup --------------------- 4.21s 2026-01-06 00:49:23.547998 | orchestrator | service-check-containers : openvswitch | Check containers --------------- 3.51s 2026-01-06 00:49:23.548005 | orchestrator | openvswitch : Copying over config.json files for services --------------- 3.34s 2026-01-06 00:49:23.548013 | orchestrator | openvswitch : Ensuring OVS bridge is properly setup --------------------- 2.84s 2026-01-06 00:49:23.548022 | orchestrator | openvswitch : Waiting for openvswitch_db service to be ready ------------ 2.79s 2026-01-06 00:49:23.548029 | orchestrator | service-check-containers : Include tasks -------------------------------- 2.59s 2026-01-06 00:49:23.548037 | orchestrator | module-load : Load modules ---------------------------------------------- 2.23s 2026-01-06 00:49:23.548045 | orchestrator | module-load : Persist modules via modules-load.d ------------------------ 2.05s 2026-01-06 00:49:23.548053 | orchestrator | openvswitch : Ensuring config directories exist ------------------------- 1.96s 2026-01-06 00:49:23.548061 | orchestrator | openvswitch : Copying over ovs-vsctl wrapper ---------------------------- 1.83s 2026-01-06 00:49:23.548069 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.75s 2026-01-06 00:49:23.548081 | orchestrator | openvswitch : include_tasks --------------------------------------------- 1.70s 2026-01-06 00:49:23.548089 | orchestrator | Group hosts based on Kolla action --------------------------------------- 1.51s 2026-01-06 00:49:23.548097 | orchestrator | module-load : 
Drop module persistence ----------------------------------- 1.51s 2026-01-06 00:49:23.548105 | orchestrator | openvswitch : Flush Handlers -------------------------------------------- 1.07s 2026-01-06 00:49:23.548113 | orchestrator | service-check-containers : openvswitch | Notify handlers to restart containers --- 1.06s 2026-01-06 00:49:23.548121 | orchestrator | openvswitch : Create /run/openvswitch directory on host ----------------- 0.97s 2026-01-06 00:49:23.548129 | orchestrator | 2026-01-06 00:49:23 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:23.548137 | orchestrator | 2026-01-06 00:49:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:26.576441 | orchestrator | 2026-01-06 00:49:26 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:26.577754 | orchestrator | 2026-01-06 00:49:26 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:26.578105 | orchestrator | 2026-01-06 00:49:26 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:49:26.578682 | orchestrator | 2026-01-06 00:49:26 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:26.583030 | orchestrator | 2026-01-06 00:49:26 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:26.583090 | orchestrator | 2026-01-06 00:49:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:29.621528 | orchestrator | 2026-01-06 00:49:29 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:29.621633 | orchestrator | 2026-01-06 00:49:29 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:29.622263 | orchestrator | 2026-01-06 00:49:29 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:49:29.622923 | orchestrator | 2026-01-06 00:49:29 | INFO  | Task 
96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:29.623744 | orchestrator | 2026-01-06 00:49:29 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:29.623801 | orchestrator | 2026-01-06 00:49:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:32.661508 | orchestrator | 2026-01-06 00:49:32 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:32.661602 | orchestrator | 2026-01-06 00:49:32 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:32.663739 | orchestrator | 2026-01-06 00:49:32 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:49:32.664110 | orchestrator | 2026-01-06 00:49:32 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:32.664748 | orchestrator | 2026-01-06 00:49:32 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:32.664781 | orchestrator | 2026-01-06 00:49:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:35.743637 | orchestrator | 2026-01-06 00:49:35 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:35.743745 | orchestrator | 2026-01-06 00:49:35 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:35.743760 | orchestrator | 2026-01-06 00:49:35 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:49:35.743772 | orchestrator | 2026-01-06 00:49:35 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:35.743783 | orchestrator | 2026-01-06 00:49:35 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:35.743795 | orchestrator | 2026-01-06 00:49:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:39.014748 | orchestrator | 2026-01-06 00:49:39 | INFO  | Task 
df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:39.014915 | orchestrator | 2026-01-06 00:49:39 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:39.016408 | orchestrator | 2026-01-06 00:49:39 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:49:39.022974 | orchestrator | 2026-01-06 00:49:39 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:39.023076 | orchestrator | 2026-01-06 00:49:39 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:39.023126 | orchestrator | 2026-01-06 00:49:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:42.053287 | orchestrator | 2026-01-06 00:49:42 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:42.054170 | orchestrator | 2026-01-06 00:49:42 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:42.056527 | orchestrator | 2026-01-06 00:49:42 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:49:42.057705 | orchestrator | 2026-01-06 00:49:42 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:42.059210 | orchestrator | 2026-01-06 00:49:42 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:42.059247 | orchestrator | 2026-01-06 00:49:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:45.199725 | orchestrator | 2026-01-06 00:49:45 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state STARTED 2026-01-06 00:49:45.199877 | orchestrator | 2026-01-06 00:49:45 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:45.200364 | orchestrator | 2026-01-06 00:49:45 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:49:45.201029 | orchestrator | 2026-01-06 00:49:45 | INFO  | Task 
96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:45.201754 | orchestrator | 2026-01-06 00:49:45 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:45.201776 | orchestrator | 2026-01-06 00:49:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:49:48.225306 | orchestrator | 2026-01-06 00:49:48 | INFO  | Task df615095-f4f8-4593-8d99-8881458f5087 is in state SUCCESS 2026-01-06 00:49:48.226760 | orchestrator | 2026-01-06 00:49:48.226802 | orchestrator | 2026-01-06 00:49:48.226808 | orchestrator | PLAY [Prepare all k3s nodes] *************************************************** 2026-01-06 00:49:48.226816 | orchestrator | 2026-01-06 00:49:48.226823 | orchestrator | TASK [k3s_prereq : Validating arguments against arg spec 'main' - Prerequisites] *** 2026-01-06 00:49:48.226830 | orchestrator | Tuesday 06 January 2026 00:45:17 +0000 (0:00:00.168) 0:00:00.168 ******* 2026-01-06 00:49:48.226837 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:49:48.226844 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:49:48.226850 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:49:48.226856 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:49:48.226862 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:49:48.226868 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:49:48.226874 | orchestrator | 2026-01-06 00:49:48.226880 | orchestrator | TASK [k3s_prereq : Set same timezone on every Server] ************************** 2026-01-06 00:49:48.226887 | orchestrator | Tuesday 06 January 2026 00:45:17 +0000 (0:00:00.755) 0:00:00.924 ******* 2026-01-06 00:49:48.226893 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.226900 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.226907 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.226913 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.226920 | orchestrator | skipping: [testbed-node-1] 2026-01-06 
00:49:48.226925 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.226929 | orchestrator | 2026-01-06 00:49:48.226933 | orchestrator | TASK [k3s_prereq : Set SELinux to disabled state] ****************************** 2026-01-06 00:49:48.226937 | orchestrator | Tuesday 06 January 2026 00:45:18 +0000 (0:00:00.719) 0:00:01.643 ******* 2026-01-06 00:49:48.226941 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.226945 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.226949 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.226953 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.226957 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.226961 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.226964 | orchestrator | 2026-01-06 00:49:48.226969 | orchestrator | TASK [k3s_prereq : Enable IPv4 forwarding] ************************************* 2026-01-06 00:49:48.226973 | orchestrator | Tuesday 06 January 2026 00:45:19 +0000 (0:00:00.831) 0:00:02.474 ******* 2026-01-06 00:49:48.226977 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:49:48.226980 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:49:48.226984 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:49:48.226988 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:49:48.226991 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:49:48.226995 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:49:48.226999 | orchestrator | 2026-01-06 00:49:48.227003 | orchestrator | TASK [k3s_prereq : Enable IPv6 forwarding] ************************************* 2026-01-06 00:49:48.227007 | orchestrator | Tuesday 06 January 2026 00:45:21 +0000 (0:00:01.961) 0:00:04.436 ******* 2026-01-06 00:49:48.227010 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:49:48.227014 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:49:48.227018 | orchestrator | changed: [testbed-node-5] 2026-01-06 
00:49:48.227022 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:49:48.227025 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:49:48.227029 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:49:48.227033 | orchestrator | 2026-01-06 00:49:48.227037 | orchestrator | TASK [k3s_prereq : Enable IPv6 router advertisements] ************************** 2026-01-06 00:49:48.227057 | orchestrator | Tuesday 06 January 2026 00:45:22 +0000 (0:00:01.445) 0:00:05.882 ******* 2026-01-06 00:49:48.227061 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:49:48.227064 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:49:48.227068 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:49:48.227072 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:49:48.227076 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:49:48.227079 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:49:48.227083 | orchestrator | 2026-01-06 00:49:48.227087 | orchestrator | TASK [k3s_prereq : Add br_netfilter to /etc/modules-load.d/] ******************* 2026-01-06 00:49:48.227096 | orchestrator | Tuesday 06 January 2026 00:45:23 +0000 (0:00:01.189) 0:00:07.071 ******* 2026-01-06 00:49:48.227100 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.227103 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.227107 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.227111 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.227114 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.227118 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.227122 | orchestrator | 2026-01-06 00:49:48.227125 | orchestrator | TASK [k3s_prereq : Load br_netfilter] ****************************************** 2026-01-06 00:49:48.227129 | orchestrator | Tuesday 06 January 2026 00:45:24 +0000 (0:00:00.801) 0:00:07.873 ******* 2026-01-06 00:49:48.227133 | orchestrator | skipping: [testbed-node-3] 2026-01-06 
00:49:48.227137 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.227141 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.227144 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.227206 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.227210 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.227214 | orchestrator | 2026-01-06 00:49:48.227218 | orchestrator | TASK [k3s_prereq : Set bridge-nf-call-iptables (just to be sure)] ************** 2026-01-06 00:49:48.227222 | orchestrator | Tuesday 06 January 2026 00:45:25 +0000 (0:00:00.951) 0:00:08.825 ******* 2026-01-06 00:49:48.227226 | orchestrator | skipping: [testbed-node-3] => (item=net.bridge.bridge-nf-call-iptables)  2026-01-06 00:49:48.227230 | orchestrator | skipping: [testbed-node-3] => (item=net.bridge.bridge-nf-call-ip6tables)  2026-01-06 00:49:48.227234 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.227238 | orchestrator | skipping: [testbed-node-4] => (item=net.bridge.bridge-nf-call-iptables)  2026-01-06 00:49:48.227242 | orchestrator | skipping: [testbed-node-4] => (item=net.bridge.bridge-nf-call-ip6tables)  2026-01-06 00:49:48.227246 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.227250 | orchestrator | skipping: [testbed-node-5] => (item=net.bridge.bridge-nf-call-iptables)  2026-01-06 00:49:48.227254 | orchestrator | skipping: [testbed-node-5] => (item=net.bridge.bridge-nf-call-ip6tables)  2026-01-06 00:49:48.227260 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.227266 | orchestrator | skipping: [testbed-node-0] => (item=net.bridge.bridge-nf-call-iptables)  2026-01-06 00:49:48.227284 | orchestrator | skipping: [testbed-node-0] => (item=net.bridge.bridge-nf-call-ip6tables)  2026-01-06 00:49:48.227291 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.227297 | orchestrator | skipping: [testbed-node-1] => (item=net.bridge.bridge-nf-call-iptables)  2026-01-06 00:49:48.227303 | 
orchestrator | skipping: [testbed-node-1] => (item=net.bridge.bridge-nf-call-ip6tables)  2026-01-06 00:49:48.227309 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.227315 | orchestrator | skipping: [testbed-node-2] => (item=net.bridge.bridge-nf-call-iptables)  2026-01-06 00:49:48.227321 | orchestrator | skipping: [testbed-node-2] => (item=net.bridge.bridge-nf-call-ip6tables)  2026-01-06 00:49:48.227327 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.227333 | orchestrator | 2026-01-06 00:49:48.227340 | orchestrator | TASK [k3s_prereq : Add /usr/local/bin to sudo secure_path] ********************* 2026-01-06 00:49:48.227347 | orchestrator | Tuesday 06 January 2026 00:45:26 +0000 (0:00:00.728) 0:00:09.554 ******* 2026-01-06 00:49:48.227359 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.227365 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.227371 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.227377 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.227383 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.227389 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.227395 | orchestrator | 2026-01-06 00:49:48.227401 | orchestrator | TASK [k3s_download : Validating arguments against arg spec 'main' - Manage the downloading of K3S binaries] *** 2026-01-06 00:49:48.227410 | orchestrator | Tuesday 06 January 2026 00:45:28 +0000 (0:00:01.610) 0:00:11.164 ******* 2026-01-06 00:49:48.227416 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:49:48.227422 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:49:48.227429 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:49:48.227434 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:49:48.227439 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:49:48.227443 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:49:48.227447 | orchestrator | 2026-01-06 00:49:48.227452 | orchestrator | TASK [k3s_download : 
Download k3s binary x64] ********************************** 2026-01-06 00:49:48.227456 | orchestrator | Tuesday 06 January 2026 00:45:28 +0000 (0:00:00.817) 0:00:11.982 ******* 2026-01-06 00:49:48.227461 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:49:48.227465 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:49:48.227469 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:49:48.227474 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:49:48.227478 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:49:48.227482 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:49:48.227487 | orchestrator | 2026-01-06 00:49:48.227491 | orchestrator | TASK [k3s_download : Download k3s binary arm64] ******************************** 2026-01-06 00:49:48.227495 | orchestrator | Tuesday 06 January 2026 00:45:34 +0000 (0:00:06.089) 0:00:18.071 ******* 2026-01-06 00:49:48.227500 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.227504 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.227509 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.227513 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.227518 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.227522 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.227527 | orchestrator | 2026-01-06 00:49:48.227531 | orchestrator | TASK [k3s_download : Download k3s binary armhf] ******************************** 2026-01-06 00:49:48.227536 | orchestrator | Tuesday 06 January 2026 00:45:36 +0000 (0:00:01.734) 0:00:19.806 ******* 2026-01-06 00:49:48.227540 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.227544 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.227549 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.227553 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.227558 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.227562 | orchestrator | 
skipping: [testbed-node-2] 2026-01-06 00:49:48.227566 | orchestrator | 2026-01-06 00:49:48.227575 | orchestrator | TASK [k3s_custom_registries : Validating arguments against arg spec 'main' - Configure the use of a custom container registry] *** 2026-01-06 00:49:48.227581 | orchestrator | Tuesday 06 January 2026 00:45:38 +0000 (0:00:02.300) 0:00:22.107 ******* 2026-01-06 00:49:48.227586 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.227591 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.227595 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.227599 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.227604 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.227608 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.227612 | orchestrator | 2026-01-06 00:49:48.227617 | orchestrator | TASK [k3s_custom_registries : Create directory /etc/rancher/k3s] *************** 2026-01-06 00:49:48.227622 | orchestrator | Tuesday 06 January 2026 00:45:39 +0000 (0:00:00.915) 0:00:23.022 ******* 2026-01-06 00:49:48.227626 | orchestrator | skipping: [testbed-node-3] => (item=rancher)  2026-01-06 00:49:48.227631 | orchestrator | skipping: [testbed-node-3] => (item=rancher/k3s)  2026-01-06 00:49:48.227638 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.227643 | orchestrator | skipping: [testbed-node-4] => (item=rancher)  2026-01-06 00:49:48.227647 | orchestrator | skipping: [testbed-node-4] => (item=rancher/k3s)  2026-01-06 00:49:48.227652 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.227656 | orchestrator | skipping: [testbed-node-5] => (item=rancher)  2026-01-06 00:49:48.227660 | orchestrator | skipping: [testbed-node-5] => (item=rancher/k3s)  2026-01-06 00:49:48.227665 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.227669 | orchestrator | skipping: [testbed-node-0] => (item=rancher)  2026-01-06 00:49:48.227674 | orchestrator | skipping: 
[testbed-node-0] => (item=rancher/k3s)  2026-01-06 00:49:48.227678 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.227682 | orchestrator | skipping: [testbed-node-1] => (item=rancher)  2026-01-06 00:49:48.227687 | orchestrator | skipping: [testbed-node-1] => (item=rancher/k3s)  2026-01-06 00:49:48.227691 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.227695 | orchestrator | skipping: [testbed-node-2] => (item=rancher)  2026-01-06 00:49:48.227699 | orchestrator | skipping: [testbed-node-2] => (item=rancher/k3s)  2026-01-06 00:49:48.227703 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.227707 | orchestrator | 2026-01-06 00:49:48.227711 | orchestrator | TASK [k3s_custom_registries : Insert registries into /etc/rancher/k3s/registries.yaml] *** 2026-01-06 00:49:48.227718 | orchestrator | Tuesday 06 January 2026 00:45:41 +0000 (0:00:01.892) 0:00:24.915 ******* 2026-01-06 00:49:48.227722 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.227726 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.227730 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.227734 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.227737 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.227741 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.227745 | orchestrator | 2026-01-06 00:49:48.227749 | orchestrator | TASK [k3s_custom_registries : Remove /etc/rancher/k3s/registries.yaml when no registries configured] *** 2026-01-06 00:49:48.227753 | orchestrator | Tuesday 06 January 2026 00:45:43 +0000 (0:00:01.476) 0:00:26.392 ******* 2026-01-06 00:49:48.227757 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.227761 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.227764 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.227768 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.227772 | orchestrator | skipping: 
[testbed-node-1] 2026-01-06 00:49:48.227776 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.227780 | orchestrator | 2026-01-06 00:49:48.227783 | orchestrator | PLAY [Deploy k3s master nodes] ************************************************* 2026-01-06 00:49:48.227787 | orchestrator | 2026-01-06 00:49:48.227791 | orchestrator | TASK [k3s_server : Validating arguments against arg spec 'main' - Setup k3s servers] *** 2026-01-06 00:49:48.227795 | orchestrator | Tuesday 06 January 2026 00:45:45 +0000 (0:00:01.750) 0:00:28.142 ******* 2026-01-06 00:49:48.227799 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:49:48.227803 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:49:48.227807 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:49:48.227810 | orchestrator | 2026-01-06 00:49:48.227814 | orchestrator | TASK [k3s_server : Stop k3s-init] ********************************************** 2026-01-06 00:49:48.227818 | orchestrator | Tuesday 06 January 2026 00:45:47 +0000 (0:00:02.523) 0:00:30.666 ******* 2026-01-06 00:49:48.227822 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:49:48.227827 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:49:48.227833 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:49:48.227839 | orchestrator | 2026-01-06 00:49:48.227845 | orchestrator | TASK [k3s_server : Stop k3s] *************************************************** 2026-01-06 00:49:48.227851 | orchestrator | Tuesday 06 January 2026 00:45:48 +0000 (0:00:01.428) 0:00:32.094 ******* 2026-01-06 00:49:48.227857 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:49:48.227863 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:49:48.227868 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:49:48.227878 | orchestrator | 2026-01-06 00:49:48.227884 | orchestrator | TASK [k3s_server : Clean previous runs of k3s-init] **************************** 2026-01-06 00:49:48.227889 | orchestrator | Tuesday 06 January 2026 00:45:50 +0000 (0:00:01.087) 0:00:33.181 ******* 
2026-01-06 00:49:48.227895 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:48.227900 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:48.227906 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:48.227912 | orchestrator |
2026-01-06 00:49:48.227918 | orchestrator | TASK [k3s_server : Deploy K3s http_proxy conf] *********************************
2026-01-06 00:49:48.227924 | orchestrator | Tuesday 06 January 2026 00:45:50 +0000 (0:00:00.882) 0:00:34.063 *******
2026-01-06 00:49:48.227930 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:48.227936 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:49:48.227990 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:49:48.227996 | orchestrator |
2026-01-06 00:49:48.228003 | orchestrator | TASK [k3s_server : Create /etc/rancher/k3s directory] **************************
2026-01-06 00:49:48.228009 | orchestrator | Tuesday 06 January 2026 00:45:51 +0000 (0:00:00.371) 0:00:34.435 *******
2026-01-06 00:49:48.228015 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228021 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:48.228027 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:48.228033 | orchestrator |
2026-01-06 00:49:48.228038 | orchestrator | TASK [k3s_server : Create custom resolv.conf for k3s] **************************
2026-01-06 00:49:48.228045 | orchestrator | Tuesday 06 January 2026 00:45:52 +0000 (0:00:00.949) 0:00:35.384 *******
2026-01-06 00:49:48.228056 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:48.228063 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228069 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:48.228075 | orchestrator |
2026-01-06 00:49:48.228082 | orchestrator | TASK [k3s_server : Deploy vip manifest] ****************************************
2026-01-06 00:49:48.228088 | orchestrator | Tuesday 06 January 2026 00:45:53 +0000 (0:00:01.743) 0:00:37.127 *******
2026-01-06 00:49:48.228095 | orchestrator | included: /ansible/roles/k3s_server/tasks/vip.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:49:48.228102 | orchestrator |
2026-01-06 00:49:48.228108 | orchestrator | TASK [k3s_server : Set _kube_vip_bgp_peers fact] *******************************
2026-01-06 00:49:48.228114 | orchestrator | Tuesday 06 January 2026 00:45:54 +0000 (0:00:00.955) 0:00:38.083 *******
2026-01-06 00:49:48.228120 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:48.228127 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:48.228134 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:48.228140 | orchestrator |
2026-01-06 00:49:48.228147 | orchestrator | TASK [k3s_server : Create manifests directory on first master] *****************
2026-01-06 00:49:48.228153 | orchestrator | Tuesday 06 January 2026 00:45:58 +0000 (0:00:03.994) 0:00:42.077 *******
2026-01-06 00:49:48.228159 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:49:48.228165 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228172 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:49:48.228179 | orchestrator |
2026-01-06 00:49:48.228237 | orchestrator | TASK [k3s_server : Download vip rbac manifest to first master] *****************
2026-01-06 00:49:48.228242 | orchestrator | Tuesday 06 January 2026 00:46:00 +0000 (0:00:01.177) 0:00:43.255 *******
2026-01-06 00:49:48.228245 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:49:48.228249 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:49:48.228253 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228257 | orchestrator |
2026-01-06 00:49:48.228261 | orchestrator | TASK [k3s_server : Copy vip manifest to first master] **************************
2026-01-06 00:49:48.228265 | orchestrator | Tuesday 06 January 2026 00:46:01 +0000 (0:00:01.722) 0:00:44.977 *******
2026-01-06 00:49:48.228269 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:49:48.228273 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:49:48.228276 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228280 | orchestrator |
2026-01-06 00:49:48.228284 | orchestrator | TASK [k3s_server : Deploy metallb manifest] ************************************
2026-01-06 00:49:48.228304 | orchestrator | Tuesday 06 January 2026 00:46:03 +0000 (0:00:02.117) 0:00:47.094 *******
2026-01-06 00:49:48.228308 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:48.228312 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:49:48.228316 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:49:48.228320 | orchestrator |
2026-01-06 00:49:48.228324 | orchestrator | TASK [k3s_server : Deploy kube-vip manifest] ***********************************
2026-01-06 00:49:48.228327 | orchestrator | Tuesday 06 January 2026 00:46:04 +0000 (0:00:00.881) 0:00:47.976 *******
2026-01-06 00:49:48.228331 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:48.228337 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:49:48.228343 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:49:48.228348 | orchestrator |
2026-01-06 00:49:48.228352 | orchestrator | TASK [k3s_server : Init cluster inside the transient k3s-init service] *********
2026-01-06 00:49:48.228356 | orchestrator | Tuesday 06 January 2026 00:46:05 +0000 (0:00:00.318) 0:00:48.294 *******
2026-01-06 00:49:48.228360 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:48.228363 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228367 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:48.228371 | orchestrator |
2026-01-06 00:49:48.228375 | orchestrator | TASK [k3s_server : Detect Kubernetes version for label compatibility] **********
2026-01-06 00:49:48.228378 | orchestrator | Tuesday 06 January 2026 00:46:06 +0000 (0:00:01.552) 0:00:49.847 *******
2026-01-06 00:49:48.228382 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:48.228386 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:48.228390 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:48.228393 | orchestrator |
2026-01-06 00:49:48.228397 | orchestrator | TASK [k3s_server : Set node role label selector based on Kubernetes version] ***
2026-01-06 00:49:48.228401 | orchestrator | Tuesday 06 January 2026 00:46:09 +0000 (0:00:03.017) 0:00:52.864 *******
2026-01-06 00:49:48.228405 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:48.228408 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:48.228412 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:48.228416 | orchestrator |
2026-01-06 00:49:48.228420 | orchestrator | TASK [k3s_server : Verify that all nodes actually joined (check k3s-init.service if this fails)] ***
2026-01-06 00:49:48.228424 | orchestrator | Tuesday 06 January 2026 00:46:10 +0000 (0:00:01.164) 0:00:54.028 *******
2026-01-06 00:49:48.228428 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left).
2026-01-06 00:49:48.228433 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left).
2026-01-06 00:49:48.228437 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left).
2026-01-06 00:49:48.228441 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left).
2026-01-06 00:49:48.228444 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left).
2026-01-06 00:49:48.228448 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left).
2026-01-06 00:49:48.228452 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left).
2026-01-06 00:49:48.228488 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left).
2026-01-06 00:49:48.228494 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left).
2026-01-06 00:49:48.228497 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left).
2026-01-06 00:49:48.228506 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left).
2026-01-06 00:49:48.228510 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left).
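The retried task above polls until every master has actually joined the cluster. A minimal sketch of that readiness check, assuming `kubectl get nodes --no-headers`-style output; the node names match this deployment, but the statuses and version strings below are illustrative, not taken from this build:

```shell
# Count nodes reporting Ready from `kubectl get nodes --no-headers` output.
# The playbook retries this kind of check (20 attempts here) until all
# three masters are Ready; node-2 is deliberately NotReady in this sample.
count_ready() {
  awk '$2 == "Ready" { n++ } END { print n+0 }'
}

sample='testbed-node-0   Ready      control-plane,etcd,master   54s   v0.0.0
testbed-node-1   Ready      control-plane,etcd,master   41s   v0.0.0
testbed-node-2   NotReady   control-plane,etcd,master   12s   v0.0.0'

ready=$(printf '%s\n' "$sample" | count_ready)
echo "ready=$ready"
```

In the live check the sample would be replaced by a real `kubectl get nodes --no-headers` call and the loop would keep retrying until the count matches the expected node total.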
2026-01-06 00:49:48.228514 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:48.228518 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:48.228522 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:48.228525 | orchestrator |
2026-01-06 00:49:48.228529 | orchestrator | TASK [k3s_server : Save logs of k3s-init.service] ******************************
2026-01-06 00:49:48.228533 | orchestrator | Tuesday 06 January 2026 00:46:55 +0000 (0:00:44.115) 0:01:38.143 *******
2026-01-06 00:49:48.228537 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:48.228541 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:49:48.228544 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:49:48.228548 | orchestrator |
2026-01-06 00:49:48.228552 | orchestrator | TASK [k3s_server : Kill the temporary service used for initialization] *********
2026-01-06 00:49:48.228556 | orchestrator | Tuesday 06 January 2026 00:46:55 +0000 (0:00:00.324) 0:01:38.467 *******
2026-01-06 00:49:48.228559 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228563 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:48.228567 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:48.228571 | orchestrator |
2026-01-06 00:49:48.228574 | orchestrator | TASK [k3s_server : Copy K3s service file] **************************************
2026-01-06 00:49:48.228578 | orchestrator | Tuesday 06 January 2026 00:46:56 +0000 (0:00:01.080) 0:01:39.547 *******
2026-01-06 00:49:48.228582 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228586 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:48.228589 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:48.228593 | orchestrator |
2026-01-06 00:49:48.228600 | orchestrator | TASK [k3s_server : Enable and check K3s service] *******************************
2026-01-06 00:49:48.228604 | orchestrator | Tuesday 06 January 2026 00:46:57 +0000 (0:00:01.563) 0:01:41.111 *******
2026-01-06 00:49:48.228608 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228612 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:48.228616 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:48.228620 | orchestrator |
2026-01-06 00:49:48.228624 | orchestrator | TASK [k3s_server : Wait for node-token] ****************************************
2026-01-06 00:49:48.228627 | orchestrator | Tuesday 06 January 2026 00:47:20 +0000 (0:00:22.642) 0:02:03.754 *******
2026-01-06 00:49:48.228631 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:48.228635 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:48.228639 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:48.228643 | orchestrator |
2026-01-06 00:49:48.228646 | orchestrator | TASK [k3s_server : Register node-token file access mode] ***********************
2026-01-06 00:49:48.228650 | orchestrator | Tuesday 06 January 2026 00:47:21 +0000 (0:00:00.953) 0:02:04.708 *******
2026-01-06 00:49:48.228654 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:48.228658 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:48.228662 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:48.228665 | orchestrator |
2026-01-06 00:49:48.228669 | orchestrator | TASK [k3s_server : Change file access node-token] ******************************
2026-01-06 00:49:48.228673 | orchestrator | Tuesday 06 January 2026 00:47:22 +0000 (0:00:00.893) 0:02:05.601 *******
2026-01-06 00:49:48.228677 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228681 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:48.228684 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:48.228688 | orchestrator |
2026-01-06 00:49:48.228692 | orchestrator | TASK [k3s_server : Read node-token from master] ********************************
2026-01-06 00:49:48.228696 | orchestrator | Tuesday 06 January 2026 00:47:23 +0000 (0:00:00.798) 0:02:06.400 *******
2026-01-06 00:49:48.228700 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:48.228703 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:48.228707 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:48.228711 | orchestrator |
2026-01-06 00:49:48.228719 | orchestrator | TASK [k3s_server : Store Master node-token] ************************************
2026-01-06 00:49:48.228723 | orchestrator | Tuesday 06 January 2026 00:47:24 +0000 (0:00:01.177) 0:02:07.577 *******
2026-01-06 00:49:48.228727 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:48.228730 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:48.228734 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:48.228738 | orchestrator |
2026-01-06 00:49:48.228742 | orchestrator | TASK [k3s_server : Restore node-token file access] *****************************
2026-01-06 00:49:48.228745 | orchestrator | Tuesday 06 January 2026 00:47:24 +0000 (0:00:00.336) 0:02:07.914 *******
2026-01-06 00:49:48.228749 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228753 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:48.228757 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:48.228761 | orchestrator |
2026-01-06 00:49:48.228765 | orchestrator | TASK [k3s_server : Create directory .kube] *************************************
2026-01-06 00:49:48.228769 | orchestrator | Tuesday 06 January 2026 00:47:25 +0000 (0:00:00.733) 0:02:08.648 *******
2026-01-06 00:49:48.228772 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228776 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:48.228780 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:48.228784 | orchestrator |
2026-01-06 00:49:48.228787 | orchestrator | TASK [k3s_server : Copy config file to user home directory] ********************
2026-01-06 00:49:48.228791 | orchestrator | Tuesday 06 January 2026 00:47:26 +0000 (0:00:00.718) 0:02:09.366 *******
2026-01-06 00:49:48.228795 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228801 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:48.228807 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:48.228813 | orchestrator |
2026-01-06 00:49:48.228820 | orchestrator | TASK [k3s_server : Configure kubectl cluster to https://192.168.16.8:6443] *****
2026-01-06 00:49:48.228830 | orchestrator | Tuesday 06 January 2026 00:47:27 +0000 (0:00:01.247) 0:02:10.613 *******
2026-01-06 00:49:48.228834 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:49:48.228838 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:49:48.228842 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:49:48.228846 | orchestrator |
2026-01-06 00:49:48.228850 | orchestrator | TASK [k3s_server : Create kubectl symlink] *************************************
2026-01-06 00:49:48.228853 | orchestrator | Tuesday 06 January 2026 00:47:28 +0000 (0:00:00.886) 0:02:11.500 *******
2026-01-06 00:49:48.228857 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:48.228861 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:49:48.228865 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:49:48.228868 | orchestrator |
2026-01-06 00:49:48.228872 | orchestrator | TASK [k3s_server : Create crictl symlink] **************************************
2026-01-06 00:49:48.228876 | orchestrator | Tuesday 06 January 2026 00:47:28 +0000 (0:00:00.313) 0:02:11.814 *******
2026-01-06 00:49:48.228880 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:48.228884 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:49:48.228887 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:49:48.228891 | orchestrator |
2026-01-06 00:49:48.228895 | orchestrator | TASK [k3s_server : Get contents of manifests folder] ***************************
2026-01-06 00:49:48.228899 | orchestrator | Tuesday 06 January 2026 00:47:28 +0000 (0:00:00.288) 0:02:12.102 *******
2026-01-06 00:49:48.228905 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:48.228911 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:48.228917 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:48.228923 | orchestrator |
2026-01-06 00:49:48.228929 | orchestrator | TASK [k3s_server : Get sub dirs of manifests folder] ***************************
2026-01-06 00:49:48.228935 | orchestrator | Tuesday 06 January 2026 00:47:29 +0000 (0:00:00.951) 0:02:13.054 *******
2026-01-06 00:49:48.228941 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:48.228947 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:48.228952 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:48.228957 | orchestrator |
2026-01-06 00:49:48.228964 | orchestrator | TASK [k3s_server : Remove manifests and folders that are only needed for bootstrapping cluster so k3s doesn't auto apply on start] ***
2026-01-06 00:49:48.228975 | orchestrator | Tuesday 06 January 2026 00:47:30 +0000 (0:00:00.688) 0:02:13.743 *******
2026-01-06 00:49:48.228982 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml)
2026-01-06 00:49:48.228992 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml)
2026-01-06 00:49:48.228997 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml)
2026-01-06 00:49:48.229003 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml)
2026-01-06 00:49:48.229009 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml)
2026-01-06 00:49:48.229014 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml)
2026-01-06 00:49:48.229020 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml)
2026-01-06 00:49:48.229027 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml)
2026-01-06 00:49:48.229033 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml)
2026-01-06 00:49:48.229039 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/vip.yaml)
2026-01-06 00:49:48.229045 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml)
2026-01-06 00:49:48.229050 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml)
2026-01-06 00:49:48.229056 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/vip-rbac.yaml)
2026-01-06 00:49:48.229062 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml)
2026-01-06 00:49:48.229068 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml)
2026-01-06 00:49:48.229073 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml)
2026-01-06 00:49:48.229079 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server)
2026-01-06 00:49:48.229085 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server)
2026-01-06 00:49:48.229091 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml)
2026-01-06 00:49:48.229096 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server)
2026-01-06 00:49:48.229102 | orchestrator |
2026-01-06 00:49:48.229108 | orchestrator | PLAY [Deploy k3s worker nodes] *************************************************
2026-01-06 00:49:48.229113 | orchestrator |
2026-01-06 00:49:48.229119 | orchestrator | TASK [k3s_agent : Validating arguments against arg spec 'main' - Setup k3s agents] ***
2026-01-06 00:49:48.229125 | orchestrator | Tuesday 06 January 2026 00:47:33 +0000 (0:00:03.090) 0:02:16.834 *******
2026-01-06 00:49:48.229131 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:49:48.229136 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:49:48.229142 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:49:48.229147 | orchestrator |
2026-01-06 00:49:48.229153 | orchestrator | TASK [k3s_agent : Check if system is PXE-booted] *******************************
2026-01-06 00:49:48.229158 | orchestrator | Tuesday 06 January 2026 00:47:34 +0000 (0:00:00.581) 0:02:17.415 *******
2026-01-06 00:49:48.229163 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:49:48.229169 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:49:48.229174 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:49:48.229179 | orchestrator |
2026-01-06 00:49:48.229206 | orchestrator | TASK [k3s_agent : Set fact for PXE-booted system] ******************************
2026-01-06 00:49:48.229212 | orchestrator | Tuesday 06 January 2026 00:47:34 +0000 (0:00:00.699) 0:02:18.115 *******
2026-01-06 00:49:48.229218 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:49:48.229224 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:49:48.229240 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:49:48.229246 | orchestrator |
2026-01-06 00:49:48.229252 | orchestrator | TASK [k3s_agent : Include http_proxy configuration tasks] **********************
2026-01-06 00:49:48.229258 | orchestrator | Tuesday 06 January 2026 00:47:35 +0000 (0:00:00.347) 0:02:18.463 *******
2026-01-06 00:49:48.229265 | orchestrator | included: /ansible/roles/k3s_agent/tasks/http_proxy.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:49:48.229271 | orchestrator |
2026-01-06 00:49:48.229277 | orchestrator | TASK [k3s_agent : Create k3s-node.service.d directory] *************************
2026-01-06 00:49:48.229283 | orchestrator | Tuesday 06 January 2026 00:47:36 +0000 (0:00:00.727) 0:02:19.190 *******
2026-01-06 00:49:48.229288 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:49:48.229294 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:49:48.229300 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:49:48.229306 | orchestrator |
2026-01-06 00:49:48.229312 | orchestrator | TASK [k3s_agent : Copy K3s http_proxy conf file] *******************************
2026-01-06 00:49:48.229318 | orchestrator | Tuesday 06 January 2026 00:47:36 +0000 (0:00:00.306) 0:02:19.496 *******
2026-01-06 00:49:48.229324 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:49:48.229330 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:49:48.229336 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:49:48.229342 | orchestrator |
2026-01-06 00:49:48.229348 | orchestrator | TASK [k3s_agent : Deploy K3s http_proxy conf] **********************************
2026-01-06 00:49:48.229354 | orchestrator | Tuesday 06 January 2026 00:47:36 +0000 (0:00:00.321) 0:02:19.818 *******
2026-01-06 00:49:48.229360 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:49:48.229365 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:49:48.229371 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:49:48.229376 | orchestrator |
2026-01-06 00:49:48.229381 | orchestrator | TASK [k3s_agent : Create /etc/rancher/k3s directory] ***************************
2026-01-06 00:49:48.229387 | orchestrator | Tuesday 06 January 2026 00:47:36 +0000 (0:00:00.305) 0:02:20.123 *******
2026-01-06 00:49:48.229393 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:49:48.229399 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:49:48.229404 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:49:48.229410 | orchestrator |
2026-01-06 00:49:48.229425 | orchestrator | TASK [k3s_agent : Create custom resolv.conf for k3s] ***************************
2026-01-06 00:49:48.229432 | orchestrator | Tuesday 06 January 2026 00:47:37 +0000 (0:00:00.942) 0:02:21.066 *******
2026-01-06 00:49:48.229438 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:49:48.229444 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:49:48.229450 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:49:48.229456 | orchestrator |
2026-01-06 00:49:48.229462 | orchestrator | TASK [k3s_agent : Configure the k3s service] ***********************************
2026-01-06 00:49:48.229468 | orchestrator | Tuesday 06 January 2026 00:47:39 +0000 (0:00:01.366) 0:02:22.432 *******
2026-01-06 00:49:48.229473 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:49:48.229479 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:49:48.229485 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:49:48.229491 | orchestrator |
2026-01-06 00:49:48.229497 | orchestrator | TASK [k3s_agent : Manage k3s service] ******************************************
2026-01-06 00:49:48.229503 | orchestrator | Tuesday 06 January 2026 00:47:40 +0000 (0:00:01.434) 0:02:23.866 *******
2026-01-06 00:49:48.229507 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:49:48.229511 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:49:48.229515 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:49:48.229518 | orchestrator |
2026-01-06 00:49:48.229522 | orchestrator | PLAY [Prepare kubeconfig file] *************************************************
2026-01-06 00:49:48.229526 | orchestrator |
2026-01-06 00:49:48.229531 | orchestrator | TASK [Get home directory of operator user] *************************************
2026-01-06 00:49:48.229538 | orchestrator | Tuesday 06 January 2026 00:47:51 +0000 (0:00:10.870) 0:02:34.737 *******
2026-01-06 00:49:48.229543 | orchestrator | ok: [testbed-manager]
2026-01-06 00:49:48.229557 | orchestrator |
2026-01-06 00:49:48.229563 | orchestrator | TASK [Create .kube directory] **************************************************
2026-01-06 00:49:48.229568 | orchestrator | Tuesday 06 January 2026 00:47:52 +0000 (0:00:00.891) 0:02:35.629 *******
2026-01-06 00:49:48.229575 | orchestrator | changed: [testbed-manager]
2026-01-06 00:49:48.229581 | orchestrator |
2026-01-06 00:49:48.229587 | orchestrator | TASK [Get kubeconfig file] *****************************************************
2026-01-06 00:49:48.229594 | orchestrator | Tuesday 06 January 2026 00:47:53 +0000 (0:00:00.504) 0:02:36.133 *******
2026-01-06 00:49:48.229600 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)]
2026-01-06 00:49:48.229606 | orchestrator |
2026-01-06 00:49:48.229612 | orchestrator | TASK [Write kubeconfig file] ***************************************************
2026-01-06 00:49:48.229618 | orchestrator | Tuesday 06 January 2026 00:47:53 +0000 (0:00:00.577) 0:02:36.710 *******
2026-01-06 00:49:48.229625 | orchestrator | changed: [testbed-manager]
2026-01-06 00:49:48.229631 | orchestrator |
2026-01-06 00:49:48.229638 | orchestrator | TASK [Change server address in the kubeconfig] *********************************
2026-01-06 00:49:48.229644 | orchestrator | Tuesday 06 January 2026 00:47:54 +0000 (0:00:00.877) 0:02:37.588 *******
2026-01-06 00:49:48.229650 | orchestrator | changed: [testbed-manager]
2026-01-06 00:49:48.229656 | orchestrator |
2026-01-06 00:49:48.229662 | orchestrator | TASK [Make kubeconfig available for use inside the manager service] ************
2026-01-06 00:49:48.229668 | orchestrator | Tuesday 06 January 2026 00:47:55 +0000 (0:00:00.553) 0:02:38.141 *******
2026-01-06 00:49:48.229674 | orchestrator | changed: [testbed-manager -> localhost]
2026-01-06 00:49:48.229681 | orchestrator |
2026-01-06 00:49:48.229687 | orchestrator | TASK [Change server address in the kubeconfig inside the manager service] ******
2026-01-06 00:49:48.229693 | orchestrator | Tuesday 06 January 2026 00:47:56 +0000 (0:00:01.600) 0:02:39.742 *******
2026-01-06 00:49:48.229699 | orchestrator | changed: [testbed-manager -> localhost]
2026-01-06 00:49:48.229704 | orchestrator |
2026-01-06 00:49:48.229710 | orchestrator | TASK [Set KUBECONFIG environment variable] *************************************
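The two "Change server address" tasks retarget the kubeconfig fetched from testbed-node-0: k3s writes its kubeconfig pointing at a localhost server, and the playbook rewrites it to the cluster address, `https://192.168.16.8:6443` (the address seen in the earlier "Configure kubectl cluster" task). A minimal sketch of that substitution, assuming the default `https://127.0.0.1:6443` server entry; the YAML fragment is illustrative:

```shell
# Minimal kubeconfig fragment with the localhost server k3s writes by default.
kubeconfig=$(mktemp)
cat > "$kubeconfig" <<'EOF'
clusters:
- cluster:
    server: https://127.0.0.1:6443
  name: default
EOF

# Retarget the server entry to the cluster address, as the
# "Change server address in the kubeconfig" tasks do.
sed -i 's|https://127.0.0.1:6443|https://192.168.16.8:6443|' "$kubeconfig"
new_server=$(grep -o 'https://[0-9.]*:6443' "$kubeconfig")
echo "$new_server"
rm -f "$kubeconfig"
```

The same rewrite is applied twice in the log: once for the operator user's `~/.kube` copy and once for the copy used inside the manager service.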
2026-01-06 00:49:48.229717 | orchestrator | Tuesday 06 January 2026 00:47:57 +0000 (0:00:00.826) 0:02:40.568 *******
2026-01-06 00:49:48.229723 | orchestrator | changed: [testbed-manager]
2026-01-06 00:49:48.229729 | orchestrator |
2026-01-06 00:49:48.229736 | orchestrator | TASK [Enable kubectl command line completion] **********************************
2026-01-06 00:49:48.229742 | orchestrator | Tuesday 06 January 2026 00:47:58 +0000 (0:00:00.679) 0:02:41.247 *******
2026-01-06 00:49:48.229748 | orchestrator | changed: [testbed-manager]
2026-01-06 00:49:48.229754 | orchestrator |
2026-01-06 00:49:48.229761 | orchestrator | PLAY [Apply role kubectl] ******************************************************
2026-01-06 00:49:48.229767 | orchestrator |
2026-01-06 00:49:48.229774 | orchestrator | TASK [kubectl : Gather variables for each operating system] ********************
2026-01-06 00:49:48.229780 | orchestrator | Tuesday 06 January 2026 00:47:58 +0000 (0:00:00.187) 0:02:41.750 *******
2026-01-06 00:49:48.229787 | orchestrator | ok: [testbed-manager]
2026-01-06 00:49:48.229792 | orchestrator |
2026-01-06 00:49:48.229798 | orchestrator | TASK [kubectl : Include distribution specific install tasks] *******************
2026-01-06 00:49:48.229804 | orchestrator | Tuesday 06 January 2026 00:47:58 +0000 (0:00:00.256) 0:02:41.938 *******
2026-01-06 00:49:48.229811 | orchestrator | included: /ansible/roles/kubectl/tasks/install-Debian-family.yml for testbed-manager
2026-01-06 00:49:48.229817 | orchestrator |
2026-01-06 00:49:48.229823 | orchestrator | TASK [kubectl : Remove old architecture-dependent repository] ******************
2026-01-06 00:49:48.229829 | orchestrator | Tuesday 06 January 2026 00:47:59 +0000 (0:00:00.851) 0:02:42.194 *******
2026-01-06 00:49:48.229835 | orchestrator | ok: [testbed-manager]
2026-01-06 00:49:48.229841 | orchestrator |
2026-01-06 00:49:48.229847 | orchestrator | TASK [kubectl : Install apt-transport-https package] ***************************
2026-01-06 00:49:48.229852 | orchestrator | Tuesday 06 January 2026 00:47:59 +0000 (0:00:02.046) 0:02:43.046 *******
2026-01-06 00:49:48.229858 | orchestrator | ok: [testbed-manager]
2026-01-06 00:49:48.229864 | orchestrator |
2026-01-06 00:49:48.229870 | orchestrator | TASK [kubectl : Add repository gpg key] ****************************************
2026-01-06 00:49:48.229882 | orchestrator | Tuesday 06 January 2026 00:48:01 +0000 (0:00:02.046) 0:02:45.092 *******
2026-01-06 00:49:48.229888 | orchestrator | changed: [testbed-manager]
2026-01-06 00:49:48.229894 | orchestrator |
2026-01-06 00:49:48.229900 | orchestrator | TASK [kubectl : Set permissions of gpg key] ************************************
2026-01-06 00:49:48.229906 | orchestrator | Tuesday 06 January 2026 00:48:03 +0000 (0:00:01.192) 0:02:46.285 *******
2026-01-06 00:49:48.229913 | orchestrator | ok: [testbed-manager]
2026-01-06 00:49:48.229919 | orchestrator |
2026-01-06 00:49:48.229931 | orchestrator | TASK [kubectl : Add repository Debian] *****************************************
2026-01-06 00:49:48.229937 | orchestrator | Tuesday 06 January 2026 00:48:03 +0000 (0:00:00.437) 0:02:46.722 *******
2026-01-06 00:49:48.229944 | orchestrator | changed: [testbed-manager]
2026-01-06 00:49:48.229949 | orchestrator |
2026-01-06 00:49:48.229955 | orchestrator | TASK [kubectl : Install required packages] *************************************
2026-01-06 00:49:48.229962 | orchestrator | Tuesday 06 January 2026 00:48:10 +0000 (0:00:07.176) 0:02:53.898 *******
2026-01-06 00:49:48.229969 | orchestrator | changed: [testbed-manager]
2026-01-06 00:49:48.229974 | orchestrator |
2026-01-06 00:49:48.229981 | orchestrator | TASK [kubectl : Remove kubectl symlink] ****************************************
2026-01-06 00:49:48.229987 | orchestrator | Tuesday 06 January 2026 00:48:23 +0000 (0:00:12.986) 0:03:06.885 *******
2026-01-06 00:49:48.229993 | orchestrator | ok: [testbed-manager]
2026-01-06 00:49:48.229999 | orchestrator |
2026-01-06 00:49:48.230005 | orchestrator | PLAY [Run post actions on master nodes] ****************************************
2026-01-06 00:49:48.230011 | orchestrator |
2026-01-06 00:49:48.230250 | orchestrator | TASK [k3s_server_post : Validating arguments against arg spec 'main' - Configure k3s cluster] ***
2026-01-06 00:49:48.230302 | orchestrator | Tuesday 06 January 2026 00:48:24 +0000 (0:00:00.551) 0:03:07.436 *******
2026-01-06 00:49:48.230372 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:49:48.230388 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:49:48.230395 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:49:48.230401 | orchestrator |
2026-01-06 00:49:48.230408 | orchestrator | TASK [k3s_server_post : Deploy calico] *****************************************
2026-01-06 00:49:48.230415 | orchestrator | Tuesday 06 January 2026 00:48:24 +0000 (0:00:00.318) 0:03:07.754 *******
2026-01-06 00:49:48.230421 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:48.230428 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:49:48.230434 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:49:48.230440 | orchestrator |
2026-01-06 00:49:48.230447 | orchestrator | TASK [k3s_server_post : Deploy cilium] *****************************************
2026-01-06 00:49:48.230453 | orchestrator | Tuesday 06 January 2026 00:48:24 +0000 (0:00:00.256) 0:03:08.010 *******
2026-01-06 00:49:48.230459 | orchestrator | included: /ansible/roles/k3s_server_post/tasks/cilium.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:49:48.230466 | orchestrator |
2026-01-06 00:49:48.230472 | orchestrator | TASK [k3s_server_post : Create tmp directory on first master] ******************
2026-01-06 00:49:48.230478 | orchestrator | Tuesday 06 January 2026 00:48:25 +0000 (0:00:00.754) 0:03:08.765 *******
2026-01-06 00:49:48.230485 | orchestrator | changed: [testbed-node-0 -> localhost]
2026-01-06 00:49:48.230491 | orchestrator |
2026-01-06 00:49:48.230497 | orchestrator | TASK [k3s_server_post : Wait for connectivity to kube VIP] *********************
2026-01-06 00:49:48.230504 | orchestrator | Tuesday 06 January 2026 00:48:26 +0000 (0:00:00.860) 0:03:09.625 *******
2026-01-06 00:49:48.230510 | orchestrator | ok: [testbed-node-0 -> localhost]
2026-01-06 00:49:48.230516 | orchestrator |
2026-01-06 00:49:48.230523 | orchestrator | TASK [k3s_server_post : Fail if kube VIP not reachable] ************************
2026-01-06 00:49:48.230529 | orchestrator | Tuesday 06 January 2026 00:48:27 +0000 (0:00:00.708) 0:03:10.333 *******
2026-01-06 00:49:48.230536 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:48.230542 | orchestrator |
2026-01-06 00:49:48.230548 | orchestrator | TASK [k3s_server_post : Test for existing Cilium install] **********************
2026-01-06 00:49:48.230554 | orchestrator | Tuesday 06 January 2026 00:48:27 +0000 (0:00:00.312) 0:03:10.646 *******
2026-01-06 00:49:48.230571 | orchestrator | ok: [testbed-node-0 -> localhost]
2026-01-06 00:49:48.230577 | orchestrator |
2026-01-06 00:49:48.230583 | orchestrator | TASK [k3s_server_post : Check Cilium version] **********************************
2026-01-06 00:49:48.230589 | orchestrator | Tuesday 06 January 2026 00:48:28 +0000 (0:00:00.793) 0:03:11.440 *******
2026-01-06 00:49:48.230598 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:48.230604 | orchestrator |
2026-01-06 00:49:48.230610 | orchestrator | TASK [k3s_server_post : Parse installed Cilium version] ************************
2026-01-06 00:49:48.230617 | orchestrator | Tuesday 06 January 2026 00:48:28 +0000 (0:00:00.133) 0:03:11.573 *******
2026-01-06 00:49:48.230622 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:48.230627 | orchestrator |
2026-01-06 00:49:48.230633 | orchestrator | TASK [k3s_server_post : Determine if Cilium needs update] **********************
2026-01-06 00:49:48.230638 | orchestrator | Tuesday 06 January 2026 00:48:28 +0000 (0:00:00.106) 0:03:11.680 *******
2026-01-06 00:49:48.230644 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:48.230650 | orchestrator |
2026-01-06 00:49:48.230656 | orchestrator | TASK [k3s_server_post : Log result] ********************************************
2026-01-06 00:49:48.230662 | orchestrator | Tuesday 06 January 2026 00:48:28 +0000 (0:00:00.102) 0:03:11.783 *******
2026-01-06 00:49:48.230668 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:49:48.230673 | orchestrator |
2026-01-06 00:49:48.230679 | orchestrator | TASK [k3s_server_post : Install Cilium] ****************************************
2026-01-06 00:49:48.230685 | orchestrator | Tuesday 06 January 2026 00:48:28 +0000 (0:00:00.105) 0:03:11.889 *******
2026-01-06 00:49:48.230691 | orchestrator | changed: [testbed-node-0 -> localhost]
2026-01-06 00:49:48.230696 | orchestrator |
2026-01-06 00:49:48.230722 | orchestrator | TASK [k3s_server_post : Wait for Cilium resources] *****************************
2026-01-06 00:49:48.230730 | orchestrator | Tuesday 06 January 2026 00:48:33 +0000 (0:00:04.333) 0:03:16.222 *******
2026-01-06 00:49:48.230736 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=deployment/cilium-operator)
2026-01-06 00:49:48.230743 | orchestrator | FAILED - RETRYING: [testbed-node-0 -> localhost]: Wait for Cilium resources (30 retries left).
2026-01-06 00:49:48.230749 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=daemonset/cilium) 2026-01-06 00:49:48.230762 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=deployment/hubble-relay) 2026-01-06 00:49:48.230768 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=deployment/hubble-ui) 2026-01-06 00:49:48.230773 | orchestrator | 2026-01-06 00:49:48.230779 | orchestrator | TASK [k3s_server_post : Set _cilium_bgp_neighbors fact] ************************ 2026-01-06 00:49:48.230785 | orchestrator | Tuesday 06 January 2026 00:49:18 +0000 (0:00:45.272) 0:04:01.495 ******* 2026-01-06 00:49:48.230803 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-01-06 00:49:48.230809 | orchestrator | 2026-01-06 00:49:48.230815 | orchestrator | TASK [k3s_server_post : Copy BGP manifests to first master] ******************** 2026-01-06 00:49:48.230821 | orchestrator | Tuesday 06 January 2026 00:49:19 +0000 (0:00:01.356) 0:04:02.852 ******* 2026-01-06 00:49:48.230827 | orchestrator | changed: [testbed-node-0 -> localhost] 2026-01-06 00:49:48.230834 | orchestrator | 2026-01-06 00:49:48.230840 | orchestrator | TASK [k3s_server_post : Apply BGP manifests] *********************************** 2026-01-06 00:49:48.230846 | orchestrator | Tuesday 06 January 2026 00:49:21 +0000 (0:00:01.643) 0:04:04.495 ******* 2026-01-06 00:49:48.230853 | orchestrator | changed: [testbed-node-0 -> localhost] 2026-01-06 00:49:48.230859 | orchestrator | 2026-01-06 00:49:48.230866 | orchestrator | TASK [k3s_server_post : Print error message if BGP manifests application fails] *** 2026-01-06 00:49:48.230872 | orchestrator | Tuesday 06 January 2026 00:49:22 +0000 (0:00:01.202) 0:04:05.698 ******* 2026-01-06 00:49:48.230879 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.230885 | orchestrator | 2026-01-06 00:49:48.230891 | orchestrator | TASK [k3s_server_post : Test for BGP config resources] ************************* 2026-01-06 00:49:48.230897 | orchestrator 
| Tuesday 06 January 2026 00:49:22 +0000 (0:00:00.135) 0:04:05.833 ******* 2026-01-06 00:49:48.230912 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=kubectl get CiliumBGPPeeringPolicy.cilium.io) 2026-01-06 00:49:48.230918 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=kubectl get CiliumLoadBalancerIPPool.cilium.io) 2026-01-06 00:49:48.230924 | orchestrator | 2026-01-06 00:49:48.230929 | orchestrator | TASK [k3s_server_post : Deploy metallb pool] *********************************** 2026-01-06 00:49:48.230935 | orchestrator | Tuesday 06 January 2026 00:49:24 +0000 (0:00:01.906) 0:04:07.740 ******* 2026-01-06 00:49:48.230941 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.230947 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.230952 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.230958 | orchestrator | 2026-01-06 00:49:48.230964 | orchestrator | TASK [k3s_server_post : Remove tmp directory used for manifests] *************** 2026-01-06 00:49:48.230969 | orchestrator | Tuesday 06 January 2026 00:49:24 +0000 (0:00:00.370) 0:04:08.110 ******* 2026-01-06 00:49:48.230976 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:49:48.230982 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:49:48.230987 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:49:48.230993 | orchestrator | 2026-01-06 00:49:48.230998 | orchestrator | PLAY [Apply role k9s] ********************************************************** 2026-01-06 00:49:48.231004 | orchestrator | 2026-01-06 00:49:48.231009 | orchestrator | TASK [k9s : Gather variables for each operating system] ************************ 2026-01-06 00:49:48.231016 | orchestrator | Tuesday 06 January 2026 00:49:26 +0000 (0:00:01.022) 0:04:09.132 ******* 2026-01-06 00:49:48.231022 | orchestrator | ok: [testbed-manager] 2026-01-06 00:49:48.231027 | orchestrator | 2026-01-06 00:49:48.231033 | orchestrator | TASK [k9s : Include distribution specific install tasks] 
*********************** 2026-01-06 00:49:48.231040 | orchestrator | Tuesday 06 January 2026 00:49:26 +0000 (0:00:00.118) 0:04:09.251 ******* 2026-01-06 00:49:48.231046 | orchestrator | included: /ansible/roles/k9s/tasks/install-Debian-family.yml for testbed-manager 2026-01-06 00:49:48.231052 | orchestrator | 2026-01-06 00:49:48.231057 | orchestrator | TASK [k9s : Install k9s packages] ********************************************** 2026-01-06 00:49:48.231063 | orchestrator | Tuesday 06 January 2026 00:49:26 +0000 (0:00:00.211) 0:04:09.462 ******* 2026-01-06 00:49:48.231069 | orchestrator | changed: [testbed-manager] 2026-01-06 00:49:48.231075 | orchestrator | 2026-01-06 00:49:48.231080 | orchestrator | PLAY [Manage labels, annotations, and taints on all k3s nodes] ***************** 2026-01-06 00:49:48.231086 | orchestrator | 2026-01-06 00:49:48.231092 | orchestrator | TASK [Merge labels, annotations, and taints] *********************************** 2026-01-06 00:49:48.231103 | orchestrator | Tuesday 06 January 2026 00:49:31 +0000 (0:00:05.173) 0:04:14.636 ******* 2026-01-06 00:49:48.231109 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:49:48.231116 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:49:48.231123 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:49:48.231130 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:49:48.231136 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:49:48.231238 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:49:48.231245 | orchestrator | 2026-01-06 00:49:48.231251 | orchestrator | TASK [Manage labels] *********************************************************** 2026-01-06 00:49:48.231258 | orchestrator | Tuesday 06 January 2026 00:49:32 +0000 (0:00:01.065) 0:04:15.701 ******* 2026-01-06 00:49:48.231264 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.osism.tech/compute-plane=true) 2026-01-06 00:49:48.231270 | orchestrator | ok: [testbed-node-5 -> localhost] => 
(item=node-role.osism.tech/compute-plane=true) 2026-01-06 00:49:48.231275 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/control-plane=true) 2026-01-06 00:49:48.231282 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/control-plane=true) 2026-01-06 00:49:48.231289 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.osism.tech/compute-plane=true) 2026-01-06 00:49:48.231296 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.kubernetes.io/worker=worker) 2026-01-06 00:49:48.231303 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.kubernetes.io/worker=worker) 2026-01-06 00:49:48.231319 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=openstack-control-plane=enabled) 2026-01-06 00:49:48.231326 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/control-plane=true) 2026-01-06 00:49:48.231333 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=openstack-control-plane=enabled) 2026-01-06 00:49:48.231340 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.kubernetes.io/worker=worker) 2026-01-06 00:49:48.231347 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.osism.tech/rook-osd=true) 2026-01-06 00:49:48.231363 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.osism.tech/rook-osd=true) 2026-01-06 00:49:48.231370 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/network-plane=true) 2026-01-06 00:49:48.231376 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=openstack-control-plane=enabled) 2026-01-06 00:49:48.231382 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/network-plane=true) 2026-01-06 00:49:48.231389 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.osism.tech/rook-osd=true) 2026-01-06 00:49:48.231395 | orchestrator | ok: [testbed-node-0 -> localhost] 
=> (item=node-role.osism.tech/rook-mds=true) 2026-01-06 00:49:48.231402 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mds=true) 2026-01-06 00:49:48.231408 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/network-plane=true) 2026-01-06 00:49:48.231415 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mgr=true) 2026-01-06 00:49:48.231422 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mds=true) 2026-01-06 00:49:48.231429 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mgr=true) 2026-01-06 00:49:48.231434 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mon=true) 2026-01-06 00:49:48.231440 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mgr=true) 2026-01-06 00:49:48.231447 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mon=true) 2026-01-06 00:49:48.231453 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2026-01-06 00:49:48.231460 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2026-01-06 00:49:48.231467 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mon=true) 2026-01-06 00:49:48.231473 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2026-01-06 00:49:48.231479 | orchestrator | 2026-01-06 00:49:48.231485 | orchestrator | TASK [Manage annotations] ****************************************************** 2026-01-06 00:49:48.231491 | orchestrator | Tuesday 06 January 2026 00:49:44 +0000 (0:00:11.864) 0:04:27.566 ******* 2026-01-06 00:49:48.231497 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.231504 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.231510 | orchestrator | 
skipping: [testbed-node-5] 2026-01-06 00:49:48.231516 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.231523 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.231529 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.231536 | orchestrator | 2026-01-06 00:49:48.231542 | orchestrator | TASK [Manage taints] *********************************************************** 2026-01-06 00:49:48.231549 | orchestrator | Tuesday 06 January 2026 00:49:45 +0000 (0:00:00.949) 0:04:28.516 ******* 2026-01-06 00:49:48.231555 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:49:48.231561 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:49:48.231567 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:49:48.231574 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:49:48.231582 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:49:48.231589 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:49:48.231603 | orchestrator | 2026-01-06 00:49:48.231611 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:49:48.231623 | orchestrator | testbed-manager : ok=21  changed=11  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 00:49:48.231632 | orchestrator | testbed-node-0 : ok=50  changed=23  unreachable=0 failed=0 skipped=28  rescued=0 ignored=0 2026-01-06 00:49:48.231639 | orchestrator | testbed-node-1 : ok=38  changed=16  unreachable=0 failed=0 skipped=25  rescued=0 ignored=0 2026-01-06 00:49:48.231646 | orchestrator | testbed-node-2 : ok=38  changed=16  unreachable=0 failed=0 skipped=25  rescued=0 ignored=0 2026-01-06 00:49:48.231653 | orchestrator | testbed-node-3 : ok=16  changed=8  unreachable=0 failed=0 skipped=17  rescued=0 ignored=0 2026-01-06 00:49:48.231660 | orchestrator | testbed-node-4 : ok=16  changed=8  unreachable=0 failed=0 skipped=17  rescued=0 ignored=0 2026-01-06 00:49:48.231666 | orchestrator | testbed-node-5 : ok=16  
changed=8  unreachable=0 failed=0 skipped=17  rescued=0 ignored=0 2026-01-06 00:49:48.231672 | orchestrator | 2026-01-06 00:49:48.231678 | orchestrator | 2026-01-06 00:49:48.231684 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:49:48.231690 | orchestrator | Tuesday 06 January 2026 00:49:45 +0000 (0:00:00.479) 0:04:28.996 ******* 2026-01-06 00:49:48.231697 | orchestrator | =============================================================================== 2026-01-06 00:49:48.231704 | orchestrator | k3s_server_post : Wait for Cilium resources ---------------------------- 45.27s 2026-01-06 00:49:48.231711 | orchestrator | k3s_server : Verify that all nodes actually joined (check k3s-init.service if this fails) -- 44.11s 2026-01-06 00:49:48.231719 | orchestrator | k3s_server : Enable and check K3s service ------------------------------ 22.64s 2026-01-06 00:49:48.231732 | orchestrator | kubectl : Install required packages ------------------------------------ 12.99s 2026-01-06 00:49:48.231739 | orchestrator | Manage labels ---------------------------------------------------------- 11.86s 2026-01-06 00:49:48.231746 | orchestrator | k3s_agent : Manage k3s service ----------------------------------------- 10.87s 2026-01-06 00:49:48.231753 | orchestrator | kubectl : Add repository Debian ----------------------------------------- 7.18s 2026-01-06 00:49:48.231760 | orchestrator | k3s_download : Download k3s binary x64 ---------------------------------- 6.09s 2026-01-06 00:49:48.231767 | orchestrator | k9s : Install k9s packages ---------------------------------------------- 5.17s 2026-01-06 00:49:48.231773 | orchestrator | k3s_server_post : Install Cilium ---------------------------------------- 4.33s 2026-01-06 00:49:48.231779 | orchestrator | k3s_server : Set _kube_vip_bgp_peers fact ------------------------------- 3.98s 2026-01-06 00:49:48.231785 | orchestrator | k3s_server : Remove manifests and folders that are 
only needed for bootstrapping cluster so k3s doesn't auto apply on start --- 3.09s 2026-01-06 00:49:48.231791 | orchestrator | k3s_server : Detect Kubernetes version for label compatibility ---------- 3.02s 2026-01-06 00:49:48.231797 | orchestrator | k3s_server : Validating arguments against arg spec 'main' - Setup k3s servers --- 2.52s 2026-01-06 00:49:48.231803 | orchestrator | k3s_download : Download k3s binary armhf -------------------------------- 2.30s 2026-01-06 00:49:48.231809 | orchestrator | k3s_server : Copy vip manifest to first master -------------------------- 2.12s 2026-01-06 00:49:48.231815 | orchestrator | kubectl : Install apt-transport-https package --------------------------- 2.05s 2026-01-06 00:49:48.231821 | orchestrator | k3s_prereq : Enable IPv4 forwarding ------------------------------------- 1.96s 2026-01-06 00:49:48.231826 | orchestrator | k3s_server_post : Test for BGP config resources ------------------------- 1.91s 2026-01-06 00:49:48.231832 | orchestrator | k3s_custom_registries : Create directory /etc/rancher/k3s --------------- 1.89s 2026-01-06 00:49:48.231843 | orchestrator | 2026-01-06 00:49:48 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:49:48.231850 | orchestrator | 2026-01-06 00:49:48 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:49:48.231857 | orchestrator | 2026-01-06 00:49:48 | INFO  | Task b6ac1578-6a4e-4956-98ea-c9a333b9c877 is in state STARTED 2026-01-06 00:49:48.231863 | orchestrator | 2026-01-06 00:49:48 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:49:48.231983 | orchestrator | 2026-01-06 00:49:48 | INFO  | Task 2f9b4341-1c07-408b-a4a0-48c316459066 is in state STARTED 2026-01-06 00:49:48.231994 | orchestrator | 2026-01-06 00:49:48 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:49:48.232001 | orchestrator | 2026-01-06 00:49:48 | INFO  | Wait 1 second(s) until the 
next check
2026-01-06 00:49:51.264446 | orchestrator | 2026-01-06 00:49:51 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:49:51.264585 | orchestrator | 2026-01-06 00:49:51 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:49:51.266209 | orchestrator | 2026-01-06 00:49:51 | INFO  | Task b6ac1578-6a4e-4956-98ea-c9a333b9c877 is in state STARTED
2026-01-06 00:49:51.266801 | orchestrator | 2026-01-06 00:49:51 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:49:51.267527 | orchestrator | 2026-01-06 00:49:51 | INFO  | Task 2f9b4341-1c07-408b-a4a0-48c316459066 is in state STARTED
2026-01-06 00:49:51.268449 | orchestrator | 2026-01-06 00:49:51 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED
2026-01-06 00:49:51.268599 | orchestrator | 2026-01-06 00:49:51 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:49:54.304893 | orchestrator | 2026-01-06 00:49:54 | INFO  | Task 2f9b4341-1c07-408b-a4a0-48c316459066 is in state SUCCESS
2026-01-06 00:50:00.399412 | orchestrator | 2026-01-06 00:50:00 | INFO  | Task b6ac1578-6a4e-4956-98ea-c9a333b9c877 is in state SUCCESS
2026-01-06 00:51:07.525810 | orchestrator | 2026-01-06 00:51:07 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:51:07.525999 | orchestrator | 2026-01-06 00:51:07 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:51:07.527488 | orchestrator | 2026-01-06 00:51:07 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:51:07.528207 | orchestrator | 2026-01-06 00:51:07 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED
2026-01-06 00:51:07.528267 | orchestrator | 2026-01-06 00:51:07 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:51:10.552881 | orchestrator | 2026-01-06 00:51:10 | INFO  | Task 
c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:51:10.553239 | orchestrator | 2026-01-06 00:51:10 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:51:10.555915 | orchestrator | 2026-01-06 00:51:10 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:51:10.556431 | orchestrator | 2026-01-06 00:51:10 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:51:10.556459 | orchestrator | 2026-01-06 00:51:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:13.585579 | orchestrator | 2026-01-06 00:51:13 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:51:13.587791 | orchestrator | 2026-01-06 00:51:13 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:51:13.587878 | orchestrator | 2026-01-06 00:51:13 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:51:13.587925 | orchestrator | 2026-01-06 00:51:13 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:51:13.587972 | orchestrator | 2026-01-06 00:51:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:16.628275 | orchestrator | 2026-01-06 00:51:16 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:51:16.628374 | orchestrator | 2026-01-06 00:51:16 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:51:16.629207 | orchestrator | 2026-01-06 00:51:16 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:51:16.630331 | orchestrator | 2026-01-06 00:51:16 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:51:16.630357 | orchestrator | 2026-01-06 00:51:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:19.680827 | orchestrator | 2026-01-06 00:51:19 | INFO  | Task 
c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:51:19.682951 | orchestrator | 2026-01-06 00:51:19 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:51:19.683015 | orchestrator | 2026-01-06 00:51:19 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:51:19.690792 | orchestrator | 2026-01-06 00:51:19 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:51:19.690853 | orchestrator | 2026-01-06 00:51:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:22.799467 | orchestrator | 2026-01-06 00:51:22 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:51:22.800150 | orchestrator | 2026-01-06 00:51:22 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:51:22.800682 | orchestrator | 2026-01-06 00:51:22 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:51:22.801575 | orchestrator | 2026-01-06 00:51:22 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:51:22.801638 | orchestrator | 2026-01-06 00:51:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:25.836959 | orchestrator | 2026-01-06 00:51:25 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:51:25.838211 | orchestrator | 2026-01-06 00:51:25 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:51:25.838990 | orchestrator | 2026-01-06 00:51:25 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:51:25.839763 | orchestrator | 2026-01-06 00:51:25 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:51:25.841464 | orchestrator | 2026-01-06 00:51:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:28.887221 | orchestrator | 2026-01-06 00:51:28 | INFO  | Task 
c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:51:28.889176 | orchestrator | 2026-01-06 00:51:28 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:51:28.891565 | orchestrator | 2026-01-06 00:51:28 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:51:28.893631 | orchestrator | 2026-01-06 00:51:28 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:51:28.893753 | orchestrator | 2026-01-06 00:51:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:31.946587 | orchestrator | 2026-01-06 00:51:31 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:51:31.946796 | orchestrator | 2026-01-06 00:51:31 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:51:31.947000 | orchestrator | 2026-01-06 00:51:31 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:51:31.947996 | orchestrator | 2026-01-06 00:51:31 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:51:31.948123 | orchestrator | 2026-01-06 00:51:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:34.992744 | orchestrator | 2026-01-06 00:51:34 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:51:34.994812 | orchestrator | 2026-01-06 00:51:34 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:51:34.996770 | orchestrator | 2026-01-06 00:51:34 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:51:34.997268 | orchestrator | 2026-01-06 00:51:34 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:51:34.997586 | orchestrator | 2026-01-06 00:51:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:38.047570 | orchestrator | 2026-01-06 00:51:38 | INFO  | Task 
c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:51:38.050200 | orchestrator | 2026-01-06 00:51:38 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:51:38.052912 | orchestrator | 2026-01-06 00:51:38 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:51:38.055299 | orchestrator | 2026-01-06 00:51:38 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:51:38.055482 | orchestrator | 2026-01-06 00:51:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:41.110743 | orchestrator | 2026-01-06 00:51:41 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:51:41.114702 | orchestrator | 2026-01-06 00:51:41 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:51:41.117769 | orchestrator | 2026-01-06 00:51:41 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:51:41.120664 | orchestrator | 2026-01-06 00:51:41 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:51:41.121262 | orchestrator | 2026-01-06 00:51:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:44.162274 | orchestrator | 2026-01-06 00:51:44 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:51:44.162917 | orchestrator | 2026-01-06 00:51:44 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED 2026-01-06 00:51:44.164877 | orchestrator | 2026-01-06 00:51:44 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:51:44.165824 | orchestrator | 2026-01-06 00:51:44 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED 2026-01-06 00:51:44.165909 | orchestrator | 2026-01-06 00:51:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:47.203120 | orchestrator | 2026-01-06 00:51:47 | INFO  | Task 
c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:51:47.203650 | orchestrator | 2026-01-06 00:51:47 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:51:47.206691 | orchestrator | 2026-01-06 00:51:47 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:51:47.207470 | orchestrator | 2026-01-06 00:51:47 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state STARTED
2026-01-06 00:51:47.207545 | orchestrator | 2026-01-06 00:51:47 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:51:50.239739 | orchestrator | 2026-01-06 00:51:50 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:51:50.240574 | orchestrator | 2026-01-06 00:51:50 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:51:50.241488 | orchestrator | 2026-01-06 00:51:50 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:51:50.244938 | orchestrator |
2026-01-06 00:51:50.244978 | orchestrator |
2026-01-06 00:51:50.244986 | orchestrator | PLAY [Copy kubeconfig to the configuration repository] *************************
2026-01-06 00:51:50.245047 | orchestrator |
2026-01-06 00:51:50.245054 | orchestrator | TASK [Get kubeconfig file] *****************************************************
2026-01-06 00:51:50.245061 | orchestrator | Tuesday 06 January 2026 00:49:50 +0000 (0:00:00.454) 0:00:00.454 *******
2026-01-06 00:51:50.245068 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)]
2026-01-06 00:51:50.245075 | orchestrator |
2026-01-06 00:51:50.245082 | orchestrator | TASK [Write kubeconfig file] ***************************************************
2026-01-06 00:51:50.245089 | orchestrator | Tuesday 06 January 2026 00:49:51 +0000 (0:00:00.746) 0:00:01.200 *******
2026-01-06 00:51:50.245095 | orchestrator | changed: [testbed-manager]
2026-01-06 00:51:50.245102 | orchestrator |
2026-01-06 00:51:50.245110 | orchestrator | TASK [Change server address in the kubeconfig file] ****************************
2026-01-06 00:51:50.245118 | orchestrator | Tuesday 06 January 2026 00:49:52 +0000 (0:00:01.141) 0:00:02.342 *******
2026-01-06 00:51:50.245125 | orchestrator | changed: [testbed-manager]
2026-01-06 00:51:50.245152 | orchestrator |
2026-01-06 00:51:50.245160 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:51:50.245166 | orchestrator | testbed-manager : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:51:50.245174 | orchestrator |
2026-01-06 00:51:50.245181 | orchestrator |
2026-01-06 00:51:50.245187 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:51:50.245194 | orchestrator | Tuesday 06 January 2026 00:49:53 +0000 (0:00:00.419) 0:00:02.762 *******
2026-01-06 00:51:50.245201 | orchestrator | ===============================================================================
2026-01-06 00:51:50.245207 | orchestrator | Write kubeconfig file --------------------------------------------------- 1.14s
2026-01-06 00:51:50.245213 | orchestrator | Get kubeconfig file ----------------------------------------------------- 0.75s
2026-01-06 00:51:50.245232 | orchestrator | Change server address in the kubeconfig file ---------------------------- 0.42s
2026-01-06 00:51:50.245238 | orchestrator |
2026-01-06 00:51:50.245244 | orchestrator |
2026-01-06 00:51:50.245269 | orchestrator | PLAY [Prepare kubeconfig file] *************************************************
2026-01-06 00:51:50.245276 | orchestrator |
2026-01-06 00:51:50.245282 | orchestrator | TASK [Get home directory of operator user] *************************************
2026-01-06 00:51:50.245289 | orchestrator | Tuesday 06 January 2026 00:49:50 +0000 (0:00:00.160) 0:00:00.160 *******
2026-01-06 00:51:50.245295 | orchestrator | ok: [testbed-manager]
2026-01-06 00:51:50.245302 | orchestrator |
2026-01-06 00:51:50.245309 | orchestrator | TASK [Create .kube directory] **************************************************
2026-01-06 00:51:50.245315 | orchestrator | Tuesday 06 January 2026 00:49:51 +0000 (0:00:00.551) 0:00:00.711 *******
2026-01-06 00:51:50.245321 | orchestrator | ok: [testbed-manager]
2026-01-06 00:51:50.245327 | orchestrator |
2026-01-06 00:51:50.245335 | orchestrator | TASK [Get kubeconfig file] *****************************************************
2026-01-06 00:51:50.245341 | orchestrator | Tuesday 06 January 2026 00:49:51 +0000 (0:00:00.538) 0:00:01.250 *******
2026-01-06 00:51:50.245348 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)]
2026-01-06 00:51:50.245355 | orchestrator |
2026-01-06 00:51:50.245361 | orchestrator | TASK [Write kubeconfig file] ***************************************************
2026-01-06 00:51:50.245368 | orchestrator | Tuesday 06 January 2026 00:49:52 +0000 (0:00:00.721) 0:00:01.971 *******
2026-01-06 00:51:50.245375 | orchestrator | changed: [testbed-manager]
2026-01-06 00:51:50.245381 | orchestrator |
2026-01-06 00:51:50.245388 | orchestrator | TASK [Change server address in the kubeconfig] *********************************
2026-01-06 00:51:50.245394 | orchestrator | Tuesday 06 January 2026 00:49:53 +0000 (0:00:01.188) 0:00:03.160 *******
2026-01-06 00:51:50.245401 | orchestrator | changed: [testbed-manager]
2026-01-06 00:51:50.245407 | orchestrator |
2026-01-06 00:51:50.245414 | orchestrator | TASK [Make kubeconfig available for use inside the manager service] ************
2026-01-06 00:51:50.245420 | orchestrator | Tuesday 06 January 2026 00:49:54 +0000 (0:00:00.551) 0:00:03.712 *******
2026-01-06 00:51:50.245427 | orchestrator | changed: [testbed-manager -> localhost]
2026-01-06 00:51:50.245433 | orchestrator |
2026-01-06 00:51:50.245439 | orchestrator | TASK [Change server address in the kubeconfig inside the manager service] ******
2026-01-06 00:51:50.245445 | orchestrator | Tuesday 06 January 2026 00:49:55 +0000 (0:00:01.502) 0:00:05.214 *******
2026-01-06 00:51:50.245451 | orchestrator | changed: [testbed-manager -> localhost]
2026-01-06 00:51:50.245457 | orchestrator |
2026-01-06 00:51:50.245463 | orchestrator | TASK [Set KUBECONFIG environment variable] *************************************
2026-01-06 00:51:50.245469 | orchestrator | Tuesday 06 January 2026 00:49:56 +0000 (0:00:00.816) 0:00:06.030 *******
2026-01-06 00:51:50.245475 | orchestrator | ok: [testbed-manager]
2026-01-06 00:51:50.245481 | orchestrator |
2026-01-06 00:51:50.245487 | orchestrator | TASK [Enable kubectl command line completion] **********************************
2026-01-06 00:51:50.245493 | orchestrator | Tuesday 06 January 2026 00:49:56 +0000 (0:00:00.390) 0:00:06.421 *******
2026-01-06 00:51:50.245499 | orchestrator | ok: [testbed-manager]
2026-01-06 00:51:50.245506 | orchestrator |
2026-01-06 00:51:50.245512 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:51:50.245519 | orchestrator | testbed-manager : ok=9  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 00:51:50.245525 | orchestrator |
2026-01-06 00:51:50.245531 | orchestrator |
2026-01-06 00:51:50.245538 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:51:50.245545 | orchestrator | Tuesday 06 January 2026 00:49:57 +0000 (0:00:00.313) 0:00:06.734 *******
2026-01-06 00:51:50.245552 | orchestrator | ===============================================================================
2026-01-06 00:51:50.245558 | orchestrator | Make kubeconfig available for use inside the manager service ------------ 1.50s
2026-01-06 00:51:50.245565 | orchestrator | Write kubeconfig file --------------------------------------------------- 1.19s
2026-01-06 00:51:50.245571 | orchestrator | Change server address in the kubeconfig inside the manager service ------ 0.82s
2026-01-06 00:51:50.245588 | orchestrator | Get kubeconfig file ----------------------------------------------------- 0.72s
2026-01-06 00:51:50.245602 | orchestrator | Change server address in the kubeconfig --------------------------------- 0.55s
2026-01-06 00:51:50.245608 | orchestrator | Get home directory of operator user ------------------------------------- 0.55s
2026-01-06 00:51:50.245615 | orchestrator | Create .kube directory -------------------------------------------------- 0.54s
2026-01-06 00:51:50.245622 | orchestrator | Set KUBECONFIG environment variable ------------------------------------- 0.39s
2026-01-06 00:51:50.245628 | orchestrator | Enable kubectl command line completion ---------------------------------- 0.31s
2026-01-06 00:51:50.245635 | orchestrator |
2026-01-06 00:51:50.245641 | orchestrator |
2026-01-06 00:51:50.245648 | orchestrator | PLAY [Set kolla_action_rabbitmq] ***********************************************
2026-01-06 00:51:50.245654 | orchestrator |
2026-01-06 00:51:50.245661 | orchestrator | TASK [Inform the user about the following task] ********************************
2026-01-06 00:51:50.245667 | orchestrator | Tuesday 06 January 2026 00:48:33 +0000 (0:00:00.214) 0:00:00.214 *******
2026-01-06 00:51:50.245674 | orchestrator | ok: [localhost] => {
2026-01-06 00:51:50.245681 | orchestrator |  "msg": "The task 'Check RabbitMQ service' fails if the RabbitMQ service has not yet been deployed. This is fine."
2026-01-06 00:51:50.245688 | orchestrator | }
2026-01-06 00:51:50.245695 | orchestrator |
2026-01-06 00:51:50.245702 | orchestrator | TASK [Check RabbitMQ service] **************************************************
2026-01-06 00:51:50.245709 | orchestrator | Tuesday 06 January 2026 00:48:34 +0000 (0:00:00.118) 0:00:00.332 *******
2026-01-06 00:51:50.245716 | orchestrator | fatal: [localhost]: FAILED!
=> {"changed": false, "elapsed": 2, "msg": "Timeout when waiting for search string RabbitMQ Management in 192.168.16.9:15672"}
2026-01-06 00:51:50.245724 | orchestrator | ...ignoring
2026-01-06 00:51:50.245731 | orchestrator |
2026-01-06 00:51:50.245738 | orchestrator | TASK [Set kolla_action_rabbitmq = upgrade if RabbitMQ is already running] ******
2026-01-06 00:51:50.245749 | orchestrator | Tuesday 06 January 2026 00:48:37 +0000 (0:00:03.258) 0:00:03.590 *******
2026-01-06 00:51:50.245755 | orchestrator | skipping: [localhost]
2026-01-06 00:51:50.245762 | orchestrator |
2026-01-06 00:51:50.245769 | orchestrator | TASK [Set kolla_action_rabbitmq = kolla_action_ng] *****************************
2026-01-06 00:51:50.245775 | orchestrator | Tuesday 06 January 2026 00:48:37 +0000 (0:00:00.072) 0:00:03.662 *******
2026-01-06 00:51:50.245782 | orchestrator | ok: [localhost]
2026-01-06 00:51:50.245789 | orchestrator |
2026-01-06 00:51:50.245795 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-01-06 00:51:50.245801 | orchestrator |
2026-01-06 00:51:50.245807 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-01-06 00:51:50.245814 | orchestrator | Tuesday 06 January 2026 00:48:37 +0000 (0:00:00.317) 0:00:04.028 *******
2026-01-06 00:51:50.245820 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:51:50.245827 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:51:50.245833 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:51:50.245840 | orchestrator |
2026-01-06 00:51:50.245847 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-01-06 00:51:50.245853 | orchestrator | Tuesday 06 January 2026 00:48:38 +0000 (0:00:00.317) 0:00:04.345 *******
2026-01-06 00:51:50.245860 | orchestrator | ok: [testbed-node-0] => (item=enable_rabbitmq_True)
2026-01-06 00:51:50.245867 | orchestrator | ok: [testbed-node-1] => (item=enable_rabbitmq_True)
2026-01-06 00:51:50.245874 | orchestrator | ok: [testbed-node-2] => (item=enable_rabbitmq_True)
2026-01-06 00:51:50.245880 | orchestrator |
2026-01-06 00:51:50.245887 | orchestrator | PLAY [Apply role rabbitmq] *****************************************************
2026-01-06 00:51:50.245894 | orchestrator |
2026-01-06 00:51:50.245900 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************
2026-01-06 00:51:50.245906 | orchestrator | Tuesday 06 January 2026 00:48:38 +0000 (0:00:00.656) 0:00:05.002 *******
2026-01-06 00:51:50.245912 | orchestrator | included: /ansible/roles/rabbitmq/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:51:50.245919 | orchestrator |
2026-01-06 00:51:50.245925 | orchestrator | TASK [rabbitmq : Get container facts] ******************************************
2026-01-06 00:51:50.245941 | orchestrator | Tuesday 06 January 2026 00:48:39 +0000 (0:00:00.510) 0:00:05.513 *******
2026-01-06 00:51:50.245947 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:51:50.245954 | orchestrator |
2026-01-06 00:51:50.245960 | orchestrator | TASK [rabbitmq : Get current RabbitMQ version] *********************************
2026-01-06 00:51:50.245966 | orchestrator | Tuesday 06 January 2026 00:48:40 +0000 (0:00:01.039) 0:00:06.552 *******
2026-01-06 00:51:50.245971 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:51:50.245978 | orchestrator |
2026-01-06 00:51:50.245984 | orchestrator | TASK [rabbitmq : Get new RabbitMQ version] *************************************
2026-01-06 00:51:50.246000 | orchestrator | Tuesday 06 January 2026 00:48:40 +0000 (0:00:00.257) 0:00:06.810 *******
2026-01-06 00:51:50.246006 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:51:50.246045 | orchestrator |
2026-01-06 00:51:50.246053 | orchestrator | TASK [rabbitmq : Check if running RabbitMQ is at most one version behind] ******
2026-01-06 00:51:50.246060 | orchestrator | Tuesday 06 January 2026 00:48:40 +0000 (0:00:00.253) 0:00:07.064 *******
2026-01-06 00:51:50.246066 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:51:50.246073 | orchestrator |
2026-01-06 00:51:50.246079 | orchestrator | TASK [rabbitmq : Catch when RabbitMQ is being downgraded] **********************
2026-01-06 00:51:50.246086 | orchestrator | Tuesday 06 January 2026 00:48:41 +0000 (0:00:00.264) 0:00:07.328 *******
2026-01-06 00:51:50.246093 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:51:50.246100 | orchestrator |
2026-01-06 00:51:50.246106 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************
2026-01-06 00:51:50.246113 | orchestrator | Tuesday 06 January 2026 00:48:41 +0000 (0:00:00.839) 0:00:08.167 *******
2026-01-06 00:51:50.246120 | orchestrator | included: /ansible/roles/rabbitmq/tasks/remove-ha-all-policy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:51:50.246126 | orchestrator |
2026-01-06 00:51:50.246133 | orchestrator | TASK [rabbitmq : Get container facts] ******************************************
2026-01-06 00:51:50.246145 | orchestrator | Tuesday 06 January 2026 00:48:42 +0000 (0:00:00.776) 0:00:08.944 *******
2026-01-06 00:51:50.246152 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:51:50.246159 | orchestrator |
2026-01-06 00:51:50.246165 | orchestrator | TASK [rabbitmq : List RabbitMQ policies] ***************************************
2026-01-06 00:51:50.246172 | orchestrator | Tuesday 06 January 2026 00:48:43 +0000 (0:00:00.809) 0:00:09.754 *******
2026-01-06 00:51:50.246178 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:51:50.246185 | orchestrator |
2026-01-06 00:51:50.246191 | orchestrator | TASK [rabbitmq : Remove ha-all policy from RabbitMQ] ***************************
2026-01-06 00:51:50.246198 | orchestrator | Tuesday 06 January 2026 00:48:43 +0000 (0:00:00.451) 0:00:10.206 *******
2026-01-06 00:51:50.246205 | orchestrator |
skipping: [testbed-node-0]
2026-01-06 00:51:50.246211 | orchestrator |
2026-01-06 00:51:50.246218 | orchestrator | TASK [rabbitmq : Ensuring config directories exist] ****************************
2026-01-06 00:51:50.246224 | orchestrator | Tuesday 06 January 2026 00:48:44 +0000 (0:00:00.769) 0:00:10.975 *******
2026-01-06 00:51:50.246237 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2026-01-06 00:51:50.246252 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2026-01-06 00:51:50.246260 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2026-01-06 00:51:50.246268 | orchestrator |
2026-01-06 00:51:50.246275 | orchestrator | TASK [rabbitmq : Copying over config.json files for services] ******************
2026-01-06 00:51:50.246281 | orchestrator | Tuesday 06 January 2026 00:48:46 +0000 (0:00:01.646) 0:00:12.621 *******
2026-01-06 00:51:50.246293 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2026-01-06 00:51:50.246303 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2026-01-06 00:51:50.246316 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2026-01-06 00:51:50.246323 | orchestrator |
2026-01-06 00:51:50.246330 | orchestrator | TASK [rabbitmq : Copying over rabbitmq-env.conf] *******************************
2026-01-06 00:51:50.246337 | orchestrator | Tuesday 06 January 2026 00:48:48 +0000 (0:00:02.576) 0:00:15.198 *******
2026-01-06 00:51:50.246344 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2)
2026-01-06 00:51:50.246350 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2)
2026-01-06 00:51:50.246357 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2)
2026-01-06 00:51:50.246363 | orchestrator |
2026-01-06 00:51:50.246370 | orchestrator | TASK [rabbitmq : Copying over rabbitmq.conf] ***********************************
2026-01-06 00:51:50.246377 | orchestrator | Tuesday 06 January 2026 00:48:50 +0000 (0:00:01.690) 0:00:16.889 *******
2026-01-06 00:51:50.246383 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2)
2026-01-06 00:51:50.246390 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2)
2026-01-06 00:51:50.246397 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2)
2026-01-06 00:51:50.246403 | orchestrator |
2026-01-06 00:51:50.246410 | orchestrator | TASK [rabbitmq : Copying over erl_inetrc] **************************************
2026-01-06 00:51:50.246419 | orchestrator | Tuesday 06 January 2026 00:48:52 +0000 (0:00:01.610) 0:00:18.500 *******
2026-01-06 00:51:50.246426 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2)
2026-01-06 00:51:50.246433 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2)
2026-01-06 00:51:50.246439 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2)
2026-01-06 00:51:50.246446 | orchestrator |
2026-01-06 00:51:50.246453 | orchestrator | TASK [rabbitmq : Copying over advanced.config] *********************************
2026-01-06 00:51:50.246459 | orchestrator | Tuesday 06 January 2026 00:48:53 +0000 (0:00:01.493) 0:00:19.994 *******
2026-01-06 00:51:50.246466 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2)
2026-01-06 00:51:50.246473 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2)
2026-01-06 00:51:50.246479 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2)
2026-01-06 00:51:50.246490 | orchestrator |
2026-01-06 00:51:50.246496 | orchestrator | TASK [rabbitmq : Copying over
definitions.json] ******************************** 2026-01-06 00:51:50.246503 | orchestrator | Tuesday 06 January 2026 00:48:55 +0000 (0:00:01.732) 0:00:21.726 ******* 2026-01-06 00:51:50.246509 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2) 2026-01-06 00:51:50.246516 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2) 2026-01-06 00:51:50.246523 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2) 2026-01-06 00:51:50.246529 | orchestrator | 2026-01-06 00:51:50.246536 | orchestrator | TASK [rabbitmq : Copying over enabled_plugins] ********************************* 2026-01-06 00:51:50.246543 | orchestrator | Tuesday 06 January 2026 00:48:56 +0000 (0:00:01.491) 0:00:23.218 ******* 2026-01-06 00:51:50.246549 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2) 2026-01-06 00:51:50.246556 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2) 2026-01-06 00:51:50.246563 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2) 2026-01-06 00:51:50.246569 | orchestrator | 2026-01-06 00:51:50.246576 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************ 2026-01-06 00:51:50.246582 | orchestrator | Tuesday 06 January 2026 00:48:59 +0000 (0:00:02.109) 0:00:25.328 ******* 2026-01-06 00:51:50.246589 | orchestrator | included: /ansible/roles/rabbitmq/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:51:50.246596 | orchestrator | 2026-01-06 00:51:50.246603 | orchestrator | TASK [service-cert-copy : rabbitmq | Copying over extra CA certificates] ******* 2026-01-06 00:51:50.246609 | orchestrator | Tuesday 06 January 2026 00:48:59 +0000 (0:00:00.842) 0:00:26.170 ******* 2026-01-06 
00:51:50.246616 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-01-06 00:51:50.246627 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-01-06 00:51:50.246659 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-01-06 00:51:50.246667 | orchestrator | 2026-01-06 00:51:50.246674 | orchestrator | TASK [service-cert-copy : rabbitmq | Copying over backend internal TLS certificate] *** 2026-01-06 00:51:50.246680 | orchestrator | Tuesday 06 January 2026 00:49:01 +0000 (0:00:01.516) 0:00:27.686 ******* 2026-01-06 00:51:50.246689 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-01-06 00:51:50.246700 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:51:50.246707 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-01-06 00:51:50.246714 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:51:50.246725 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': 
{'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-01-06 00:51:50.246736 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:51:50.246743 | orchestrator | 2026-01-06 00:51:50.246750 | orchestrator | TASK [service-cert-copy : rabbitmq | Copying over backend internal TLS key] **** 2026-01-06 00:51:50.246772 | orchestrator | Tuesday 06 January 2026 00:49:01 +0000 (0:00:00.434) 0:00:28.121 ******* 2026-01-06 00:51:50.246782 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': 
['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-01-06 00:51:50.246789 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:51:50.246797 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-01-06 00:51:50.246804 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:51:50.246811 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-01-06 00:51:50.246823 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:51:50.246831 | orchestrator | 2026-01-06 00:51:50.246837 | orchestrator | TASK [service-check-containers : rabbitmq | Check containers] ****************** 2026-01-06 00:51:50.246848 | orchestrator | Tuesday 06 January 2026 00:49:02 +0000 (0:00:01.124) 0:00:29.245 ******* 2026-01-06 00:51:50.246856 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': 
'30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-01-06 00:51:50.246866 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-01-06 00:51:50.246875 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-01-06 00:51:50.246883 | orchestrator | 2026-01-06 00:51:50.246891 | orchestrator | TASK [service-check-containers : rabbitmq | Notify handlers to restart containers] *** 2026-01-06 00:51:50.246898 | orchestrator | Tuesday 06 January 2026 00:49:04 +0000 (0:00:01.412) 0:00:30.658 ******* 2026-01-06 00:51:50.246910 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 00:51:50.246919 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:51:50.246926 | orchestrator | } 2026-01-06 00:51:50.246933 | orchestrator | changed: [testbed-node-1] => { 2026-01-06 00:51:50.246940 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:51:50.246948 | orchestrator | } 2026-01-06 00:51:50.246954 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 00:51:50.246960 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:51:50.246967 | orchestrator | } 2026-01-06 00:51:50.246974 | orchestrator | 2026-01-06 00:51:50.246980 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-01-06 00:51:50.246999 | orchestrator | Tuesday 06 January 2026 00:49:04 +0000 (0:00:00.488) 0:00:31.147 ******* 2026-01-06 00:51:50.247019 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-01-06 00:51:50.247027 | orchestrator | 2026-01-06 00:51:50 | INFO  | Task 23ce9a0e-a53f-40dd-8c36-b62d4bb5074a is in state SUCCESS 2026-01-06 00:51:50.247035 | orchestrator | 2026-01-06 00:51:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:51:50.247047 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  
2026-01-06 00:51:50.247055 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:51:50.247062 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:51:50.247069 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-01-06 00:51:50.247085 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:51:50.247092 | orchestrator | 2026-01-06 00:51:50.247099 | orchestrator | TASK [rabbitmq : Creating rabbitmq volume] ************************************* 2026-01-06 00:51:50.247106 | orchestrator | Tuesday 06 January 2026 00:49:05 +0000 (0:00:01.030) 0:00:32.177 ******* 2026-01-06 00:51:50.247113 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:51:50.247120 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:51:50.247127 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:51:50.247133 | orchestrator | 2026-01-06 00:51:50.247140 | orchestrator | TASK [rabbitmq : Running RabbitMQ bootstrap container] ************************* 2026-01-06 00:51:50.247147 | 
orchestrator | Tuesday 06 January 2026 00:49:07 +0000 (0:00:01.155) 0:00:33.332 ******* 2026-01-06 00:51:50.247154 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:51:50.247161 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:51:50.247168 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:51:50.247175 | orchestrator | 2026-01-06 00:51:50.247182 | orchestrator | RUNNING HANDLER [rabbitmq : Restart rabbitmq container] ************************ 2026-01-06 00:51:50.247189 | orchestrator | Tuesday 06 January 2026 00:49:16 +0000 (0:00:09.006) 0:00:42.339 ******* 2026-01-06 00:51:50.247196 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:51:50.247204 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:51:50.247211 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:51:50.247217 | orchestrator | 2026-01-06 00:51:50.247228 | orchestrator | PLAY [Restart rabbitmq services] *********************************************** 2026-01-06 00:51:50.247236 | orchestrator | 2026-01-06 00:51:50.247243 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] ******************************* 2026-01-06 00:51:50.247249 | orchestrator | Tuesday 06 January 2026 00:49:16 +0000 (0:00:00.950) 0:00:43.290 ******* 2026-01-06 00:51:50.247256 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:51:50.247263 | orchestrator | 2026-01-06 00:51:50.247270 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] ********************** 2026-01-06 00:51:50.247276 | orchestrator | Tuesday 06 January 2026 00:49:17 +0000 (0:00:00.787) 0:00:44.077 ******* 2026-01-06 00:51:50.247283 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:51:50.247290 | orchestrator | 2026-01-06 00:51:50.247297 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] *********************************** 2026-01-06 00:51:50.247304 | orchestrator | Tuesday 06 January 2026 00:49:18 +0000 (0:00:00.318) 0:00:44.397 ******* 2026-01-06 00:51:50.247312 | orchestrator 
| changed: [testbed-node-0] 2026-01-06 00:51:50.247318 | orchestrator | 2026-01-06 00:51:50.247325 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ******************************** 2026-01-06 00:51:50.247332 | orchestrator | Tuesday 06 January 2026 00:49:20 +0000 (0:00:02.200) 0:00:46.597 ******* 2026-01-06 00:51:50.247339 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:51:50.247346 | orchestrator | 2026-01-06 00:51:50.247352 | orchestrator | PLAY [Restart rabbitmq services] *********************************************** 2026-01-06 00:51:50.247360 | orchestrator | 2026-01-06 00:51:50.247367 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] ******************************* 2026-01-06 00:51:50.247375 | orchestrator | Tuesday 06 January 2026 00:51:13 +0000 (0:01:53.583) 0:02:40.180 ******* 2026-01-06 00:51:50.247382 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:51:50.247389 | orchestrator | 2026-01-06 00:51:50.247396 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] ********************** 2026-01-06 00:51:50.247403 | orchestrator | Tuesday 06 January 2026 00:51:14 +0000 (0:00:00.780) 0:02:40.961 ******* 2026-01-06 00:51:50.247410 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:51:50.247417 | orchestrator | 2026-01-06 00:51:50.247429 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] *********************************** 2026-01-06 00:51:50.247442 | orchestrator | Tuesday 06 January 2026 00:51:14 +0000 (0:00:00.129) 0:02:41.090 ******* 2026-01-06 00:51:50.247449 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:51:50.247456 | orchestrator | 2026-01-06 00:51:50.247463 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ******************************** 2026-01-06 00:51:50.247469 | orchestrator | Tuesday 06 January 2026 00:51:16 +0000 (0:00:01.737) 0:02:42.828 ******* 2026-01-06 00:51:50.247476 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:51:50.247483 
| orchestrator |
2026-01-06 00:51:50.247490 | orchestrator | PLAY [Restart rabbitmq services] ***********************************************
2026-01-06 00:51:50.247497 | orchestrator |
2026-01-06 00:51:50.247504 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] *******************************
2026-01-06 00:51:50.247511 | orchestrator | Tuesday 06 January 2026 00:51:31 +0000 (0:00:14.524) 0:02:57.352 *******
2026-01-06 00:51:50.247518 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:51:50.247540 | orchestrator |
2026-01-06 00:51:50.247548 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] **********************
2026-01-06 00:51:50.247556 | orchestrator | Tuesday 06 January 2026 00:51:31 +0000 (0:00:00.729) 0:02:58.082 *******
2026-01-06 00:51:50.247562 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:51:50.247569 | orchestrator |
2026-01-06 00:51:50.247576 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] ***********************************
2026-01-06 00:51:50.247583 | orchestrator | Tuesday 06 January 2026 00:51:31 +0000 (0:00:00.130) 0:02:58.212 *******
2026-01-06 00:51:50.247590 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:51:50.247597 | orchestrator |
2026-01-06 00:51:50.247604 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ********************************
2026-01-06 00:51:50.247612 | orchestrator | Tuesday 06 January 2026 00:51:33 +0000 (0:00:01.665) 0:02:59.878 *******
2026-01-06 00:51:50.247619 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:51:50.247626 | orchestrator |
2026-01-06 00:51:50.247633 | orchestrator | PLAY [Apply rabbitmq post-configuration] ***************************************
2026-01-06 00:51:50.247640 | orchestrator |
2026-01-06 00:51:50.247647 | orchestrator | TASK [Include rabbitmq post-deploy.yml] ****************************************
2026-01-06 00:51:50.247655 | orchestrator | Tuesday 06 January 2026 00:51:43 +0000 (0:00:09.885) 0:03:09.763 *******
2026-01-06 00:51:50.247662 | orchestrator | included: rabbitmq for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:51:50.247669 | orchestrator |
2026-01-06 00:51:50.247676 | orchestrator | TASK [rabbitmq : Enable all stable feature flags] ******************************
2026-01-06 00:51:50.247682 | orchestrator | Tuesday 06 January 2026 00:51:43 +0000 (0:00:00.529) 0:03:10.293 *******
2026-01-06 00:51:50.247689 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:51:50.247696 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:51:50.247703 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:51:50.247709 | orchestrator |
2026-01-06 00:51:50.247717 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:51:50.247724 | orchestrator | localhost : ok=3  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1
2026-01-06 00:51:50.247731 | orchestrator | testbed-node-0 : ok=26  changed=16  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0
2026-01-06 00:51:50.247738 | orchestrator | testbed-node-1 : ok=24  changed=16  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0
2026-01-06 00:51:50.247746 | orchestrator | testbed-node-2 : ok=24  changed=16  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0
2026-01-06 00:51:50.247753 | orchestrator |
2026-01-06 00:51:50.247760 | orchestrator |
2026-01-06 00:51:50.247767 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:51:50.247773 | orchestrator | Tuesday 06 January 2026 00:51:47 +0000 (0:00:03.109) 0:03:13.403 *******
2026-01-06 00:51:50.247786 | orchestrator | ===============================================================================
2026-01-06 00:51:50.247799 | orchestrator | rabbitmq : Waiting for rabbitmq to start ------------------------------ 137.99s
2026-01-06 00:51:50.247806 | orchestrator | rabbitmq : Running RabbitMQ bootstrap container ------------------------- 9.01s
2026-01-06 00:51:50.247813 | orchestrator | rabbitmq : Restart rabbitmq container ----------------------------------- 5.60s
2026-01-06 00:51:50.247820 | orchestrator | Check RabbitMQ service -------------------------------------------------- 3.26s
2026-01-06 00:51:50.247827 | orchestrator | rabbitmq : Enable all stable feature flags ------------------------------ 3.11s
2026-01-06 00:51:50.247834 | orchestrator | rabbitmq : Copying over config.json files for services ------------------ 2.58s
2026-01-06 00:51:50.247841 | orchestrator | rabbitmq : Get info on RabbitMQ container ------------------------------- 2.30s
2026-01-06 00:51:50.247848 | orchestrator | rabbitmq : Copying over enabled_plugins --------------------------------- 2.11s
2026-01-06 00:51:50.247855 | orchestrator | rabbitmq : Copying over advanced.config --------------------------------- 1.73s
2026-01-06 00:51:50.247862 | orchestrator | rabbitmq : Copying over rabbitmq-env.conf ------------------------------- 1.69s
2026-01-06 00:51:50.247869 | orchestrator | rabbitmq : Ensuring config directories exist ---------------------------- 1.65s
2026-01-06 00:51:50.247876 | orchestrator | rabbitmq : Copying over rabbitmq.conf ----------------------------------- 1.61s
2026-01-06 00:51:50.247883 | orchestrator | service-cert-copy : rabbitmq | Copying over extra CA certificates ------- 1.52s
2026-01-06 00:51:50.247890 | orchestrator | rabbitmq : Copying over erl_inetrc -------------------------------------- 1.49s
2026-01-06 00:51:50.247898 | orchestrator | rabbitmq : Copying over definitions.json -------------------------------- 1.49s
2026-01-06 00:51:50.247905 | orchestrator | service-check-containers : rabbitmq | Check containers ------------------ 1.41s
2026-01-06 00:51:50.247915 | orchestrator | rabbitmq : Creating rabbitmq volume ------------------------------------- 1.16s
2026-01-06 00:51:50.247923 | orchestrator | service-cert-copy : rabbitmq | Copying over backend internal TLS key ---- 1.12s
2026-01-06 00:51:50.247930 | orchestrator | rabbitmq : Get container facts ------------------------------------------ 1.04s
2026-01-06 00:51:50.247937 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.03s
2026-01-06 00:51:53.284618 | orchestrator | 2026-01-06 00:51:53 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:51:53.286804 | orchestrator | 2026-01-06 00:51:53 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:51:53.288848 | orchestrator | 2026-01-06 00:51:53 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:51:53.288918 | orchestrator | 2026-01-06 00:51:53 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:51:56.330402 | orchestrator | 2026-01-06 00:51:56 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:51:56.332222 | orchestrator | 2026-01-06 00:51:56 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:51:56.332273 | orchestrator | 2026-01-06 00:51:56 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:51:56.332293 | orchestrator | 2026-01-06 00:51:56 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:51:59.373306 | orchestrator | 2026-01-06 00:51:59 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:51:59.374522 | orchestrator | 2026-01-06 00:51:59 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:51:59.377595 | orchestrator | 2026-01-06 00:51:59 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:51:59.377663 | orchestrator | 2026-01-06 00:51:59 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:02.438162 | orchestrator | 2026-01-06 00:52:02 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:02.438307 | orchestrator | 2026-01-06 00:52:02 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:02.439067 | orchestrator | 2026-01-06 00:52:02 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:02.439101 | orchestrator | 2026-01-06 00:52:02 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:05.474399 | orchestrator | 2026-01-06 00:52:05 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:05.475181 | orchestrator | 2026-01-06 00:52:05 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:05.476279 | orchestrator | 2026-01-06 00:52:05 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:05.476316 | orchestrator | 2026-01-06 00:52:05 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:08.518137 | orchestrator | 2026-01-06 00:52:08 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:08.519521 | orchestrator | 2026-01-06 00:52:08 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:08.519551 | orchestrator | 2026-01-06 00:52:08 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:08.519561 | orchestrator | 2026-01-06 00:52:08 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:11.564681 | orchestrator | 2026-01-06 00:52:11 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:11.566456 | orchestrator | 2026-01-06 00:52:11 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:11.568571 | orchestrator | 2026-01-06 00:52:11 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:11.568622 | orchestrator | 2026-01-06 00:52:11 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:14.609081 | orchestrator | 2026-01-06 00:52:14 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:14.609183 | orchestrator | 2026-01-06 00:52:14 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:14.609198 | orchestrator | 2026-01-06 00:52:14 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:14.609210 | orchestrator | 2026-01-06 00:52:14 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:17.646173 | orchestrator | 2026-01-06 00:52:17 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:17.646278 | orchestrator | 2026-01-06 00:52:17 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:17.646812 | orchestrator | 2026-01-06 00:52:17 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:17.646849 | orchestrator | 2026-01-06 00:52:17 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:20.681117 | orchestrator | 2026-01-06 00:52:20 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:20.682866 | orchestrator | 2026-01-06 00:52:20 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:20.685165 | orchestrator | 2026-01-06 00:52:20 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:20.685513 | orchestrator | 2026-01-06 00:52:20 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:23.726600 | orchestrator | 2026-01-06 00:52:23 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:23.727443 | orchestrator | 2026-01-06 00:52:23 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:23.728860 | orchestrator | 2026-01-06 00:52:23 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:23.728887 | orchestrator | 2026-01-06 00:52:23 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:26.795815 | orchestrator | 2026-01-06 00:52:26 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:26.798836 | orchestrator | 2026-01-06 00:52:26 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:26.799411 | orchestrator | 2026-01-06 00:52:26 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:26.799437 | orchestrator | 2026-01-06 00:52:26 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:29.823807 | orchestrator | 2026-01-06 00:52:29 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:29.825427 | orchestrator | 2026-01-06 00:52:29 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:29.826358 | orchestrator | 2026-01-06 00:52:29 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:29.826410 | orchestrator | 2026-01-06 00:52:29 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:32.873355 | orchestrator | 2026-01-06 00:52:32 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:32.874546 | orchestrator | 2026-01-06 00:52:32 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:32.876575 | orchestrator | 2026-01-06 00:52:32 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:32.876669 | orchestrator | 2026-01-06 00:52:32 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:35.959989 | orchestrator | 2026-01-06 00:52:35 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:35.962389 | orchestrator | 2026-01-06 00:52:35 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:35.964582 | orchestrator | 2026-01-06 00:52:35 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:35.964639 | orchestrator | 2026-01-06 00:52:35 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:39.059492 | orchestrator | 2026-01-06 00:52:39 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:39.060115 | orchestrator | 2026-01-06 00:52:39 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:39.060887 | orchestrator | 2026-01-06 00:52:39 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:39.060985 | orchestrator | 2026-01-06 00:52:39 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:42.093438 | orchestrator | 2026-01-06 00:52:42 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:42.096141 | orchestrator | 2026-01-06 00:52:42 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:42.098214 | orchestrator | 2026-01-06 00:52:42 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:42.098258 | orchestrator | 2026-01-06 00:52:42 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:45.122779 | orchestrator | 2026-01-06 00:52:45 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:45.123087 | orchestrator | 2026-01-06 00:52:45 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:45.124093 | orchestrator | 2026-01-06 00:52:45 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:45.124111 | orchestrator | 2026-01-06 00:52:45 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:48.161485 | orchestrator | 2026-01-06 00:52:48 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:48.162138 | orchestrator | 2026-01-06 00:52:48 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:48.163732 | orchestrator | 2026-01-06 00:52:48 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:48.163800 | orchestrator | 2026-01-06 00:52:48 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:51.204075 | orchestrator | 2026-01-06 00:52:51 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:51.205982 | orchestrator | 2026-01-06 00:52:51 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:51.207804 | orchestrator | 2026-01-06 00:52:51 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:51.207850 | orchestrator | 2026-01-06 00:52:51 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:54.239588 | orchestrator | 2026-01-06 00:52:54 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:54.239983 | orchestrator | 2026-01-06 00:52:54 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:54.242221 | orchestrator | 2026-01-06 00:52:54 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:54.242269 | orchestrator | 2026-01-06 00:52:54 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:52:57.290112 | orchestrator | 2026-01-06 00:52:57 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:52:57.292058 | orchestrator | 2026-01-06 00:52:57 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state STARTED
2026-01-06 00:52:57.294808 | orchestrator | 2026-01-06 00:52:57 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED
2026-01-06 00:52:57.295081 | orchestrator | 2026-01-06 00:52:57 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:53:00.342825 | orchestrator | 2026-01-06 00:53:00 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED
2026-01-06 00:53:00.347088 | orchestrator | 2026-01-06 00:53:00 | INFO  | Task b6b044b2-8593-47d8-a46e-06634610c596 is in state SUCCESS
2026-01-06 00:53:00.349955 | orchestrator |
2026-01-06 00:53:00.350005 | orchestrator |
2026-01-06 00:53:00.350051 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-01-06 00:53:00.350063 | orchestrator |
2026-01-06 00:53:00.350072 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-01-06 00:53:00.350080 | orchestrator | Tuesday 06 January 2026 00:49:27 +0000 (0:00:00.139) 0:00:00.139 *******
2026-01-06 00:53:00.350088 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:53:00.350097 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:53:00.350105 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:53:00.350112 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:53:00.350120 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:53:00.350127 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:53:00.350135 | orchestrator |
2026-01-06 00:53:00.350142 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-01-06 00:53:00.350150 | orchestrator | Tuesday 06 January 2026 00:49:27 +0000 (0:00:00.559) 0:00:00.698 *******
2026-01-06 00:53:00.350158 | orchestrator | ok: [testbed-node-0] => (item=enable_ovn_True)
2026-01-06 00:53:00.350166 | orchestrator | ok: [testbed-node-1] => (item=enable_ovn_True)
2026-01-06 00:53:00.350173 | orchestrator | ok: [testbed-node-2] => (item=enable_ovn_True)
2026-01-06 00:53:00.350257 | orchestrator | ok: [testbed-node-3] => (item=enable_ovn_True)
2026-01-06 00:53:00.350266 | orchestrator | ok: [testbed-node-5] => (item=enable_ovn_True)
2026-01-06 00:53:00.350273 | orchestrator | ok: [testbed-node-4] => (item=enable_ovn_True)
2026-01-06 00:53:00.350280 | orchestrator |
2026-01-06 00:53:00.350288 | orchestrator | PLAY [Apply role ovn-controller] ***********************************************
2026-01-06 00:53:00.350295 | orchestrator |
2026-01-06 00:53:00.350303 | orchestrator | TASK [ovn-controller : include_tasks] ******************************************
2026-01-06 00:53:00.350310 | orchestrator | Tuesday 06 January 2026 00:49:28 +0000 (0:00:00.833) 0:00:01.532 *******
2026-01-06 00:53:00.350319 | orchestrator | included: /ansible/roles/ovn-controller/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:53:00.350328 | orchestrator |
2026-01-06 00:53:00.350335 | orchestrator | TASK [ovn-controller : Ensuring config directories exist] **********************
2026-01-06 00:53:00.350342 | orchestrator | Tuesday 06 January 2026 00:49:29 +0000 (0:00:01.093) 0:00:02.626 *******
2026-01-06 00:53:00.350366 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350377 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350385 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350392 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350405 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350420 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350437 | orchestrator |
2026-01-06 00:53:00.350465 | orchestrator | TASK [ovn-controller : Copying over config.json files for services] ************
2026-01-06 00:53:00.350478 | orchestrator | Tuesday 06 January 2026 00:49:31 +0000 (0:00:01.874) 0:00:04.500 *******
2026-01-06 00:53:00.350501 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350514 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350527 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350545 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350559 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350571 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350583 | orchestrator |
2026-01-06 00:53:00.350595 | orchestrator | TASK [ovn-controller : Ensuring systemd override directory exists] *************
2026-01-06 00:53:00.350609 | orchestrator | Tuesday 06 January 2026 00:49:33 +0000 (0:00:02.427) 0:00:06.928 *******
2026-01-06 00:53:00.350621 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350634 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350655 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350678 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350691 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350705 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350718 | orchestrator |
2026-01-06 00:53:00.350732 | orchestrator | TASK [ovn-controller : Copying over systemd override] **************************
2026-01-06 00:53:00.350744 | orchestrator | Tuesday 06 January 2026 00:49:34 +0000 (0:00:01.085) 0:00:08.014 *******
2026-01-06 00:53:00.350764 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350778 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350792 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350927 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350937 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350953 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350961 | orchestrator |
2026-01-06 00:53:00.350975 | orchestrator | TASK [service-check-containers : ovn_controller | Check containers] ************
2026-01-06 00:53:00.350983 | orchestrator | Tuesday 06 January 2026 00:49:36 +0000 (0:00:02.104) 0:00:10.118 *******
2026-01-06 00:53:00.350991 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.350998 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.351006 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.351017 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.351025 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.351032 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.351040 | orchestrator |
2026-01-06 00:53:00.351047 | orchestrator | TASK [service-check-containers : ovn_controller | Notify handlers to restart containers] ***
2026-01-06 00:53:00.351055 | orchestrator | Tuesday 06 January 2026 00:49:38 +0000 (0:00:01.710) 0:00:11.828 *******
2026-01-06 00:53:00.351063 | orchestrator | changed: [testbed-node-0] => {
2026-01-06 00:53:00.351070 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 00:53:00.351078 | orchestrator | }
2026-01-06 00:53:00.351085 | orchestrator | changed: [testbed-node-1] => {
2026-01-06 00:53:00.351092 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 00:53:00.351104 | orchestrator | }
2026-01-06 00:53:00.351111 | orchestrator | changed: [testbed-node-2] => {
2026-01-06 00:53:00.351119 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 00:53:00.351126 | orchestrator | }
2026-01-06 00:53:00.351133 | orchestrator | changed: [testbed-node-3] => {
2026-01-06 00:53:00.351140 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 00:53:00.351148 | orchestrator | }
2026-01-06 00:53:00.351155 | orchestrator | changed: [testbed-node-4] => {
2026-01-06 00:53:00.351162 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 00:53:00.351169 | orchestrator | }
2026-01-06 00:53:00.351176 | orchestrator | changed: [testbed-node-5] => {
2026-01-06 00:53:00.351183 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 00:53:00.351191 | orchestrator | }
2026-01-06 00:53:00.351198 | orchestrator |
2026-01-06 00:53:00.351205 | orchestrator | TASK [service-check-containers : Include tasks] ********************************
2026-01-06 00:53:00.351212 | orchestrator | Tuesday 06 January 2026 00:49:39 +0000 (0:00:00.542) 0:00:12.371 *******
2026-01-06 00:53:00.351220 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.351227 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:53:00.351239 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.351247 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.351255 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:53:00.351262 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:53:00.351270 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.351281 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.351289 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:53:00.351296 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:53:00.351303 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2025.1', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.351323 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:53:00.351331 | orchestrator |
2026-01-06 00:53:00.351338 | orchestrator | TASK [ovn-controller : Create br-int bridge on OpenvSwitch] ********************
2026-01-06 00:53:00.351345 | orchestrator | Tuesday 06 January 2026 00:49:40 +0000 (0:00:01.192) 0:00:13.563 *******
2026-01-06 00:53:00.351352 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:53:00.351360 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:53:00.351367 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:53:00.351374 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:53:00.351381 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:53:00.351388 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:53:00.351395 | orchestrator |
2026-01-06 00:53:00.351403 | orchestrator | TASK [ovn-controller : Configure OVN in OVSDB] *********************************
2026-01-06 00:53:00.351410 | orchestrator | Tuesday 06 January 2026 00:49:43 +0000 (0:00:02.855) 0:00:16.418 *******
2026-01-06 00:53:00.351417 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.11'})
2026-01-06 00:53:00.351424 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.12'})
2026-01-06
00:53:00.351431 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.13'}) 2026-01-06 00:53:00.351438 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.10'}) 2026-01-06 00:53:00.351445 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.15'}) 2026-01-06 00:53:00.351452 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.14'}) 2026-01-06 00:53:00.351460 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2026-01-06 00:53:00.351467 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2026-01-06 00:53:00.351474 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2026-01-06 00:53:00.351481 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2026-01-06 00:53:00.351488 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2026-01-06 00:53:00.351495 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2026-01-06 00:53:00.351508 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:16641,tcp:192.168.16.11:16641,tcp:192.168.16.12:16641'}) 2026-01-06 00:53:00.351516 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:16641,tcp:192.168.16.11:16641,tcp:192.168.16.12:16641'}) 2026-01-06 00:53:00.351523 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:16641,tcp:192.168.16.11:16641,tcp:192.168.16.12:16641'}) 2026-01-06 00:53:00.351531 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-remote', 'value': 
'tcp:192.168.16.10:16641,tcp:192.168.16.11:16641,tcp:192.168.16.12:16641'}) 2026-01-06 00:53:00.351538 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:16641,tcp:192.168.16.11:16641,tcp:192.168.16.12:16641'}) 2026-01-06 00:53:00.351545 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:16641,tcp:192.168.16.11:16641,tcp:192.168.16.12:16641'}) 2026-01-06 00:53:00.351552 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2026-01-06 00:53:00.351561 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2026-01-06 00:53:00.351568 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2026-01-06 00:53:00.351575 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2026-01-06 00:53:00.351589 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2026-01-06 00:53:00.351596 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2026-01-06 00:53:00.351604 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2026-01-06 00:53:00.351611 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2026-01-06 00:53:00.351622 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2026-01-06 00:53:00.351629 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2026-01-06 00:53:00.351636 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2026-01-06 00:53:00.351644 | orchestrator | changed: 
[testbed-node-5] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2026-01-06 00:53:00.351651 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-monitor-all', 'value': False}) 2026-01-06 00:53:00.351658 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-monitor-all', 'value': False}) 2026-01-06 00:53:00.351665 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-monitor-all', 'value': False}) 2026-01-06 00:53:00.351673 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-monitor-all', 'value': False}) 2026-01-06 00:53:00.351680 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-monitor-all', 'value': False}) 2026-01-06 00:53:00.351687 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-monitor-all', 'value': False}) 2026-01-06 00:53:00.351694 | orchestrator | ok: [testbed-node-3] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'absent'}) 2026-01-06 00:53:00.351702 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'present'}) 2026-01-06 00:53:00.351709 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'present'}) 2026-01-06 00:53:00.351716 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'present'}) 2026-01-06 00:53:00.351732 | orchestrator | ok: [testbed-node-4] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'absent'}) 2026-01-06 00:53:00.351740 | orchestrator | ok: [testbed-node-5] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'absent'}) 2026-01-06 00:53:00.351747 | orchestrator | ok: [testbed-node-0] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:52:c1:40', 'state': 'absent'}) 2026-01-06 00:53:00.351756 | orchestrator | changed: 
[testbed-node-3] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:89:18:56', 'state': 'present'}) 2026-01-06 00:53:00.351764 | orchestrator | ok: [testbed-node-2] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:29:4a:9b', 'state': 'absent'}) 2026-01-06 00:53:00.351776 | orchestrator | ok: [testbed-node-1] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:33:12:50', 'state': 'absent'}) 2026-01-06 00:53:00.351788 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:71:3a:c3', 'state': 'present'}) 2026-01-06 00:53:00.351806 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:2f:fa:44', 'state': 'present'}) 2026-01-06 00:53:00.351818 | orchestrator | ok: [testbed-node-3] => (item={'name': 'ovn-cms-options', 'value': '', 'state': 'absent'}) 2026-01-06 00:53:00.351829 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-cms-options', 'value': 'enable-chassis-as-gw,availability-zones=nova', 'state': 'present'}) 2026-01-06 00:53:00.351841 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-cms-options', 'value': 'enable-chassis-as-gw,availability-zones=nova', 'state': 'present'}) 2026-01-06 00:53:00.351860 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-cms-options', 'value': 'enable-chassis-as-gw,availability-zones=nova', 'state': 'present'}) 2026-01-06 00:53:00.351896 | orchestrator | ok: [testbed-node-5] => (item={'name': 'ovn-cms-options', 'value': '', 'state': 'absent'}) 2026-01-06 00:53:00.351911 | orchestrator | ok: [testbed-node-4] => (item={'name': 'ovn-cms-options', 'value': '', 'state': 'absent'}) 2026-01-06 00:53:00.351923 | orchestrator | 2026-01-06 00:53:00.351934 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2026-01-06 00:53:00.351946 | orchestrator | Tuesday 06 
January 2026 00:50:04 +0000 (0:00:20.795) 0:00:37.214 ******* 2026-01-06 00:53:00.351959 | orchestrator | 2026-01-06 00:53:00.351971 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2026-01-06 00:53:00.351983 | orchestrator | Tuesday 06 January 2026 00:50:04 +0000 (0:00:00.068) 0:00:37.282 ******* 2026-01-06 00:53:00.351996 | orchestrator | 2026-01-06 00:53:00.352007 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2026-01-06 00:53:00.352019 | orchestrator | Tuesday 06 January 2026 00:50:04 +0000 (0:00:00.065) 0:00:37.347 ******* 2026-01-06 00:53:00.352031 | orchestrator | 2026-01-06 00:53:00.352043 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2026-01-06 00:53:00.352055 | orchestrator | Tuesday 06 January 2026 00:50:04 +0000 (0:00:00.061) 0:00:37.409 ******* 2026-01-06 00:53:00.352068 | orchestrator | 2026-01-06 00:53:00.352080 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2026-01-06 00:53:00.352092 | orchestrator | Tuesday 06 January 2026 00:50:04 +0000 (0:00:00.065) 0:00:37.474 ******* 2026-01-06 00:53:00.352102 | orchestrator | 2026-01-06 00:53:00.352115 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2026-01-06 00:53:00.352128 | orchestrator | Tuesday 06 January 2026 00:50:04 +0000 (0:00:00.063) 0:00:37.537 ******* 2026-01-06 00:53:00.352140 | orchestrator | 2026-01-06 00:53:00.352152 | orchestrator | RUNNING HANDLER [ovn-controller : Reload systemd config] *********************** 2026-01-06 00:53:00.352164 | orchestrator | Tuesday 06 January 2026 00:50:04 +0000 (0:00:00.068) 0:00:37.605 ******* 2026-01-06 00:53:00.352175 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:53:00.352186 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:53:00.352199 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:53:00.352211 
| orchestrator | ok: [testbed-node-2] 2026-01-06 00:53:00.352223 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:53:00.352235 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:53:00.352248 | orchestrator | 2026-01-06 00:53:00.352260 | orchestrator | RUNNING HANDLER [ovn-controller : Restart ovn-controller container] ************ 2026-01-06 00:53:00.352273 | orchestrator | Tuesday 06 January 2026 00:50:06 +0000 (0:00:02.190) 0:00:39.796 ******* 2026-01-06 00:53:00.352285 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:53:00.352298 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:53:00.352310 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:53:00.352321 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:53:00.352329 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:53:00.352336 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:53:00.352343 | orchestrator | 2026-01-06 00:53:00.352351 | orchestrator | PLAY [Apply role ovn-db] ******************************************************* 2026-01-06 00:53:00.352358 | orchestrator | 2026-01-06 00:53:00.352365 | orchestrator | TASK [ovn-db : include_tasks] ************************************************** 2026-01-06 00:53:00.352372 | orchestrator | Tuesday 06 January 2026 00:50:15 +0000 (0:00:08.685) 0:00:48.481 ******* 2026-01-06 00:53:00.352380 | orchestrator | included: /ansible/roles/ovn-db/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:53:00.352387 | orchestrator | 2026-01-06 00:53:00.352394 | orchestrator | TASK [ovn-db : include_tasks] ************************************************** 2026-01-06 00:53:00.352401 | orchestrator | Tuesday 06 January 2026 00:50:15 +0000 (0:00:00.596) 0:00:49.077 ******* 2026-01-06 00:53:00.352416 | orchestrator | included: /ansible/roles/ovn-db/tasks/lookup_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:53:00.352423 | orchestrator | 2026-01-06 00:53:00.352431 | orchestrator | 
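The "Configure OVN in OVSDB" task above writes per-chassis `external_ids` into the local Open vSwitch database: the encapsulation IP and type, the southbound `ovn-remote` connection string, probe intervals, and (on gateway chassis only) bridge and CMS mappings. A minimal sketch of how those values could be assembled, assuming the controller IPs and relay port 16641 shown in the log; the helper names are hypothetical, not kolla-ansible's own:

```python
# Sketch (hypothetical helpers): assemble the OVN external_ids values that the
# "Configure OVN in OVSDB" task applies on each chassis.

def ovn_remote(controller_ips, port=16641):
    """Build the comma-separated southbound connection string,
    e.g. tcp:192.168.16.10:16641,tcp:192.168.16.11:16641,..."""
    return ",".join(f"tcp:{ip}:{port}" for ip in controller_ips)

def chassis_external_ids(encap_ip, controller_ips, bridge_mappings=None):
    """Return the external_ids key/value pairs for one chassis."""
    ids = {
        "ovn-encap-ip": encap_ip,              # tunnel endpoint of this node
        "ovn-encap-type": "geneve",            # tunnel type used in this run
        "ovn-remote": ovn_remote(controller_ips),
        "ovn-remote-probe-interval": "60000",  # milliseconds
        "ovn-openflow-probe-interval": "60",   # seconds
        "ovn-monitor-all": "false",
    }
    if bridge_mappings:                        # set only on gateway chassis
        ids["ovn-bridge-mappings"] = bridge_mappings
    return ids

controllers = ["192.168.16.10", "192.168.16.11", "192.168.16.12"]
ids = chassis_external_ids("192.168.16.10", controllers, "physnet1:br-ex")
print(ids["ovn-remote"])
```

This mirrors why the log shows `state: present` for `ovn-bridge-mappings` and `ovn-cms-options` only on testbed-node-0/1/2 (the gateway chassis) and `state: absent` on the remaining compute nodes.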
TASK [ovn-db : Checking for any existing OVN DB container volumes] ************* 2026-01-06 00:53:00.352438 | orchestrator | Tuesday 06 January 2026 00:50:16 +0000 (0:00:00.930) 0:00:50.008 ******* 2026-01-06 00:53:00.352445 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:53:00.352452 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:53:00.352459 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:53:00.352467 | orchestrator | 2026-01-06 00:53:00.352474 | orchestrator | TASK [ovn-db : Divide hosts by their OVN NB volume availability] *************** 2026-01-06 00:53:00.352481 | orchestrator | Tuesday 06 January 2026 00:50:18 +0000 (0:00:01.156) 0:00:51.164 ******* 2026-01-06 00:53:00.352488 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:53:00.352495 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:53:00.352503 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:53:00.352510 | orchestrator | 2026-01-06 00:53:00.352517 | orchestrator | TASK [ovn-db : Divide hosts by their OVN SB volume availability] *************** 2026-01-06 00:53:00.352524 | orchestrator | Tuesday 06 January 2026 00:50:18 +0000 (0:00:00.453) 0:00:51.617 ******* 2026-01-06 00:53:00.352531 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:53:00.352539 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:53:00.352546 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:53:00.352553 | orchestrator | 2026-01-06 00:53:00.352560 | orchestrator | TASK [ovn-db : Establish whether the OVN NB cluster has already existed] ******* 2026-01-06 00:53:00.352575 | orchestrator | Tuesday 06 January 2026 00:50:19 +0000 (0:00:00.662) 0:00:52.280 ******* 2026-01-06 00:53:00.352582 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:53:00.352590 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:53:00.352597 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:53:00.352604 | orchestrator | 2026-01-06 00:53:00.352611 | orchestrator | TASK [ovn-db : Establish whether the OVN SB cluster has already existed] 
******* 2026-01-06 00:53:00.352619 | orchestrator | Tuesday 06 January 2026 00:50:19 +0000 (0:00:00.335) 0:00:52.616 ******* 2026-01-06 00:53:00.352626 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:53:00.352633 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:53:00.352640 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:53:00.352648 | orchestrator | 2026-01-06 00:53:00.352655 | orchestrator | TASK [ovn-db : Check if running on all OVN NB DB hosts] ************************ 2026-01-06 00:53:00.352662 | orchestrator | Tuesday 06 January 2026 00:50:19 +0000 (0:00:00.345) 0:00:52.961 ******* 2026-01-06 00:53:00.352669 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.352677 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.352684 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.352691 | orchestrator | 2026-01-06 00:53:00.352698 | orchestrator | TASK [ovn-db : Check OVN NB service port liveness] ***************************** 2026-01-06 00:53:00.352706 | orchestrator | Tuesday 06 January 2026 00:50:20 +0000 (0:00:00.321) 0:00:53.283 ******* 2026-01-06 00:53:00.352713 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.352720 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.352727 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.352735 | orchestrator | 2026-01-06 00:53:00.352742 | orchestrator | TASK [ovn-db : Divide hosts by their OVN NB service port liveness] ************* 2026-01-06 00:53:00.352749 | orchestrator | Tuesday 06 January 2026 00:50:20 +0000 (0:00:00.637) 0:00:53.921 ******* 2026-01-06 00:53:00.352756 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.352764 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.352771 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.352778 | orchestrator | 2026-01-06 00:53:00.352785 | orchestrator | TASK [ovn-db : Get OVN NB database information] ******************************** 2026-01-06 
00:53:00.352793 | orchestrator | Tuesday 06 January 2026 00:50:21 +0000 (0:00:00.401) 0:00:54.323 ******* 2026-01-06 00:53:00.352800 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.352807 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.352819 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.352827 | orchestrator | 2026-01-06 00:53:00.352834 | orchestrator | TASK [ovn-db : Divide hosts by their OVN NB leader/follower role] ************** 2026-01-06 00:53:00.352841 | orchestrator | Tuesday 06 January 2026 00:50:21 +0000 (0:00:00.376) 0:00:54.700 ******* 2026-01-06 00:53:00.352848 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.352856 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.352863 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.352870 | orchestrator | 2026-01-06 00:53:00.353015 | orchestrator | TASK [ovn-db : Fail on existing OVN NB cluster with no leader] ***************** 2026-01-06 00:53:00.353028 | orchestrator | Tuesday 06 January 2026 00:50:21 +0000 (0:00:00.301) 0:00:55.001 ******* 2026-01-06 00:53:00.353036 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.353043 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.353051 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.353058 | orchestrator | 2026-01-06 00:53:00.353065 | orchestrator | TASK [ovn-db : Check if running on all OVN SB DB hosts] ************************ 2026-01-06 00:53:00.353072 | orchestrator | Tuesday 06 January 2026 00:50:22 +0000 (0:00:00.604) 0:00:55.605 ******* 2026-01-06 00:53:00.353080 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.353087 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.353094 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.353101 | orchestrator | 2026-01-06 00:53:00.353109 | orchestrator | TASK [ovn-db : Check OVN SB service port liveness] ***************************** 2026-01-06 
00:53:00.353116 | orchestrator | Tuesday 06 January 2026 00:50:22 +0000 (0:00:00.408) 0:00:56.014 ******* 2026-01-06 00:53:00.353123 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.353130 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.353138 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.353145 | orchestrator | 2026-01-06 00:53:00.353153 | orchestrator | TASK [ovn-db : Divide hosts by their OVN SB service port liveness] ************* 2026-01-06 00:53:00.353160 | orchestrator | Tuesday 06 January 2026 00:50:23 +0000 (0:00:00.312) 0:00:56.326 ******* 2026-01-06 00:53:00.353167 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.353174 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.353182 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.353189 | orchestrator | 2026-01-06 00:53:00.353196 | orchestrator | TASK [ovn-db : Get OVN SB database information] ******************************** 2026-01-06 00:53:00.353204 | orchestrator | Tuesday 06 January 2026 00:50:23 +0000 (0:00:00.308) 0:00:56.634 ******* 2026-01-06 00:53:00.353211 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.353218 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.353226 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.353233 | orchestrator | 2026-01-06 00:53:00.353240 | orchestrator | TASK [ovn-db : Divide hosts by their OVN SB leader/follower role] ************** 2026-01-06 00:53:00.353247 | orchestrator | Tuesday 06 January 2026 00:50:23 +0000 (0:00:00.325) 0:00:56.960 ******* 2026-01-06 00:53:00.353255 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.353266 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.353278 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.353288 | orchestrator | 2026-01-06 00:53:00.353298 | orchestrator | TASK [ovn-db : Fail on existing OVN SB cluster with no leader] ***************** 2026-01-06 
00:53:00.353308 | orchestrator | Tuesday 06 January 2026 00:50:24 +0000 (0:00:00.691) 0:00:57.651 ******* 2026-01-06 00:53:00.353317 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.353330 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.353347 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.353358 | orchestrator | 2026-01-06 00:53:00.353369 | orchestrator | TASK [ovn-db : include_tasks] ************************************************** 2026-01-06 00:53:00.353380 | orchestrator | Tuesday 06 January 2026 00:50:24 +0000 (0:00:00.364) 0:00:58.016 ******* 2026-01-06 00:53:00.353391 | orchestrator | included: /ansible/roles/ovn-db/tasks/bootstrap-initial.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:53:00.353403 | orchestrator | 2026-01-06 00:53:00.354089 | orchestrator | TASK [ovn-db : Set bootstrap args fact for NB (new cluster)] ******************* 2026-01-06 00:53:00.354118 | orchestrator | Tuesday 06 January 2026 00:50:25 +0000 (0:00:00.659) 0:00:58.676 ******* 2026-01-06 00:53:00.354126 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:53:00.354132 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:53:00.354139 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:53:00.354146 | orchestrator | 2026-01-06 00:53:00.354154 | orchestrator | TASK [ovn-db : Set bootstrap args fact for SB (new cluster)] ******************* 2026-01-06 00:53:00.354161 | orchestrator | Tuesday 06 January 2026 00:50:26 +0000 (0:00:00.876) 0:00:59.552 ******* 2026-01-06 00:53:00.354167 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:53:00.354174 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:53:00.354181 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:53:00.354188 | orchestrator | 2026-01-06 00:53:00.354195 | orchestrator | TASK [ovn-db : Check NB cluster status] **************************************** 2026-01-06 00:53:00.354202 | orchestrator | Tuesday 06 January 2026 00:50:26 +0000 (0:00:00.461) 0:01:00.014 
******* 2026-01-06 00:53:00.354209 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.354216 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.354222 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.354229 | orchestrator | 2026-01-06 00:53:00.354236 | orchestrator | TASK [ovn-db : Check SB cluster status] **************************************** 2026-01-06 00:53:00.354242 | orchestrator | Tuesday 06 January 2026 00:50:27 +0000 (0:00:00.392) 0:01:00.407 ******* 2026-01-06 00:53:00.354249 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.354256 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.354262 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.354269 | orchestrator | 2026-01-06 00:53:00.354276 | orchestrator | TASK [ovn-db : Remove an old node with the same ip address as the new node in NB DB] *** 2026-01-06 00:53:00.354283 | orchestrator | Tuesday 06 January 2026 00:50:27 +0000 (0:00:00.372) 0:01:00.779 ******* 2026-01-06 00:53:00.354290 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.354296 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.354303 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.354310 | orchestrator | 2026-01-06 00:53:00.354317 | orchestrator | TASK [ovn-db : Remove an old node with the same ip address as the new node in SB DB] *** 2026-01-06 00:53:00.354341 | orchestrator | Tuesday 06 January 2026 00:50:28 +0000 (0:00:00.627) 0:01:01.406 ******* 2026-01-06 00:53:00.354379 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.354459 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.354473 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.354482 | orchestrator | 2026-01-06 00:53:00.354492 | orchestrator | TASK [ovn-db : Set bootstrap args fact for NB (new member)] ******************** 2026-01-06 00:53:00.354505 | orchestrator | Tuesday 06 January 2026 00:50:28 +0000 (0:00:00.368) 
0:01:01.775 ******* 2026-01-06 00:53:00.354516 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.354527 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.354568 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.354576 | orchestrator | 2026-01-06 00:53:00.354582 | orchestrator | TASK [ovn-db : Set bootstrap args fact for SB (new member)] ******************** 2026-01-06 00:53:00.354589 | orchestrator | Tuesday 06 January 2026 00:50:28 +0000 (0:00:00.334) 0:01:02.110 ******* 2026-01-06 00:53:00.354595 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.354602 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.354609 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.354615 | orchestrator | 2026-01-06 00:53:00.354622 | orchestrator | TASK [ovn-db : Ensuring config directories exist] ****************************** 2026-01-06 00:53:00.354630 | orchestrator | Tuesday 06 January 2026 00:50:29 +0000 (0:00:00.345) 0:01:02.455 ******* 2026-01-06 00:53:00.354641 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354665 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 
'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354674 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354694 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354703 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354711 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 
'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354723 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354733 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.354748 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 
2026-01-06 00:53:00.354757 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354765 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.354781 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.354789 | orchestrator | 2026-01-06 00:53:00.354797 | orchestrator | TASK [ovn-db : Copying over config.json files for services] ******************** 2026-01-06 00:53:00.354805 | orchestrator | Tuesday 06 January 2026 00:50:32 +0000 (0:00:02.791) 0:01:05.247 ******* 2026-01-06 00:53:00.354813 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 
'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354822 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354834 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354843 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354863 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354891 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354903 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354917 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 
'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.354926 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354935 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.354949 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.354962 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 
'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.354969 | orchestrator | 2026-01-06 00:53:00.354975 | orchestrator | TASK [ovn-db : Ensure configuration for relays exists] ************************* 2026-01-06 00:53:00.354982 | orchestrator | Tuesday 06 January 2026 00:50:38 +0000 (0:00:06.129) 0:01:11.377 ******* 2026-01-06 00:53:00.354990 | orchestrator | included: /ansible/roles/ovn-db/tasks/config-relay.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=1) 2026-01-06 00:53:00.354996 | orchestrator | 2026-01-06 00:53:00.355003 | orchestrator | TASK [ovn-db : Ensuring config directories exist for OVN relay containers] ***** 2026-01-06 00:53:00.355013 | orchestrator | Tuesday 06 January 2026 00:50:38 +0000 (0:00:00.665) 0:01:12.042 ******* 2026-01-06 00:53:00.355027 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:53:00.355043 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:53:00.355054 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:53:00.355064 | orchestrator | 2026-01-06 00:53:00.355075 | orchestrator | TASK [ovn-db : Copying over config.json files for OVN relay services] ********** 2026-01-06 00:53:00.355085 | orchestrator | Tuesday 06 January 2026 00:50:39 +0000 (0:00:00.948) 0:01:12.991 ******* 2026-01-06 00:53:00.355095 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:53:00.355106 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:53:00.355116 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:53:00.355126 | orchestrator | 2026-01-06 00:53:00.355137 | orchestrator | TASK [ovn-db : Generate config files for OVN relay services] ******************* 2026-01-06 00:53:00.355148 | orchestrator | Tuesday 06 
January 2026 00:50:41 +0000 (0:00:01.654) 0:01:14.646 ******* 2026-01-06 00:53:00.355158 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:53:00.355168 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:53:00.355178 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:53:00.355190 | orchestrator | 2026-01-06 00:53:00.355200 | orchestrator | TASK [service-check-containers : ovn_db | Check containers] ******************** 2026-01-06 00:53:00.355210 | orchestrator | Tuesday 06 January 2026 00:50:43 +0000 (0:00:01.723) 0:01:16.369 ******* 2026-01-06 00:53:00.355230 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.355241 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.355252 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 
'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.355270 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.355286 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.355298 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 
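The `OVN_NB_DB` and `OVN_SB_DB` environment strings repeated in every item above are simply the three controller IPs joined with the northbound (6641) and southbound (6642) ports. A sketch of that assembly (an assumed helper for illustration, not the actual kolla-ansible template):

```python
# Controller IPs and OVN DB ports as they appear in the deployment log.
NODES = ["192.168.16.10", "192.168.16.11", "192.168.16.12"]
OVN_NB_PORT = 6641  # OVN_Northbound database
OVN_SB_PORT = 6642  # OVN_Southbound database

def ovn_db_connection(nodes, port):
    """Build the comma-separated tcp connection string OVN clients expect."""
    return ",".join(f"tcp:{ip}:{port}" for ip in nodes)

nb = ovn_db_connection(NODES, OVN_NB_PORT)
sb = ovn_db_connection(NODES, OVN_SB_PORT)
```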
2026-01-06 00:53:00.355309 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.355321 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.355339 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.355351 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.355363 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.355382 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.355394 | orchestrator | 2026-01-06 00:53:00.355405 | orchestrator | TASK [service-check-containers : ovn_db | Notify handlers to restart containers] *** 2026-01-06 00:53:00.355417 | orchestrator | Tuesday 06 January 2026 00:50:48 +0000 (0:00:05.557) 0:01:21.927 ******* 2026-01-06 00:53:00.355433 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 00:53:00.355446 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:53:00.355457 | orchestrator | } 2026-01-06 00:53:00.355469 | orchestrator | changed: [testbed-node-1] => { 2026-01-06 00:53:00.355479 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:53:00.355491 | orchestrator | } 2026-01-06 00:53:00.355508 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 00:53:00.355520 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:53:00.355531 | orchestrator | } 
2026-01-06 00:53:00.355541 | orchestrator | 2026-01-06 00:53:00.355552 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-01-06 00:53:00.355562 | orchestrator | Tuesday 06 January 2026 00:50:49 +0000 (0:00:00.436) 0:01:22.364 ******* 2026-01-06 00:53:00.355573 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.355585 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.355597 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  
2026-01-06 00:53:00.355627 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.355648 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.355656 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.355667 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 
'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.355674 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.355681 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:53:00.355688 | orchestrator | included: /ansible/roles/service-check-containers/tasks/iterated.yml for testbed-node-0, testbed-node-2, testbed-node-1 => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 00:53:00.355695 | 
orchestrator | 2026-01-06 00:53:00.355702 | orchestrator | TASK [service-check-containers : ovn_db | Check containers with iteration] ***** 2026-01-06 00:53:00.355709 | orchestrator | Tuesday 06 January 2026 00:50:52 +0000 (0:00:02.993) 0:01:25.357 ******* 2026-01-06 00:53:00.355716 | orchestrator | changed: [testbed-node-0] => (item=[1]) 2026-01-06 00:53:00.355723 | orchestrator | changed: [testbed-node-1] => (item=[1]) 2026-01-06 00:53:00.355730 | orchestrator | changed: [testbed-node-2] => (item=[1]) 2026-01-06 00:53:00.355736 | orchestrator | 2026-01-06 00:53:00.355743 | orchestrator | TASK [service-check-containers : ovn_db | Notify handlers to restart containers] *** 2026-01-06 00:53:00.355750 | orchestrator | Tuesday 06 January 2026 00:50:53 +0000 (0:00:01.000) 0:01:26.358 ******* 2026-01-06 00:53:00.355757 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 00:53:00.355769 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:53:00.355776 | orchestrator | } 2026-01-06 00:53:00.355783 | orchestrator | changed: [testbed-node-1] => { 2026-01-06 00:53:00.355790 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:53:00.355796 | orchestrator | } 2026-01-06 00:53:00.355803 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 00:53:00.355810 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:53:00.355821 | orchestrator | } 2026-01-06 00:53:00.355828 | orchestrator | 2026-01-06 00:53:00.355835 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2026-01-06 00:53:00.355841 | orchestrator | Tuesday 06 January 2026 00:50:54 +0000 (0:00:00.806) 0:01:27.164 ******* 2026-01-06 00:53:00.355848 | orchestrator | 2026-01-06 00:53:00.355855 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2026-01-06 00:53:00.355862 | orchestrator | Tuesday 06 January 2026 00:50:54 +0000 (0:00:00.069) 0:01:27.233 ******* 2026-01-06 00:53:00.355868 | orchestrator | 
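After the handlers restart the `ovn_nb_db` and `ovn_sb_db` containers, the role's "Wait for ovn-nb-db" and "Wait for ovn-sb-db" tasks block until the databases accept TCP connections on 6641/6642. A minimal sketch of such a readiness wait (a hypothetical stand-in, not the Ansible `wait_for` module itself):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 60.0) -> bool:
    """Poll until a TCP connect to host:port succeeds or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=2.0):
                return True  # the DB server is accepting connections
        except OSError:
            time.sleep(0.5)  # not up yet; retry
    return False
```

In the actual role these checks run against the API interface address of each node, which is why all three hosts report `ok` only once their local ovsdb-server is listening.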
2026-01-06 00:53:00.355905 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2026-01-06 00:53:00.355912 | orchestrator | Tuesday 06 January 2026 00:50:54 +0000 (0:00:00.068) 0:01:27.302 ******* 2026-01-06 00:53:00.355919 | orchestrator | 2026-01-06 00:53:00.355926 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-nb-db container] ************************* 2026-01-06 00:53:00.355932 | orchestrator | Tuesday 06 January 2026 00:50:54 +0000 (0:00:00.064) 0:01:27.367 ******* 2026-01-06 00:53:00.355939 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:53:00.355946 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:53:00.355953 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:53:00.355960 | orchestrator | 2026-01-06 00:53:00.355967 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-sb-db container] ************************* 2026-01-06 00:53:00.355973 | orchestrator | Tuesday 06 January 2026 00:51:04 +0000 (0:00:10.391) 0:01:37.758 ******* 2026-01-06 00:53:00.355980 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:53:00.355987 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:53:00.355993 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:53:00.356000 | orchestrator | 2026-01-06 00:53:00.356007 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-sb-db-relay container] ******************* 2026-01-06 00:53:00.356014 | orchestrator | Tuesday 06 January 2026 00:51:17 +0000 (0:00:13.110) 0:01:50.869 ******* 2026-01-06 00:53:00.356021 | orchestrator | changed: [testbed-node-0] => (item=1) 2026-01-06 00:53:00.356027 | orchestrator | changed: [testbed-node-1] => (item=1) 2026-01-06 00:53:00.356034 | orchestrator | changed: [testbed-node-2] => (item=1) 2026-01-06 00:53:00.356041 | orchestrator | 2026-01-06 00:53:00.356047 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-northd container] ************************ 2026-01-06 00:53:00.356054 | orchestrator | Tuesday 06 January 
2026 00:51:28 +0000 (0:00:11.160) 0:02:02.029 ******* 2026-01-06 00:53:00.356061 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:53:00.356067 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:53:00.356074 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:53:00.356081 | orchestrator | 2026-01-06 00:53:00.356088 | orchestrator | TASK [ovn-db : Wait for leader election] *************************************** 2026-01-06 00:53:00.356098 | orchestrator | Tuesday 06 January 2026 00:51:42 +0000 (0:00:13.326) 0:02:15.356 ******* 2026-01-06 00:53:00.356105 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:53:00.356112 | orchestrator | 2026-01-06 00:53:00.356119 | orchestrator | TASK [ovn-db : Get OVN_Northbound cluster leader] ****************************** 2026-01-06 00:53:00.356126 | orchestrator | Tuesday 06 January 2026 00:51:42 +0000 (0:00:00.155) 0:02:15.511 ******* 2026-01-06 00:53:00.356132 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:53:00.356139 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:53:00.356146 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:53:00.356153 | orchestrator | 2026-01-06 00:53:00.356160 | orchestrator | TASK [ovn-db : Configure OVN NB connection settings] *************************** 2026-01-06 00:53:00.356166 | orchestrator | Tuesday 06 January 2026 00:51:43 +0000 (0:00:00.877) 0:02:16.389 ******* 2026-01-06 00:53:00.356181 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.356188 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.356194 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:53:00.356201 | orchestrator | 2026-01-06 00:53:00.356208 | orchestrator | TASK [ovn-db : Get OVN_Southbound cluster leader] ****************************** 2026-01-06 00:53:00.356214 | orchestrator | Tuesday 06 January 2026 00:51:43 +0000 (0:00:00.667) 0:02:17.056 ******* 2026-01-06 00:53:00.356221 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:53:00.356228 | orchestrator | ok: 
[testbed-node-1] 2026-01-06 00:53:00.356235 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:53:00.356241 | orchestrator | 2026-01-06 00:53:00.356248 | orchestrator | TASK [ovn-db : Configure OVN SB connection settings] *************************** 2026-01-06 00:53:00.356255 | orchestrator | Tuesday 06 January 2026 00:51:45 +0000 (0:00:01.210) 0:02:18.267 ******* 2026-01-06 00:53:00.356261 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:53:00.356268 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:53:00.356275 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:53:00.356282 | orchestrator | 2026-01-06 00:53:00.356288 | orchestrator | TASK [ovn-db : Wait for ovn-nb-db] ********************************************* 2026-01-06 00:53:00.356295 | orchestrator | Tuesday 06 January 2026 00:51:45 +0000 (0:00:00.563) 0:02:18.830 ******* 2026-01-06 00:53:00.356302 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:53:00.356308 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:53:00.356315 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:53:00.356322 | orchestrator | 2026-01-06 00:53:00.356329 | orchestrator | TASK [ovn-db : Wait for ovn-sb-db] ********************************************* 2026-01-06 00:53:00.356335 | orchestrator | Tuesday 06 January 2026 00:51:46 +0000 (0:00:00.724) 0:02:19.555 ******* 2026-01-06 00:53:00.356342 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:53:00.356349 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:53:00.356356 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:53:00.356363 | orchestrator | 2026-01-06 00:53:00.356370 | orchestrator | TASK [ovn-db : Wait for ovn-sb-db-relay] *************************************** 2026-01-06 00:53:00.356376 | orchestrator | Tuesday 06 January 2026 00:51:47 +0000 (0:00:00.701) 0:02:20.257 ******* 2026-01-06 00:53:00.356383 | orchestrator | ok: [testbed-node-0] => (item=1) 2026-01-06 00:53:00.356390 | orchestrator | ok: [testbed-node-1] => (item=1) 2026-01-06 
2026-01-06 00:53:00.356397 | orchestrator | ok: [testbed-node-2] => (item=1)
2026-01-06 00:53:00.356403 | orchestrator |
2026-01-06 00:53:00.356410 | orchestrator | TASK [ovn-db : Unset bootstrap args fact] **************************************
2026-01-06 00:53:00.356417 | orchestrator | Tuesday 06 January 2026 00:51:48 +0000 (0:00:01.234) 0:02:21.491 *******
2026-01-06 00:53:00.356424 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:53:00.356430 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:53:00.356437 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:53:00.356449 | orchestrator |
2026-01-06 00:53:00.356464 | orchestrator | TASK [ovn-db : Ensuring config directories exist] ******************************
2026-01-06 00:53:00.356486 | orchestrator | Tuesday 06 January 2026 00:51:48 +0000 (0:00:00.332) 0:02:21.823 *******
2026-01-06 00:53:00.356498 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356510 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356531 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356549 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356562 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356574 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356586 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356606 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356619 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356633 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356654 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356677 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356689 | orchestrator |
2026-01-06 00:53:00.356700 | orchestrator | TASK [ovn-db : Copying over config.json files for services] ********************
2026-01-06 00:53:00.356708 | orchestrator | Tuesday 06 January 2026 00:51:51 +0000 (0:00:02.781) 0:02:24.604 *******
2026-01-06 00:53:00.356715 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356722 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356729 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356741 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356749 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356763 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356770 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356780 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356788 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356795 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356802 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356814 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.356822 | orchestrator |
2026-01-06 00:53:00.356828 | orchestrator | TASK [ovn-db : Ensure configuration for relays exists] *************************
2026-01-06 00:53:00.356835 | orchestrator | Tuesday 06 January 2026 00:51:56 +0000 (0:00:05.253) 0:02:29.858 *******
2026-01-06 00:53:00.356848 | orchestrator | included: /ansible/roles/ovn-db/tasks/config-relay.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=1)
2026-01-06 00:53:00.356856 | orchestrator |
2026-01-06 00:53:00.356862 | orchestrator | TASK [ovn-db : Ensuring config directories exist for OVN relay containers] *****
2026-01-06 00:53:00.356869 | orchestrator | Tuesday 06 January 2026 00:51:57 +0000 (0:00:00.937) 0:02:30.796 *******
2026-01-06 00:53:00.356919 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:53:00.356926 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:53:00.356933 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:53:00.356940 | orchestrator |
2026-01-06 00:53:00.356946 | orchestrator | TASK [ovn-db : Copying over config.json files for OVN relay services] **********
2026-01-06 00:53:00.356953 | orchestrator | Tuesday 06 January 2026 00:51:58 +0000 (0:00:00.728) 0:02:31.524 *******
2026-01-06 00:53:00.356960 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:53:00.356967 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:53:00.356973 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:53:00.356980 | orchestrator |
2026-01-06 00:53:00.356987 | orchestrator | TASK [ovn-db : Generate config files for OVN relay services] *******************
2026-01-06 00:53:00.356994 | orchestrator | Tuesday 06 January 2026 00:52:00 +0000 (0:00:01.701) 0:02:33.226 *******
2026-01-06 00:53:00.357000 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:53:00.357007 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:53:00.357014 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:53:00.357021 | orchestrator |
2026-01-06 00:53:00.357027 | orchestrator | TASK [service-check-containers : ovn_db | Check containers] ********************
2026-01-06 00:53:00.357034 | orchestrator | Tuesday 06 January 2026 00:52:02 +0000 (0:00:02.360) 0:02:35.587 *******
2026-01-06 00:53:00.357045 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357053 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357060 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357067 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357074 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357091 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357098 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357105 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357116 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357123 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357130 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357137 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357144 | orchestrator |
2026-01-06 00:53:00.357152 | orchestrator | TASK [service-check-containers : ovn_db | Notify handlers to restart containers] ***
2026-01-06 00:53:00.357163 | orchestrator | Tuesday 06 January 2026 00:52:07 +0000 (0:00:05.277) 0:02:40.864 *******
2026-01-06 00:53:00.357170 | orchestrator | ok: [testbed-node-0] => {
2026-01-06 00:53:00.357177 | orchestrator |     "msg": "Notifying handlers"
2026-01-06 00:53:00.357184 | orchestrator | }
2026-01-06 00:53:00.357190 | orchestrator | changed: [testbed-node-1] => {
2026-01-06 00:53:00.357197 | orchestrator |     "msg": "Notifying handlers"
2026-01-06 00:53:00.357204 | orchestrator | }
2026-01-06 00:53:00.357211 | orchestrator | changed: [testbed-node-2] => {
2026-01-06 00:53:00.357217 | orchestrator |     "msg": "Notifying handlers"
2026-01-06 00:53:00.357224 | orchestrator | }
2026-01-06 00:53:00.357231 | orchestrator |
2026-01-06 00:53:00.357238 | orchestrator | TASK [service-check-containers : Include tasks] ********************************
2026-01-06 00:53:00.357244 | orchestrator | Tuesday 06 January 2026 00:52:08 +0000 (0:00:00.385) 0:02:41.250 *******
2026-01-06 00:53:00.357257 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357265 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357272 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357283 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357290 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357297 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357309 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641', 'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-northd:2025.1', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357317 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'environment': {'OVN_NB_DB': 'tcp:192.168.16.10:6641,tcp:192.168.16.11:6641,tcp:192.168.16.12:6641'}, 'image': 'registry.osism.tech/kolla/ovn-nb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357328 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'environment': {'OVN_SB_DB': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-server:2025.1', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357336 | orchestrator | included: /ansible/roles/service-check-containers/tasks/iterated.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'ovn-sb-db-relay', 'value': {'container_name': 'ovn_sb_db_relay', 'group': 'ovn-sb-db-relay', 'enabled': True, 'environment': {'RELAY_ID': '1'}, 'image': 'registry.osism.tech/kolla/ovn-sb-db-relay:2025.1', 'iterate': True, 'iterate_var': '1', 'volumes': ['/etc/kolla/ovn-sb-db-relay/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:53:00.357342 | orchestrator |
2026-01-06 00:53:00.357349 | orchestrator | TASK [service-check-containers : ovn_db | Check containers with iteration] *****
2026-01-06 00:53:00.357356 | orchestrator | Tuesday 06 January 2026 00:52:10 +0000 (0:00:02.168) 0:02:43.418 *******
2026-01-06 00:53:00.357363 | orchestrator | changed: [testbed-node-0] => (item=[1])
2026-01-06 00:53:00.357370 | orchestrator | changed: [testbed-node-1] => (item=[1])
2026-01-06 00:53:00.357377 | orchestrator | changed: [testbed-node-2] => (item=[1])
2026-01-06 00:53:00.357383 | orchestrator |
2026-01-06 00:53:00.357390 | orchestrator | TASK [service-check-containers : ovn_db | Notify handlers to restart containers] ***
2026-01-06 00:53:00.357397 | orchestrator | Tuesday 06 January 2026 00:52:11 +0000 (0:00:01.385) 0:02:44.804 *******
2026-01-06 00:53:00.357403 | orchestrator | changed: [testbed-node-0] => {
2026-01-06 00:53:00.357410 | orchestrator |     "msg": "Notifying handlers"
2026-01-06 00:53:00.357417 | orchestrator | }
2026-01-06 00:53:00.357424 | orchestrator | changed: [testbed-node-1] => {
2026-01-06 00:53:00.357430 | orchestrator |     "msg": "Notifying handlers"
2026-01-06 00:53:00.357437 | orchestrator | }
2026-01-06 00:53:00.357444 | orchestrator | changed: [testbed-node-2] => {
2026-01-06 00:53:00.357451 | orchestrator |     "msg": "Notifying handlers"
2026-01-06 00:53:00.357457 | orchestrator | }
2026-01-06 00:53:00.357464 | orchestrator |
2026-01-06 00:53:00.357474 | orchestrator | TASK [ovn-db : Flush handlers] *************************************************
2026-01-06 00:53:00.357481 | orchestrator | Tuesday 06 January 2026 00:52:12 +0000 (0:00:00.905) 0:02:45.709 *******
2026-01-06 00:53:00.357488 | orchestrator |
2026-01-06 00:53:00.357495 | orchestrator | TASK [ovn-db : Flush handlers] *************************************************
2026-01-06 00:53:00.357506 | orchestrator | Tuesday 06 January 2026 00:52:12 +0000 (0:00:00.146) 0:02:45.855 *******
2026-01-06 00:53:00.357513 | orchestrator |
2026-01-06 00:53:00.357520 | orchestrator | TASK [ovn-db : Flush handlers] *************************************************
2026-01-06 00:53:00.357526 | orchestrator | Tuesday 06 January 2026 00:52:12 +0000 (0:00:00.144) 0:02:46.000 *******
2026-01-06 00:53:00.357533 | orchestrator |
2026-01-06 00:53:00.357540 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-nb-db container] *************************
2026-01-06 00:53:00.357547 | orchestrator | Tuesday 06 January 2026 00:52:13 +0000 (0:00:00.128) 0:02:46.129 *******
2026-01-06 00:53:00.357554 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:53:00.357560 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:53:00.357567 | orchestrator |
2026-01-06 00:53:00.357574 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-sb-db container] *************************
2026-01-06 00:53:00.357580 | orchestrator | Tuesday 06 January 2026 00:52:25 +0000 (0:00:12.189) 0:02:58.319 *******
2026-01-06 00:53:00.357587 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:53:00.357594 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:53:00.357601 | orchestrator |
2026-01-06 00:53:00.357607 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-sb-db-relay container] *******************
2026-01-06 00:53:00.357614 | orchestrator | Tuesday 06 January 2026 00:52:37 +0000 (0:00:12.536) 0:03:10.855 *******
2026-01-06 00:53:00.357621 | orchestrator | changed: [testbed-node-0] => (item=1)
2026-01-06 00:53:00.357628 | orchestrator | changed: [testbed-node-2] => (item=1)
2026-01-06 00:53:00.357634 | orchestrator | changed: [testbed-node-1] => (item=1)
2026-01-06 00:53:00.357641 | orchestrator |
2026-01-06 00:53:00.357647 | orchestrator | TASK [ovn-db : Wait for leader election] ***************************************
2026-01-06 00:53:00.357654 | orchestrator | Tuesday 06 January 2026 00:52:51 +0000 (0:00:13.832) 0:03:24.687 *******
2026-01-06 00:53:00.357661 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:53:00.357667 | orchestrator |
2026-01-06 00:53:00.357674 | orchestrator | TASK [ovn-db : Get OVN_Northbound cluster leader] ******************************
2026-01-06 00:53:00.357681 | orchestrator | Tuesday 06 January 2026 00:52:51 +0000 (0:00:00.133) 0:03:24.821 *******
2026-01-06 00:53:00.357688 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:53:00.357694 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:53:00.357701 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:53:00.357708 | orchestrator |
2026-01-06 00:53:00.357715 | orchestrator | TASK [ovn-db : Configure OVN NB connection settings] ***************************
2026-01-06 00:53:00.357722 | orchestrator | Tuesday 06 January 2026 00:52:52 +0000 (0:00:00.794) 0:03:25.616 *******
2026-01-06 00:53:00.357728 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:53:00.357735 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:53:00.357742 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:53:00.357749 | orchestrator |
2026-01-06 00:53:00.357755 | orchestrator | TASK [ovn-db : Get OVN_Southbound cluster leader] ******************************
2026-01-06 00:53:00.357762 | orchestrator | Tuesday 06 January 2026 00:52:53 +0000 (0:00:00.947) 0:03:26.563 *******
2026-01-06 00:53:00.357769 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:53:00.357776 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:53:00.357782 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:53:00.357789 | orchestrator |
2026-01-06 00:53:00.357796 | orchestrator | TASK [ovn-db : Configure OVN SB connection settings] ***************************
2026-01-06 00:53:00.357807 | orchestrator | Tuesday 06 January 2026 00:52:54 +0000 (0:00:00.812) 0:03:27.375 *******
2026-01-06 00:53:00.357814 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:53:00.357820 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:53:00.357827 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:53:00.357834 | orchestrator |
2026-01-06 00:53:00.357841 | orchestrator | TASK [ovn-db : Wait for ovn-nb-db] *********************************************
2026-01-06 00:53:00.357847 | orchestrator | Tuesday 06 January 2026 00:52:54 +0000 (0:00:00.701) 0:03:28.077 *******
2026-01-06 00:53:00.357854 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:53:00.357861 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:53:00.357890 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:53:00.357898 | orchestrator |
2026-01-06 00:53:00.357905 | orchestrator | TASK [ovn-db : Wait for ovn-sb-db] *********************************************
2026-01-06 00:53:00.357912 | orchestrator | Tuesday 06 January 2026 00:52:55 +0000 (0:00:00.775) 0:03:28.853 *******
2026-01-06 00:53:00.357919 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:53:00.357926 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:53:00.357932 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:53:00.357939 | orchestrator |
2026-01-06 00:53:00.357946 | orchestrator | TASK [ovn-db : Wait for ovn-sb-db-relay] ***************************************
2026-01-06 00:53:00.357953 | orchestrator | Tuesday 06 January 2026 00:52:56 +0000 (0:00:00.892) 0:03:29.745 *******
2026-01-06 00:53:00.357959 | orchestrator | ok: [testbed-node-0] => (item=1)
2026-01-06 00:53:00.357966 | orchestrator | ok: [testbed-node-1] => (item=1)
2026-01-06 00:53:00.357973 | orchestrator | ok: [testbed-node-2] => (item=1)
2026-01-06 00:53:00.357980 | orchestrator |
2026-01-06 00:53:00.357986 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:53:00.357994 | orchestrator | testbed-node-0 : ok=65  changed=29  unreachable=0 failed=0 skipped=21  rescued=0 ignored=0
2026-01-06 00:53:00.358001 | orchestrator | testbed-node-1 : ok=63  changed=30  unreachable=0 failed=0 skipped=23  rescued=0 ignored=0
2026-01-06 00:53:00.358008 | orchestrator | testbed-node-2 : ok=63  changed=30  unreachable=0 failed=0 skipped=23  rescued=0 ignored=0
2026-01-06 00:53:00.358064 | orchestrator | testbed-node-3 : ok=13  changed=9  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-01-06 00:53:00.358081 | orchestrator | testbed-node-4 : ok=13  changed=9  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-01-06 00:53:00.358094 | orchestrator | testbed-node-5 : ok=13  changed=9  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-01-06 00:53:00.358105 | orchestrator |
2026-01-06 00:53:00.358118 | orchestrator |
2026-01-06 00:53:00.358127 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:53:00.358134 | orchestrator | Tuesday 06 January 2026 00:52:58 +0000 (0:00:01.421) 0:03:31.167 *******
2026-01-06 00:53:00.358141 | orchestrator | ===============================================================================
2026-01-06 00:53:00.358148 | orchestrator | ovn-db : Restart ovn-sb-db container ----------------------------------- 25.65s
2026-01-06 00:53:00.358155 | orchestrator | ovn-db : Restart ovn-sb-db-relay container ----------------------------- 24.99s
2026-01-06 00:53:00.358162 | orchestrator | ovn-db : Restart ovn-nb-db container ----------------------------------- 22.58s
2026-01-06 00:53:00.358168 | orchestrator | ovn-controller : Configure OVN in OVSDB -------------------------------- 20.80s
2026-01-06 00:53:00.358175 | orchestrator | ovn-db : Restart ovn-northd container ---------------------------------- 13.33s
2026-01-06 00:53:00.358182 | orchestrator | ovn-controller : Restart ovn-controller container ----------------------- 8.69s
2026-01-06 00:53:00.358188 | orchestrator | ovn-db : Copying over config.json files for services -------------------- 6.13s
2026-01-06 00:53:00.358195 | orchestrator | service-check-containers : ovn_db | Check containers -------------------- 5.56s
2026-01-06 00:53:00.358202 | orchestrator | service-check-containers : ovn_db | Check containers -------------------- 5.28s
2026-01-06 00:53:00.358209 | orchestrator | ovn-db : Copying over config.json files for services -------------------- 5.25s
2026-01-06 00:53:00.358215 | orchestrator | service-check-containers : Include tasks -------------------------------- 2.99s
2026-01-06 00:53:00.358222 | orchestrator | ovn-controller : Create br-int bridge on OpenvSwitch -------------------- 2.86s
2026-01-06 00:53:00.358229
| orchestrator | ovn-db : Ensuring config directories exist ------------------------------ 2.79s 2026-01-06 00:53:00.358242 | orchestrator | ovn-db : Ensuring config directories exist ------------------------------ 2.78s 2026-01-06 00:53:00.358248 | orchestrator | ovn-controller : Copying over config.json files for services ------------ 2.43s 2026-01-06 00:53:00.358255 | orchestrator | ovn-db : Generate config files for OVN relay services ------------------- 2.36s 2026-01-06 00:53:00.358262 | orchestrator | ovn-controller : Reload systemd config ---------------------------------- 2.19s 2026-01-06 00:53:00.358269 | orchestrator | service-check-containers : Include tasks -------------------------------- 2.17s 2026-01-06 00:53:00.358275 | orchestrator | ovn-controller : Copying over systemd override -------------------------- 2.10s 2026-01-06 00:53:00.358282 | orchestrator | ovn-controller : Ensuring config directories exist ---------------------- 1.87s 2026-01-06 00:53:00.358289 | orchestrator | 2026-01-06 00:53:00 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:53:00.358296 | orchestrator | 2026-01-06 00:53:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:53:03.392179 | orchestrator | 2026-01-06 00:53:03 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:53:03.394623 | orchestrator | 2026-01-06 00:53:03 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:53:03.395199 | orchestrator | 2026-01-06 00:53:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:53:06.445723 | orchestrator | 2026-01-06 00:53:06 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:53:06.447289 | orchestrator | 2026-01-06 00:53:06 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:53:06.447346 | orchestrator | 2026-01-06 00:53:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 
00:53:09.485416 | orchestrator | 2026-01-06 00:53:09 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:53:09.487114 | orchestrator | 2026-01-06 00:53:09 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:53:09.487544 | orchestrator | 2026-01-06 00:53:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:54:59.217735 | orchestrator | 2026-01-06 00:54:59 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state
STARTED 2026-01-06 00:54:59.218122 | orchestrator | 2026-01-06 00:54:59 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:54:59.218151 | orchestrator | 2026-01-06 00:54:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:55:02.266284 | orchestrator | 2026-01-06 00:55:02 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state STARTED 2026-01-06 00:55:02.268058 | orchestrator | 2026-01-06 00:55:02 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:55:02.268133 | orchestrator | 2026-01-06 00:55:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:55:05.332358 | orchestrator | 2026-01-06 00:55:05 | INFO  | Task c4bd3ef8-fced-4431-8d7e-2c6d742d735d is in state SUCCESS 2026-01-06 00:55:05.335745 | orchestrator | 2026-01-06 00:55:05.335837 | orchestrator | 2026-01-06 00:55:05.335861 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-01-06 00:55:05.335883 | orchestrator | 2026-01-06 00:55:05.335900 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-01-06 00:55:05.335918 | orchestrator | Tuesday 06 January 2026 00:48:09 +0000 (0:00:00.285) 0:00:00.285 ******* 2026-01-06 00:55:05.335935 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.335953 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.335969 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.335985 | orchestrator | 2026-01-06 00:55:05.336002 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-01-06 00:55:05.336030 | orchestrator | Tuesday 06 January 2026 00:48:10 +0000 (0:00:00.400) 0:00:00.686 ******* 2026-01-06 00:55:05.336044 | orchestrator | ok: [testbed-node-0] => (item=enable_loadbalancer_True) 2026-01-06 00:55:05.336055 | orchestrator | ok: [testbed-node-1] => (item=enable_loadbalancer_True) 2026-01-06 00:55:05.336065 | orchestrator | 
ok: [testbed-node-2] => (item=enable_loadbalancer_True) 2026-01-06 00:55:05.336075 | orchestrator | 2026-01-06 00:55:05.336084 | orchestrator | PLAY [Apply role loadbalancer] ************************************************* 2026-01-06 00:55:05.336094 | orchestrator | 2026-01-06 00:55:05.336104 | orchestrator | TASK [loadbalancer : include_tasks] ******************************************** 2026-01-06 00:55:05.336114 | orchestrator | Tuesday 06 January 2026 00:48:11 +0000 (0:00:01.328) 0:00:02.015 ******* 2026-01-06 00:55:05.336125 | orchestrator | included: /ansible/roles/loadbalancer/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.336136 | orchestrator | 2026-01-06 00:55:05.336146 | orchestrator | TASK [loadbalancer : Check IPv6 support] *************************************** 2026-01-06 00:55:05.336155 | orchestrator | Tuesday 06 January 2026 00:48:12 +0000 (0:00:01.173) 0:00:03.188 ******* 2026-01-06 00:55:05.336165 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.336175 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.336185 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.336195 | orchestrator | 2026-01-06 00:55:05.336205 | orchestrator | TASK [Setting sysctl values] *************************************************** 2026-01-06 00:55:05.336218 | orchestrator | Tuesday 06 January 2026 00:48:13 +0000 (0:00:00.799) 0:00:03.987 ******* 2026-01-06 00:55:05.336231 | orchestrator | included: sysctl for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.336265 | orchestrator | 2026-01-06 00:55:05.336277 | orchestrator | TASK [sysctl : Check IPv6 support] ********************************************* 2026-01-06 00:55:05.336289 | orchestrator | Tuesday 06 January 2026 00:48:14 +0000 (0:00:01.188) 0:00:05.176 ******* 2026-01-06 00:55:05.336301 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.336312 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.336324 | orchestrator | 
ok: [testbed-node-2] 2026-01-06 00:55:05.336335 | orchestrator | 2026-01-06 00:55:05.336347 | orchestrator | TASK [sysctl : Setting sysctl values] ****************************************** 2026-01-06 00:55:05.336358 | orchestrator | Tuesday 06 January 2026 00:48:15 +0000 (0:00:00.913) 0:00:06.090 ******* 2026-01-06 00:55:05.336370 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2026-01-06 00:55:05.336382 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2026-01-06 00:55:05.336393 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2026-01-06 00:55:05.336405 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2026-01-06 00:55:05.336416 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2026-01-06 00:55:05.336428 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2026-01-06 00:55:05.336440 | orchestrator | ok: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2026-01-06 00:55:05.336455 | orchestrator | ok: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2026-01-06 00:55:05.336471 | orchestrator | ok: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2026-01-06 00:55:05.336488 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2026-01-06 00:55:05.336502 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2026-01-06 00:55:05.336522 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2026-01-06 00:55:05.336544 | orchestrator | 2026-01-06 00:55:05.336560 | orchestrator | TASK [module-load : Load modules] 
********************************************** 2026-01-06 00:55:05.336575 | orchestrator | Tuesday 06 January 2026 00:48:19 +0000 (0:00:03.658) 0:00:09.748 ******* 2026-01-06 00:55:05.336591 | orchestrator | changed: [testbed-node-0] => (item=ip_vs) 2026-01-06 00:55:05.336606 | orchestrator | changed: [testbed-node-2] => (item=ip_vs) 2026-01-06 00:55:05.336886 | orchestrator | changed: [testbed-node-1] => (item=ip_vs) 2026-01-06 00:55:05.336908 | orchestrator | 2026-01-06 00:55:05.336926 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************ 2026-01-06 00:55:05.337169 | orchestrator | Tuesday 06 January 2026 00:48:20 +0000 (0:00:00.909) 0:00:10.658 ******* 2026-01-06 00:55:05.337195 | orchestrator | changed: [testbed-node-2] => (item=ip_vs) 2026-01-06 00:55:05.337206 | orchestrator | changed: [testbed-node-0] => (item=ip_vs) 2026-01-06 00:55:05.337216 | orchestrator | changed: [testbed-node-1] => (item=ip_vs) 2026-01-06 00:55:05.337225 | orchestrator | 2026-01-06 00:55:05.337236 | orchestrator | TASK [module-load : Drop module persistence] *********************************** 2026-01-06 00:55:05.337246 | orchestrator | Tuesday 06 January 2026 00:48:22 +0000 (0:00:02.018) 0:00:12.676 ******* 2026-01-06 00:55:05.337256 | orchestrator | skipping: [testbed-node-0] => (item=ip_vs)  2026-01-06 00:55:05.337266 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.337297 | orchestrator | skipping: [testbed-node-1] => (item=ip_vs)  2026-01-06 00:55:05.337307 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.337317 | orchestrator | skipping: [testbed-node-2] => (item=ip_vs)  2026-01-06 00:55:05.337326 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.337336 | orchestrator | 2026-01-06 00:55:05.337346 | orchestrator | TASK [loadbalancer : Ensuring config directories exist] ************************ 2026-01-06 00:55:05.337370 | orchestrator | Tuesday 06 January 2026 00:48:22 +0000 (0:00:00.847) 
0:00:13.524 ******* 2026-01-06 00:55:05.337392 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.337410 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.337421 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 
'timeout': '30'}}}) 2026-01-06 00:55:05.337432 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.337442 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.337459 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 
6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.337481 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.337493 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.337503 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.337513 | orchestrator | 2026-01-06 00:55:05.337524 | orchestrator | TASK [loadbalancer : Ensuring haproxy service config subdir exists] ************ 2026-01-06 00:55:05.337534 | orchestrator | Tuesday 06 January 2026 00:48:25 +0000 (0:00:02.377) 0:00:15.901 ******* 2026-01-06 00:55:05.337544 | orchestrator | changed: 
[testbed-node-0] 2026-01-06 00:55:05.337554 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.337578 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.337633 | orchestrator | 2026-01-06 00:55:05.337654 | orchestrator | TASK [loadbalancer : Ensuring proxysql service config subdirectories exist] **** 2026-01-06 00:55:05.337664 | orchestrator | Tuesday 06 January 2026 00:48:26 +0000 (0:00:01.398) 0:00:17.299 ******* 2026-01-06 00:55:05.337717 | orchestrator | changed: [testbed-node-2] => (item=users) 2026-01-06 00:55:05.337729 | orchestrator | changed: [testbed-node-1] => (item=users) 2026-01-06 00:55:05.337739 | orchestrator | changed: [testbed-node-0] => (item=users) 2026-01-06 00:55:05.337748 | orchestrator | changed: [testbed-node-2] => (item=rules) 2026-01-06 00:55:05.337758 | orchestrator | changed: [testbed-node-1] => (item=rules) 2026-01-06 00:55:05.337768 | orchestrator | changed: [testbed-node-0] => (item=rules) 2026-01-06 00:55:05.337777 | orchestrator | 2026-01-06 00:55:05.337787 | orchestrator | TASK [loadbalancer : Ensuring keepalived checks subdir exists] ***************** 2026-01-06 00:55:05.337796 | orchestrator | Tuesday 06 January 2026 00:48:29 +0000 (0:00:02.766) 0:00:20.066 ******* 2026-01-06 00:55:05.337806 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.337816 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.337825 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.337835 | orchestrator | 2026-01-06 00:55:05.337845 | orchestrator | TASK [loadbalancer : Remove mariadb.cfg if proxysql enabled] ******************* 2026-01-06 00:55:05.337854 | orchestrator | Tuesday 06 January 2026 00:48:30 +0000 (0:00:01.155) 0:00:21.221 ******* 2026-01-06 00:55:05.337864 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.337875 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.337884 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.337894 | orchestrator | 2026-01-06 
00:55:05.337904 | orchestrator | TASK [loadbalancer : Removing checks for services which are disabled] ********** 2026-01-06 00:55:05.337913 | orchestrator | Tuesday 06 January 2026 00:48:32 +0000 (0:00:01.746) 0:00:22.968 ******* 2026-01-06 00:55:05.337924 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.337952 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.337968 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 
'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.337979 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.337990 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.338001 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2025.1', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__761a0c396f454908b94500768c459f09b6aa6531', 
'__omit_place_holder__761a0c396f454908b94500768c459f09b6aa6531'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-01-06 00:55:05.338012 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.338354 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.338366 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2025.1', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__761a0c396f454908b94500768c459f09b6aa6531', '__omit_place_holder__761a0c396f454908b94500768c459f09b6aa6531'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-01-06 00:55:05.338377 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.338398 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.338414 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.338426 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.338436 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2025.1', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 
'__omit_place_holder__761a0c396f454908b94500768c459f09b6aa6531', '__omit_place_holder__761a0c396f454908b94500768c459f09b6aa6531'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-01-06 00:55:05.338446 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.338456 | orchestrator | 2026-01-06 00:55:05.338466 | orchestrator | TASK [loadbalancer : Copying checks for services which are enabled] ************ 2026-01-06 00:55:05.338476 | orchestrator | Tuesday 06 January 2026 00:48:33 +0000 (0:00:01.065) 0:00:24.034 ******* 2026-01-06 00:55:05.338486 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.338503 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-01-06 
00:55:05.338532 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.338548 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.338558 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.338569 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 
'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2025.1', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__761a0c396f454908b94500768c459f09b6aa6531', '__omit_place_holder__761a0c396f454908b94500768c459f09b6aa6531'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-01-06 00:55:05.338579 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.338596 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.338607 | orchestrator | skipping: 
[testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.338624 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.338640 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2025.1', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__761a0c396f454908b94500768c459f09b6aa6531', '__omit_place_holder__761a0c396f454908b94500768c459f09b6aa6531'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-01-06 00:55:05.338651 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 
'registry.osism.tech/kolla/haproxy-ssh:2025.1', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__761a0c396f454908b94500768c459f09b6aa6531', '__omit_place_holder__761a0c396f454908b94500768c459f09b6aa6531'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-01-06 00:55:05.338662 | orchestrator | 2026-01-06 00:55:05.338672 | orchestrator | TASK [loadbalancer : Copying over config.json files for services] ************** 2026-01-06 00:55:05.338704 | orchestrator | Tuesday 06 January 2026 00:48:36 +0000 (0:00:03.515) 0:00:27.550 ******* 2026-01-06 00:55:05.338715 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.338738 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.338764 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.338784 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.338800 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.338811 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.338821 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.338840 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.338865 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 
'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.338875 | orchestrator | 2026-01-06 00:55:05.338885 | orchestrator | TASK [loadbalancer : Copying over haproxy.cfg] ********************************* 2026-01-06 00:55:05.338895 | orchestrator | Tuesday 06 January 2026 00:48:41 +0000 (0:00:04.036) 0:00:31.587 ******* 2026-01-06 00:55:05.338931 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2026-01-06 00:55:05.338942 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2026-01-06 00:55:05.338952 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2026-01-06 00:55:05.338962 | orchestrator | 2026-01-06 00:55:05.338972 | orchestrator | TASK [loadbalancer : Copying over proxysql config] ***************************** 2026-01-06 00:55:05.338982 | orchestrator | Tuesday 06 January 2026 00:48:43 +0000 (0:00:02.079) 0:00:33.666 ******* 2026-01-06 00:55:05.338992 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2026-01-06 00:55:05.339002 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2026-01-06 00:55:05.339012 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2026-01-06 00:55:05.339022 | orchestrator | 2026-01-06 00:55:05.341153 | orchestrator | TASK [loadbalancer : Copying 
over haproxy single external frontend config] ***** 2026-01-06 00:55:05.341253 | orchestrator | Tuesday 06 January 2026 00:48:48 +0000 (0:00:04.989) 0:00:38.656 ******* 2026-01-06 00:55:05.341267 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.341278 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.341287 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.341295 | orchestrator | 2026-01-06 00:55:05.341305 | orchestrator | TASK [loadbalancer : Copying over custom haproxy services configuration] ******* 2026-01-06 00:55:05.341313 | orchestrator | Tuesday 06 January 2026 00:48:49 +0000 (0:00:01.296) 0:00:39.954 ******* 2026-01-06 00:55:05.341332 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2026-01-06 00:55:05.341343 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2026-01-06 00:55:05.341351 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2026-01-06 00:55:05.341358 | orchestrator | 2026-01-06 00:55:05.341366 | orchestrator | TASK [loadbalancer : Copying over keepalived.conf] ***************************** 2026-01-06 00:55:05.341374 | orchestrator | Tuesday 06 January 2026 00:48:51 +0000 (0:00:02.152) 0:00:42.106 ******* 2026-01-06 00:55:05.341383 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2026-01-06 00:55:05.341415 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2026-01-06 00:55:05.341424 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2026-01-06 00:55:05.341432 | orchestrator | 2026-01-06 00:55:05.341439 | orchestrator | TASK [loadbalancer : 
include_tasks] ******************************************** 2026-01-06 00:55:05.341447 | orchestrator | Tuesday 06 January 2026 00:48:53 +0000 (0:00:02.336) 0:00:44.443 ******* 2026-01-06 00:55:05.341455 | orchestrator | included: /ansible/roles/loadbalancer/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.341462 | orchestrator | 2026-01-06 00:55:05.341470 | orchestrator | TASK [loadbalancer : Copying over haproxy.pem] ********************************* 2026-01-06 00:55:05.341478 | orchestrator | Tuesday 06 January 2026 00:48:54 +0000 (0:00:00.690) 0:00:45.133 ******* 2026-01-06 00:55:05.341487 | orchestrator | changed: [testbed-node-0] => (item=haproxy.pem) 2026-01-06 00:55:05.341496 | orchestrator | changed: [testbed-node-2] => (item=haproxy.pem) 2026-01-06 00:55:05.341504 | orchestrator | changed: [testbed-node-1] => (item=haproxy.pem) 2026-01-06 00:55:05.341512 | orchestrator | 2026-01-06 00:55:05.341521 | orchestrator | TASK [loadbalancer : Copying over haproxy-internal.pem] ************************ 2026-01-06 00:55:05.341529 | orchestrator | Tuesday 06 January 2026 00:48:56 +0000 (0:00:01.720) 0:00:46.854 ******* 2026-01-06 00:55:05.341538 | orchestrator | changed: [testbed-node-0] => (item=haproxy-internal.pem) 2026-01-06 00:55:05.341547 | orchestrator | changed: [testbed-node-1] => (item=haproxy-internal.pem) 2026-01-06 00:55:05.341555 | orchestrator | changed: [testbed-node-2] => (item=haproxy-internal.pem) 2026-01-06 00:55:05.341563 | orchestrator | 2026-01-06 00:55:05.341572 | orchestrator | TASK [loadbalancer : Copying over proxysql-cert.pem] *************************** 2026-01-06 00:55:05.341579 | orchestrator | Tuesday 06 January 2026 00:48:59 +0000 (0:00:02.873) 0:00:49.728 ******* 2026-01-06 00:55:05.341586 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.341595 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.341603 | orchestrator | skipping: [testbed-node-2] 2026-01-06 
00:55:05.341611 | orchestrator | 2026-01-06 00:55:05.341620 | orchestrator | TASK [loadbalancer : Copying over proxysql-key.pem] **************************** 2026-01-06 00:55:05.341629 | orchestrator | Tuesday 06 January 2026 00:48:59 +0000 (0:00:00.349) 0:00:50.078 ******* 2026-01-06 00:55:05.341636 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.341644 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.341653 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.341661 | orchestrator | 2026-01-06 00:55:05.341670 | orchestrator | TASK [service-cert-copy : mariadb | Copying over extra CA certificates] ******** 2026-01-06 00:55:05.341706 | orchestrator | Tuesday 06 January 2026 00:48:59 +0000 (0:00:00.342) 0:00:50.420 ******* 2026-01-06 00:55:05.341718 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.341751 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.341779 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.341789 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.341800 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.341810 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.341819 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.341830 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.341847 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 
'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.341863 | orchestrator | 2026-01-06 00:55:05.341872 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS certificate] *** 2026-01-06 00:55:05.341880 | orchestrator | Tuesday 06 January 2026 00:49:03 +0000 (0:00:03.624) 0:00:54.045 ******* 2026-01-06 00:55:05.341892 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.341901 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.341909 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.341918 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.341928 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.341936 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': 
'30'}}})  2026-01-06 00:55:05.341945 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.341958 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.341976 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.341984 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.341993 | 
orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.342001 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.342009 | orchestrator | 2026-01-06 00:55:05.342052 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS key] ***** 2026-01-06 00:55:05.342061 | orchestrator | Tuesday 06 January 2026 00:49:04 +0000 (0:00:01.006) 0:00:55.051 ******* 2026-01-06 00:55:05.342069 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.342079 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 
'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.342087 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.342101 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.342116 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.342128 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.342136 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.342145 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.342153 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.342162 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': 
'30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.342169 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.342184 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.342192 | orchestrator | 2026-01-06 00:55:05.342200 | orchestrator | TASK [loadbalancer : Copying over haproxy start script] ************************ 2026-01-06 00:55:05.342208 | orchestrator | Tuesday 06 January 2026 00:49:05 +0000 (0:00:01.309) 0:00:56.361 ******* 2026-01-06 00:55:05.342216 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2026-01-06 00:55:05.342224 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2026-01-06 00:55:05.342232 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2026-01-06 00:55:05.342239 | orchestrator | 2026-01-06 00:55:05.342246 | orchestrator | TASK [loadbalancer : Copying over proxysql start script] *********************** 2026-01-06 00:55:05.342254 | orchestrator | Tuesday 06 January 2026 00:49:07 +0000 (0:00:01.792) 0:00:58.154 ******* 2026-01-06 00:55:05.342261 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2026-01-06 00:55:05.342274 | orchestrator | changed: [testbed-node-2] => 
(item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2026-01-06 00:55:05.342282 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2026-01-06 00:55:05.342289 | orchestrator | 2026-01-06 00:55:05.342297 | orchestrator | TASK [loadbalancer : Copying files for haproxy-ssh] **************************** 2026-01-06 00:55:05.342305 | orchestrator | Tuesday 06 January 2026 00:49:09 +0000 (0:00:01.745) 0:00:59.899 ******* 2026-01-06 00:55:05.342312 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2026-01-06 00:55:05.342324 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2026-01-06 00:55:05.342331 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2026-01-06 00:55:05.342339 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2026-01-06 00:55:05.342347 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.342355 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2026-01-06 00:55:05.342362 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.342370 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2026-01-06 00:55:05.342377 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.342385 | orchestrator | 2026-01-06 00:55:05.342393 | orchestrator | TASK [service-check-containers : loadbalancer | Check containers] ************** 2026-01-06 00:55:05.342400 | orchestrator | Tuesday 06 January 2026 00:49:10 +0000 (0:00:01.351) 0:01:01.251 ******* 2026-01-06 00:55:05.342407 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 
'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.342415 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.342434 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.342442 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 
'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.342457 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.342470 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.342478 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 
'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.342486 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.342494 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.342508 | orchestrator | 2026-01-06 00:55:05.342516 | orchestrator | TASK [service-check-containers : loadbalancer | Notify handlers to restart containers] *** 2026-01-06 00:55:05.342524 | orchestrator | Tuesday 06 January 2026 00:49:13 +0000 (0:00:02.903) 0:01:04.154 ******* 2026-01-06 00:55:05.342532 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 00:55:05.342539 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:55:05.342547 | orchestrator | } 2026-01-06 00:55:05.342555 | 
orchestrator | changed: [testbed-node-1] => { 2026-01-06 00:55:05.342562 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:55:05.342570 | orchestrator | } 2026-01-06 00:55:05.342578 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 00:55:05.342586 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:55:05.342593 | orchestrator | } 2026-01-06 00:55:05.342601 | orchestrator | 2026-01-06 00:55:05.342608 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-01-06 00:55:05.342615 | orchestrator | Tuesday 06 January 2026 00:49:14 +0000 (0:00:00.889) 0:01:05.044 ******* 2026-01-06 00:55:05.342622 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.342639 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 
00:55:05.342651 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.342659 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.342664 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.342669 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.342697 | orchestrator | skipping: 
[testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.342706 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.342712 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.342717 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.342727 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 
'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.342732 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.342737 | orchestrator | 2026-01-06 00:55:05.342741 | orchestrator | TASK [include_role : aodh] ***************************************************** 2026-01-06 00:55:05.342749 | orchestrator | Tuesday 06 January 2026 00:49:16 +0000 (0:00:02.084) 0:01:07.128 ******* 2026-01-06 00:55:05.342754 | orchestrator | included: aodh for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.342759 | orchestrator | 2026-01-06 00:55:05.342764 | orchestrator | TASK [haproxy-config : Copying over aodh haproxy config] *********************** 2026-01-06 00:55:05.342768 | orchestrator | Tuesday 06 January 2026 00:49:17 +0000 (0:00:00.720) 0:01:07.849 ******* 2026-01-06 00:55:05.342774 | orchestrator | changed: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2025.1', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': 
{'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.342785 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2025.1', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.342791 | orchestrator | changed: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2025.1', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.342796 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 
'registry.osism.tech/kolla/aodh-listener:2025.1', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.342805 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2025.1', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.342813 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2025.1', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.342822 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2025.1', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.342827 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2025.1', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.342832 | orchestrator | changed: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2025.1', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.342837 | orchestrator | skipping: [testbed-node-2] => (item={'key': 
'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2025.1', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.342847 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2025.1', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.342855 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2025.1', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.342860 | orchestrator | 2026-01-06 00:55:05.342868 | orchestrator | TASK [haproxy-config : Add configuration for aodh when using single external frontend] *** 2026-01-06 00:55:05.342873 | 
orchestrator | Tuesday 06 January 2026 00:49:23 +0000 (0:00:05.817) 0:01:13.667 ******* 2026-01-06 00:55:05.342878 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2025.1', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.342883 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2025.1', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.342887 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2025.1', 'volumes': 
['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.342895 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2025.1', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.342903 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.342916 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2025.1', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 
'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.342930 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2025.1', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.342960 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2025.1', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.342970 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2025.1', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.342977 | orchestrator | skipping: [testbed-node-0] 
2026-01-06 00:55:05.342985 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2025.1', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.342993 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2025.1', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.343009 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2025.1', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.343024 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2025.1', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.343031 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.343038 | orchestrator | 2026-01-06 00:55:05.343045 | orchestrator | TASK [haproxy-config : Configuring firewall for aodh] ************************** 2026-01-06 00:55:05.343053 | orchestrator | Tuesday 06 January 2026 00:49:24 +0000 (0:00:00.962) 0:01:14.629 ******* 2026-01-06 00:55:05.343060 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.343069 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.343075 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.343080 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 
'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.343085 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.343090 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.343095 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.343099 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.343104 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.343109 | orchestrator | 2026-01-06 00:55:05.343113 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL users config] *************** 2026-01-06 00:55:05.343118 | orchestrator | Tuesday 06 January 2026 00:49:25 +0000 (0:00:01.191) 0:01:15.821 ******* 2026-01-06 00:55:05.343123 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.343127 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.343132 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.343136 | orchestrator | 2026-01-06 00:55:05.343141 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL rules config] *************** 2026-01-06 00:55:05.343146 | orchestrator | Tuesday 06 January 2026 00:49:26 +0000 (0:00:01.309) 0:01:17.131 ******* 2026-01-06 00:55:05.343150 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.343155 | orchestrator | changed: 
[testbed-node-2] 2026-01-06 00:55:05.343160 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.343167 | orchestrator | 2026-01-06 00:55:05.343175 | orchestrator | TASK [include_role : barbican] ************************************************* 2026-01-06 00:55:05.343182 | orchestrator | Tuesday 06 January 2026 00:49:28 +0000 (0:00:01.992) 0:01:19.123 ******* 2026-01-06 00:55:05.343189 | orchestrator | included: barbican for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.343204 | orchestrator | 2026-01-06 00:55:05.343211 | orchestrator | TASK [haproxy-config : Copying over barbican haproxy config] ******************* 2026-01-06 00:55:05.343218 | orchestrator | Tuesday 06 January 2026 00:49:29 +0000 (0:00:00.818) 0:01:19.941 ******* 2026-01-06 00:55:05.343236 | orchestrator | changed: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2025.1', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.343245 | orchestrator | changed: [testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 
'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2025.1', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.343254 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2025.1', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.343262 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2025.1', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.343270 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2025.1', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.343288 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2025.1', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.343300 | orchestrator | changed: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2025.1', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.343308 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2025.1', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.343316 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2025.1', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': 
'30'}}})  2026-01-06 00:55:05.343324 | orchestrator | 2026-01-06 00:55:05.343332 | orchestrator | TASK [haproxy-config : Add configuration for barbican when using single external frontend] *** 2026-01-06 00:55:05.343339 | orchestrator | Tuesday 06 January 2026 00:49:34 +0000 (0:00:05.430) 0:01:25.372 ******* 2026-01-06 00:55:05.343348 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2025.1', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.343366 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2025.1', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.343379 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2025.1', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.343385 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.343390 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2025.1', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.343395 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': 
{'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2025.1', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.343400 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2025.1', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.343409 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.346181 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2025.1', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 
'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.346314 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2025.1', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.346330 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2025.1', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.346341 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.346353 | orchestrator | 2026-01-06 00:55:05.346364 | orchestrator | TASK [haproxy-config : Configuring firewall for barbican] ********************** 2026-01-06 00:55:05.346377 | orchestrator | Tuesday 06 January 2026 
00:49:36 +0000 (0:00:01.420) 0:01:26.793 ******* 2026-01-06 00:55:05.346388 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.346403 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.346415 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.346425 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.346435 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.346464 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.346475 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.346485 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.346495 
| orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.346504 | orchestrator | 2026-01-06 00:55:05.346515 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL users config] *********** 2026-01-06 00:55:05.346524 | orchestrator | Tuesday 06 January 2026 00:49:37 +0000 (0:00:01.775) 0:01:28.568 ******* 2026-01-06 00:55:05.346534 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.346544 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.346554 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.346563 | orchestrator | 2026-01-06 00:55:05.346573 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL rules config] *********** 2026-01-06 00:55:05.346583 | orchestrator | Tuesday 06 January 2026 00:49:39 +0000 (0:00:01.501) 0:01:30.070 ******* 2026-01-06 00:55:05.346592 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.346602 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.346612 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.346621 | orchestrator | 2026-01-06 00:55:05.346631 | orchestrator | TASK [include_role : blazar] *************************************************** 2026-01-06 00:55:05.346641 | orchestrator | Tuesday 06 January 2026 00:49:41 +0000 (0:00:02.212) 0:01:32.282 ******* 2026-01-06 00:55:05.346651 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.346660 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.346670 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.346697 | orchestrator | 2026-01-06 00:55:05.346733 | orchestrator | TASK [include_role : ceph-rgw] ************************************************* 2026-01-06 00:55:05.346743 | orchestrator | Tuesday 06 January 2026 00:49:41 +0000 (0:00:00.296) 0:01:32.579 ******* 2026-01-06 00:55:05.346753 | orchestrator | included: ceph-rgw for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.346763 | orchestrator | 2026-01-06 
00:55:05.346772 | orchestrator | TASK [haproxy-config : Copying over ceph-rgw haproxy config] ******************* 2026-01-06 00:55:05.346782 | orchestrator | Tuesday 06 January 2026 00:49:42 +0000 (0:00:00.790) 0:01:33.370 ******* 2026-01-06 00:55:05.346799 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}}) 2026-01-06 00:55:05.346812 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}}) 2026-01-06 00:55:05.346830 | 
orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}}) 2026-01-06 00:55:05.346841 | orchestrator | 2026-01-06 00:55:05.346851 | orchestrator | TASK [haproxy-config : Add configuration for ceph-rgw when using single external frontend] *** 2026-01-06 00:55:05.346862 | orchestrator | Tuesday 06 January 2026 00:49:47 +0000 (0:00:04.980) 0:01:38.350 ******* 2026-01-06 00:55:05.346873 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 
5']}}}})  2026-01-06 00:55:05.346883 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.346947 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}})  2026-01-06 00:55:05.346959 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.346969 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}})  2026-01-06 00:55:05.346986 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.346996 | 
orchestrator | 2026-01-06 00:55:05.347005 | orchestrator | TASK [haproxy-config : Configuring firewall for ceph-rgw] ********************** 2026-01-06 00:55:05.347015 | orchestrator | Tuesday 06 January 2026 00:49:49 +0000 (0:00:01.941) 0:01:40.292 ******* 2026-01-06 00:55:05.347027 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})  2026-01-06 00:55:05.347040 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})  2026-01-06 00:55:05.347052 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.347061 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})  2026-01-06 00:55:05.347072 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server 
testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})  2026-01-06 00:55:05.347082 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.347091 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})  2026-01-06 00:55:05.347109 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})  2026-01-06 00:55:05.347119 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.347129 | orchestrator | 2026-01-06 00:55:05.347139 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL users config] *********** 2026-01-06 00:55:05.347149 | orchestrator | Tuesday 06 January 2026 00:49:51 +0000 (0:00:02.022) 0:01:42.314 ******* 2026-01-06 00:55:05.347163 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.347174 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.347183 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.347193 | orchestrator | 2026-01-06 00:55:05.347203 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL rules config] *********** 2026-01-06 00:55:05.347213 | orchestrator | Tuesday 06 January 2026 00:49:52 +0000 (0:00:00.357) 0:01:42.672 ******* 2026-01-06 
00:55:05.347230 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.347239 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.347249 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.347259 | orchestrator | 2026-01-06 00:55:05.347269 | orchestrator | TASK [include_role : cinder] *************************************************** 2026-01-06 00:55:05.347278 | orchestrator | Tuesday 06 January 2026 00:49:53 +0000 (0:00:01.154) 0:01:43.826 ******* 2026-01-06 00:55:05.347288 | orchestrator | included: cinder for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.347298 | orchestrator | 2026-01-06 00:55:05.347308 | orchestrator | TASK [haproxy-config : Copying over cinder haproxy config] ********************* 2026-01-06 00:55:05.347317 | orchestrator | Tuesday 06 January 2026 00:49:54 +0000 (0:00:00.868) 0:01:44.694 ******* 2026-01-06 00:55:05.347328 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2025.1', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.347340 | orchestrator | skipping: 
[testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2025.1', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347353 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2025.1', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347372 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347388 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2025.1', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.347408 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2025.1', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347418 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 
'registry.osism.tech/kolla/cinder-volume:2025.1', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347428 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347445 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2025.1', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': 
False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.347467 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2025.1', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347478 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2025.1', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347488 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2025.1', 'privileged': 
True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347498 | orchestrator | 2026-01-06 00:55:05.347508 | orchestrator | TASK [haproxy-config : Add configuration for cinder when using single external frontend] *** 2026-01-06 00:55:05.347518 | orchestrator | Tuesday 06 January 2026 00:49:57 +0000 (0:00:03.374) 0:01:48.069 ******* 2026-01-06 00:55:05.347529 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2025.1', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.347540 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 
'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2025.1', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347589 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2025.1', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347614 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347638 | orchestrator | skipping: 
[testbed-node-0] 2026-01-06 00:55:05.347656 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2025.1', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.347673 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2025.1', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347726 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2025.1', 'privileged': True, 
'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347762 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2025.1', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.347795 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', 
'/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347811 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.347828 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2025.1', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347839 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2025.1', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347849 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 
'registry.osism.tech/kolla/cinder-backup:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.347859 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.347869 | orchestrator | 2026-01-06 00:55:05.347879 | orchestrator | TASK [haproxy-config : Configuring firewall for cinder] ************************ 2026-01-06 00:55:05.347889 | orchestrator | Tuesday 06 January 2026 00:49:58 +0000 (0:00:00.598) 0:01:48.667 ******* 2026-01-06 00:55:05.347905 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.347922 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.347932 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.347942 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.347958 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 
'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.347969 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.347979 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.347989 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.347999 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.348009 | orchestrator | 2026-01-06 00:55:05.348019 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL users config] ************* 2026-01-06 00:55:05.348029 | orchestrator | Tuesday 06 January 2026 00:49:58 +0000 (0:00:00.811) 0:01:49.479 ******* 2026-01-06 00:55:05.348039 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.348049 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.348058 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.348068 | orchestrator | 2026-01-06 00:55:05.348078 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL rules config] ************* 2026-01-06 00:55:05.348088 | orchestrator | Tuesday 06 January 2026 00:50:00 +0000 (0:00:01.433) 0:01:50.913 ******* 2026-01-06 00:55:05.348098 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.348107 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.348117 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.348128 | orchestrator | 2026-01-06 00:55:05.348139 | orchestrator | TASK [include_role : cloudkitty] 
*********************************************** 2026-01-06 00:55:05.348150 | orchestrator | Tuesday 06 January 2026 00:50:02 +0000 (0:00:02.184) 0:01:53.098 ******* 2026-01-06 00:55:05.348161 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.348172 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.348183 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.348194 | orchestrator | 2026-01-06 00:55:05.348205 | orchestrator | TASK [include_role : cyborg] *************************************************** 2026-01-06 00:55:05.348216 | orchestrator | Tuesday 06 January 2026 00:50:02 +0000 (0:00:00.342) 0:01:53.440 ******* 2026-01-06 00:55:05.348227 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.348238 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.348249 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.348260 | orchestrator | 2026-01-06 00:55:05.348271 | orchestrator | TASK [include_role : designate] ************************************************ 2026-01-06 00:55:05.348282 | orchestrator | Tuesday 06 January 2026 00:50:03 +0000 (0:00:00.317) 0:01:53.758 ******* 2026-01-06 00:55:05.348293 | orchestrator | included: designate for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.348311 | orchestrator | 2026-01-06 00:55:05.348323 | orchestrator | TASK [haproxy-config : Copying over designate haproxy config] ****************** 2026-01-06 00:55:05.348334 | orchestrator | Tuesday 06 January 2026 00:50:04 +0000 (0:00:01.082) 0:01:54.841 ******* 2026-01-06 00:55:05.348346 | orchestrator | changed: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2025.1', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.348365 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2025.1', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-01-06 00:55:05.348383 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2025.1', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348395 | orchestrator | skipping: [testbed-node-0] => 
(item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2025.1', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348407 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2025.1', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348420 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2025.1', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348438 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 
'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2025.1', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348457 | orchestrator | changed: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2025.1', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.348475 | orchestrator | changed: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2025.1', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.348487 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2025.1', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-01-06 00:55:05.348498 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2025.1', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-01-06 00:55:05.348516 | orchestrator | skipping: [testbed-node-2] => (item={'key': 
'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2025.1', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348528 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2025.1', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348546 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2025.1', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348564 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 
'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2025.1', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348576 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2025.1', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348588 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2025.1', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348605 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 
'registry.osism.tech/kolla/designate-worker:2025.1', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348617 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2025.1', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348629 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2025.1', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348647 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2025.1', 'volumes': 
['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348658 | orchestrator | 2026-01-06 00:55:05.348670 | orchestrator | TASK [haproxy-config : Add configuration for designate when using single external frontend] *** 2026-01-06 00:55:05.348714 | orchestrator | Tuesday 06 January 2026 00:50:08 +0000 (0:00:03.926) 0:01:58.767 ******* 2026-01-06 00:55:05.348727 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2025.1', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.348739 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2025.1', 'volumes': 
['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-01-06 00:55:05.348757 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2025.1', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348769 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2025.1', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348780 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2025.1', 'volumes': 
['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348808 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2025.1', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348821 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2025.1', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option 
httpchk']}}}})  2026-01-06 00:55:05.348832 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2025.1', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348850 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2025.1', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-01-06 00:55:05.348862 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.348874 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2025.1', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348885 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2025.1', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348908 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2025.1', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348920 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2025.1', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': 
'30'}}})  2026-01-06 00:55:05.348932 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2025.1', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.348950 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.348962 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2025.1', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.348973 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 
'registry.osism.tech/kolla/designate-backend-bind9:2025.1', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-01-06 00:55:05.348985 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2025.1', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.349008 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2025.1', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.349020 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/designate-producer:2025.1', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.349042 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2025.1', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.349054 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2025.1', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.349065 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.349076 | orchestrator | 2026-01-06 00:55:05.349087 | orchestrator | TASK [haproxy-config : Configuring firewall for designate] ********************* 2026-01-06 00:55:05.349098 | orchestrator | Tuesday 
06 January 2026 00:50:09 +0000 (0:00:00.878) 0:01:59.645 ******* 2026-01-06 00:55:05.349111 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.349125 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.349138 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.349149 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.349160 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.349172 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.349183 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.349194 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.349206 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.349217 | orchestrator | 2026-01-06 
00:55:05.349234 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL users config] ********** 2026-01-06 00:55:05.349245 | orchestrator | Tuesday 06 January 2026 00:50:10 +0000 (0:00:01.492) 0:02:01.137 ******* 2026-01-06 00:55:05.349257 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.349268 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.349279 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.349290 | orchestrator | 2026-01-06 00:55:05.349301 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL rules config] ********** 2026-01-06 00:55:05.349312 | orchestrator | Tuesday 06 January 2026 00:50:11 +0000 (0:00:01.242) 0:02:02.380 ******* 2026-01-06 00:55:05.349330 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.349345 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.349357 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.349368 | orchestrator | 2026-01-06 00:55:05.349379 | orchestrator | TASK [include_role : etcd] ***************************************************** 2026-01-06 00:55:05.349390 | orchestrator | Tuesday 06 January 2026 00:50:13 +0000 (0:00:02.111) 0:02:04.492 ******* 2026-01-06 00:55:05.349401 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.349412 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.349423 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.349434 | orchestrator | 2026-01-06 00:55:05.349445 | orchestrator | TASK [include_role : glance] *************************************************** 2026-01-06 00:55:05.349456 | orchestrator | Tuesday 06 January 2026 00:50:14 +0000 (0:00:00.342) 0:02:04.834 ******* 2026-01-06 00:55:05.349467 | orchestrator | included: glance for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.349478 | orchestrator | 2026-01-06 00:55:05.349489 | orchestrator | TASK [haproxy-config : Copying over glance haproxy config] 
********************* 2026-01-06 00:55:05.349500 | orchestrator | Tuesday 06 January 2026 00:50:15 +0000 (0:00:01.057) 0:02:05.891 ******* 2026-01-06 00:55:05.349513 | orchestrator | changed: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2025.1', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2026-01-06 00:55:05.350103 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2025.1', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-01-06 00:55:05.350224 | orchestrator | changed: [testbed-node-1] => (item={'key': 'glance-api', 'value': 
{'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2025.1', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2026-01-06 00:55:05.350271 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2025.1', 'volumes': 
['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-01-06 00:55:05.350326 | orchestrator | changed: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2025.1', 'environment': {'http_proxy': '', 'https_proxy': '', 
'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2026-01-06 00:55:05.350344 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2025.1', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-01-06 00:55:05.350366 | orchestrator | 2026-01-06 00:55:05.350392 | orchestrator | TASK [haproxy-config : Add configuration for glance when using single external frontend] *** 2026-01-06 00:55:05.350407 | orchestrator | Tuesday 06 January 2026 00:50:20 +0000 (0:00:04.700) 0:02:10.591 ******* 2026-01-06 00:55:05.350425 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2025.1', 
'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2026-01-06 00:55:05.350434 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2025.1', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': 
{}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-01-06 00:55:05.350449 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.350478 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2025.1', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': 
['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2026-01-06 00:55:05.350487 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2025.1', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 
192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-01-06 00:55:05.350507 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2025.1', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', ''], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2026-01-06 00:55:05.350523 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.350530 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2025.1', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 
6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-01-06 00:55:05.350538 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.350545 | orchestrator | 2026-01-06 00:55:05.350552 | orchestrator | TASK [haproxy-config : Configuring firewall for glance] ************************ 2026-01-06 00:55:05.350559 | orchestrator | Tuesday 06 January 2026 00:50:23 +0000 (0:00:03.153) 0:02:13.744 ******* 2026-01-06 00:55:05.350567 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server 
testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-01-06 00:55:05.350587 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-01-06 00:55:05.350596 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.350607 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-01-06 00:55:05.350614 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-01-06 00:55:05.350622 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.350629 | orchestrator | 
skipping: [testbed-node-2] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-01-06 00:55:05.350636 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-01-06 00:55:05.350643 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.350650 | orchestrator | 2026-01-06 00:55:05.350659 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL users config] ************* 2026-01-06 00:55:05.350670 | orchestrator | Tuesday 06 January 2026 00:50:27 +0000 (0:00:03.849) 0:02:17.594 ******* 2026-01-06 00:55:05.350735 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.350743 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.350750 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.350757 | orchestrator | 2026-01-06 00:55:05.350764 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL rules config] ************* 2026-01-06 00:55:05.350772 | orchestrator | Tuesday 06 January 2026 00:50:28 +0000 (0:00:01.329) 0:02:18.923 ******* 2026-01-06 00:55:05.350791 | orchestrator | 
changed: [testbed-node-0] 2026-01-06 00:55:05.350802 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.350812 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.350824 | orchestrator | 2026-01-06 00:55:05.350835 | orchestrator | TASK [include_role : gnocchi] ************************************************** 2026-01-06 00:55:05.350846 | orchestrator | Tuesday 06 January 2026 00:50:30 +0000 (0:00:02.206) 0:02:21.130 ******* 2026-01-06 00:55:05.350856 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.350863 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.350870 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.350877 | orchestrator | 2026-01-06 00:55:05.350883 | orchestrator | TASK [include_role : grafana] ************************************************** 2026-01-06 00:55:05.350890 | orchestrator | Tuesday 06 January 2026 00:50:30 +0000 (0:00:00.350) 0:02:21.481 ******* 2026-01-06 00:55:05.350897 | orchestrator | included: grafana for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.350903 | orchestrator | 2026-01-06 00:55:05.350910 | orchestrator | TASK [haproxy-config : Copying over grafana haproxy config] ******************** 2026-01-06 00:55:05.350917 | orchestrator | Tuesday 06 January 2026 00:50:31 +0000 (0:00:00.866) 0:02:22.348 ******* 2026-01-06 00:55:05.350932 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 
'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.350950 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.350963 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.350974 | orchestrator | 2026-01-06 00:55:05.350985 | orchestrator | TASK [haproxy-config : Add configuration for grafana when using single external frontend] *** 2026-01-06 00:55:05.350992 | orchestrator | Tuesday 06 January 2026 00:50:36 +0000 (0:00:04.270) 
0:02:26.618 ******* 2026-01-06 00:55:05.351000 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.351012 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.351020 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.351027 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.351041 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': 
['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.351049 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.351056 | orchestrator | 2026-01-06 00:55:05.351063 | orchestrator | TASK [haproxy-config : Configuring firewall for grafana] *********************** 2026-01-06 00:55:05.351073 | orchestrator | Tuesday 06 January 2026 00:50:36 +0000 (0:00:00.425) 0:02:27.043 ******* 2026-01-06 00:55:05.351081 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.351090 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.351098 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.351104 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.351111 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 
'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.351118 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.351125 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.351132 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.351145 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.351151 | orchestrator | 2026-01-06 00:55:05.351158 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL users config] ************ 2026-01-06 00:55:05.351165 | orchestrator | Tuesday 06 January 2026 00:50:37 +0000 (0:00:00.669) 0:02:27.713 ******* 2026-01-06 00:55:05.351172 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.351179 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.351185 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.351197 | orchestrator | 2026-01-06 00:55:05.351207 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL rules config] ************ 2026-01-06 00:55:05.351219 | orchestrator | Tuesday 06 January 2026 00:50:38 +0000 (0:00:01.599) 0:02:29.312 ******* 2026-01-06 00:55:05.351231 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.351242 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.351253 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.351261 | orchestrator | 2026-01-06 00:55:05.351267 | orchestrator | TASK [include_role : heat] ***************************************************** 2026-01-06 00:55:05.351274 | 
orchestrator | Tuesday 06 January 2026 00:50:40 +0000 (0:00:02.139) 0:02:31.451 ******* 2026-01-06 00:55:05.351281 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.351287 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.351294 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.351301 | orchestrator | 2026-01-06 00:55:05.351310 | orchestrator | TASK [include_role : horizon] ************************************************** 2026-01-06 00:55:05.351321 | orchestrator | Tuesday 06 January 2026 00:50:41 +0000 (0:00:00.336) 0:02:31.787 ******* 2026-01-06 00:55:05.351332 | orchestrator | included: horizon for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.351343 | orchestrator | 2026-01-06 00:55:05.351354 | orchestrator | TASK [haproxy-config : Copying over horizon haproxy config] ******************** 2026-01-06 00:55:05.351366 | orchestrator | Tuesday 06 January 2026 00:50:42 +0000 (0:00:01.031) 0:02:32.819 ******* 2026-01-06 00:55:05.351393 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': 
'30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-01-06 00:55:05.351416 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-01-06 00:55:05.351442 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 
'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-01-06 00:55:05.351459 | orchestrator | 2026-01-06 00:55:05.351466 | orchestrator | TASK [haproxy-config : Add configuration for horizon when using single external frontend] *** 2026-01-06 00:55:05.351473 | orchestrator | Tuesday 06 
January 2026 00:50:48 +0000 (0:00:06.052) 0:02:38.872 ******* 2026-01-06 00:55:05.351486 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 
'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-01-06 00:55:05.351494 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.351506 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': 
True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-01-06 00:55:05.351521 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.351534 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option 
httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-01-06 00:55:05.351545 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.351552 | orchestrator | 2026-01-06 00:55:05.351559 | orchestrator | TASK [haproxy-config : Configuring firewall for horizon] *********************** 2026-01-06 00:55:05.351566 | orchestrator | Tuesday 06 January 2026 00:50:48 +0000 (0:00:00.673) 0:02:39.546 ******* 2026-01-06 00:55:05.351579 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-01-06 00:55:05.351588 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }']}})  2026-01-06 00:55:05.351598 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-01-06 00:55:05.351607 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-01-06 00:55:05.351614 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-01-06 00:55:05.351621 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-01-06 00:55:05.351628 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 
'no'}})  2026-01-06 00:55:05.351635 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-01-06 00:55:05.351642 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2026-01-06 00:55:05.351651 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.351658 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2026-01-06 00:55:05.351665 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.351701 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-01-06 00:55:05.351711 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-01-06 00:55:05.351726 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 
'option httpchk'], 'tls_backend': 'no'}})  2026-01-06 00:55:05.351733 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-01-06 00:55:05.351740 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2026-01-06 00:55:05.351747 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.351754 | orchestrator | 2026-01-06 00:55:05.351761 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL users config] ************ 2026-01-06 00:55:05.351768 | orchestrator | Tuesday 06 January 2026 00:50:50 +0000 (0:00:01.348) 0:02:40.894 ******* 2026-01-06 00:55:05.351775 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.351781 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.351788 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.351795 | orchestrator | 2026-01-06 00:55:05.351802 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL rules config] ************ 2026-01-06 00:55:05.351809 | orchestrator | Tuesday 06 January 2026 00:50:52 +0000 (0:00:01.827) 0:02:42.721 ******* 2026-01-06 00:55:05.351817 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.351829 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.351840 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.351851 | orchestrator | 2026-01-06 00:55:05.351863 | orchestrator | TASK [include_role : influxdb] ************************************************* 2026-01-06 00:55:05.351873 | orchestrator | Tuesday 06 January 2026 00:50:54 +0000 (0:00:02.222) 0:02:44.944 ******* 2026-01-06 00:55:05.351885 | orchestrator | skipping: 
[testbed-node-0] 2026-01-06 00:55:05.351893 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.351904 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.351914 | orchestrator | 2026-01-06 00:55:05.351925 | orchestrator | TASK [include_role : ironic] *************************************************** 2026-01-06 00:55:05.351936 | orchestrator | Tuesday 06 January 2026 00:50:54 +0000 (0:00:00.561) 0:02:45.505 ******* 2026-01-06 00:55:05.351947 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.351955 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.351961 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.351968 | orchestrator | 2026-01-06 00:55:05.351975 | orchestrator | TASK [include_role : keystone] ************************************************* 2026-01-06 00:55:05.351982 | orchestrator | Tuesday 06 January 2026 00:50:55 +0000 (0:00:00.703) 0:02:46.208 ******* 2026-01-06 00:55:05.351988 | orchestrator | included: keystone for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.351995 | orchestrator | 2026-01-06 00:55:05.352001 | orchestrator | TASK [haproxy-config : Copying over keystone haproxy config] ******************* 2026-01-06 00:55:05.352008 | orchestrator | Tuesday 06 January 2026 00:50:57 +0000 (0:00:02.116) 0:02:48.325 ******* 2026-01-06 00:55:05.352018 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 
'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:55:05.352046 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:55:05.352066 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:55:05.352079 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': 
['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:55:05.352092 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:55:05.352100 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:55:05.352120 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:55:05.352131 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:55:05.352138 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 
'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:55:05.352145 | orchestrator | 2026-01-06 00:55:05.352152 | orchestrator | TASK [haproxy-config : Add configuration for keystone when using single external frontend] *** 2026-01-06 00:55:05.352159 | orchestrator | Tuesday 06 January 2026 00:51:02 +0000 (0:00:04.273) 0:02:52.599 ******* 2026-01-06 00:55:05.352167 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:55:05.352174 | orchestrator | 
skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:55:05.352186 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:55:05.352193 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.352210 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': 
{'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:55:05.352218 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:55:05.352226 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:55:05.352233 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.352240 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 
'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:55:05.352252 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:55:05.352479 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:55:05.352498 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.352505 | orchestrator | 2026-01-06 00:55:05.352518 | orchestrator | TASK [haproxy-config : Configuring firewall for keystone] ********************** 2026-01-06 00:55:05.352525 | orchestrator | Tuesday 06 January 2026 00:51:02 +0000 (0:00:00.689) 0:02:53.288 ******* 2026-01-06 00:55:05.352533 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-01-06 00:55:05.352540 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-01-06 00:55:05.352548 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.352555 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-01-06 00:55:05.352562 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-01-06 00:55:05.352569 | orchestrator | skipping: [testbed-node-1] 
2026-01-06 00:55:05.352582 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-01-06 00:55:05.352594 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-01-06 00:55:05.352616 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.352628 | orchestrator | 2026-01-06 00:55:05.352637 | orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL users config] *********** 2026-01-06 00:55:05.352644 | orchestrator | Tuesday 06 January 2026 00:51:03 +0000 (0:00:00.823) 0:02:54.112 ******* 2026-01-06 00:55:05.352651 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.352658 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.352664 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.352671 | orchestrator | 2026-01-06 00:55:05.352706 | orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL rules config] *********** 2026-01-06 00:55:05.352714 | orchestrator | Tuesday 06 January 2026 00:51:04 +0000 (0:00:01.191) 0:02:55.303 ******* 2026-01-06 00:55:05.352720 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.352727 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.352734 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.352741 | orchestrator | 2026-01-06 00:55:05.352747 | orchestrator | TASK [include_role : letsencrypt] ********************************************** 2026-01-06 00:55:05.352754 | orchestrator | Tuesday 06 January 2026 00:51:06 +0000 (0:00:01.976) 0:02:57.279 ******* 2026-01-06 
00:55:05.352761 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.352768 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.352774 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.352781 | orchestrator | 2026-01-06 00:55:05.352788 | orchestrator | TASK [include_role : magnum] *************************************************** 2026-01-06 00:55:05.352800 | orchestrator | Tuesday 06 January 2026 00:51:07 +0000 (0:00:00.319) 0:02:57.599 ******* 2026-01-06 00:55:05.352810 | orchestrator | included: magnum for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.352828 | orchestrator | 2026-01-06 00:55:05.352841 | orchestrator | TASK [haproxy-config : Copying over magnum haproxy config] ********************* 2026-01-06 00:55:05.352853 | orchestrator | Tuesday 06 January 2026 00:51:08 +0000 (0:00:01.294) 0:02:58.894 ******* 2026-01-06 00:55:05.352950 | orchestrator | changed: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2025.1', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.352965 | orchestrator | skipping: [testbed-node-0] => 
(item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2025.1', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.352973 | orchestrator | changed: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2025.1', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.352990 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 
'registry.osism.tech/kolla/magnum-conductor:2025.1', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.352998 | orchestrator | changed: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2025.1', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.353069 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2025.1', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': 
['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.353130 | orchestrator | 2026-01-06 00:55:05.353139 | orchestrator | TASK [haproxy-config : Add configuration for magnum when using single external frontend] *** 2026-01-06 00:55:05.353147 | orchestrator | Tuesday 06 January 2026 00:51:12 +0000 (0:00:04.091) 0:03:02.985 ******* 2026-01-06 00:55:05.353155 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2025.1', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.353169 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2025.1', 
'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.353177 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.353184 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2025.1', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.353259 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2025.1', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 
'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.353275 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.353291 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2025.1', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.353312 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2025.1', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': 
['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.353324 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.353335 | orchestrator | 2026-01-06 00:55:05.353345 | orchestrator | TASK [haproxy-config : Configuring firewall for magnum] ************************ 2026-01-06 00:55:05.353351 | orchestrator | Tuesday 06 January 2026 00:51:12 +0000 (0:00:00.588) 0:03:03.574 ******* 2026-01-06 00:55:05.353359 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.353366 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.353373 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.353380 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.353387 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.353394 | orchestrator | skipping: [testbed-node-1] 2026-01-06 
00:55:05.353401 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.353408 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.353418 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.353429 | orchestrator | 2026-01-06 00:55:05.353441 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL users config] ************* 2026-01-06 00:55:05.353452 | orchestrator | Tuesday 06 January 2026 00:51:13 +0000 (0:00:00.778) 0:03:04.352 ******* 2026-01-06 00:55:05.353544 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.353561 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.353573 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.353583 | orchestrator | 2026-01-06 00:55:05.353595 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL rules config] ************* 2026-01-06 00:55:05.353607 | orchestrator | Tuesday 06 January 2026 00:51:15 +0000 (0:00:01.445) 0:03:05.798 ******* 2026-01-06 00:55:05.353617 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.353636 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.353643 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.353649 | orchestrator | 2026-01-06 00:55:05.353656 | orchestrator | TASK [include_role : manila] *************************************************** 2026-01-06 00:55:05.353667 | orchestrator | Tuesday 06 January 2026 00:51:17 +0000 (0:00:02.133) 0:03:07.931 ******* 2026-01-06 00:55:05.353693 | orchestrator | included: manila for testbed-node-0, testbed-node-1, testbed-node-2 
2026-01-06 00:55:05.353703 | orchestrator | 2026-01-06 00:55:05.353714 | orchestrator | TASK [haproxy-config : Copying over manila haproxy config] ********************* 2026-01-06 00:55:05.353725 | orchestrator | Tuesday 06 January 2026 00:51:18 +0000 (0:00:01.346) 0:03:09.278 ******* 2026-01-06 00:55:05.353738 | orchestrator | changed: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.353752 | orchestrator | changed: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 
'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.353764 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.353775 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.353860 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', 
'/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.353891 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.353904 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.353915 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', 
'/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.353925 | orchestrator | changed: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.353937 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.354075 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 
'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.354107 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.354120 | orchestrator | 2026-01-06 00:55:05.354132 | orchestrator | TASK [haproxy-config : Add configuration for manila when using single external frontend] *** 2026-01-06 00:55:05.354144 | orchestrator | Tuesday 06 January 2026 00:51:23 +0000 (0:00:05.223) 0:03:14.502 ******* 2026-01-06 00:55:05.354180 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 
'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.354195 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.354208 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.354220 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 
'registry.osism.tech/kolla/manila-data:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.354240 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.354350 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.354369 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.354381 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.354393 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.354404 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.354416 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.354500 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.354522 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.354530 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': 
{'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.354537 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.354544 | orchestrator | 2026-01-06 00:55:05.354551 | orchestrator | TASK [haproxy-config : Configuring firewall for manila] ************************ 2026-01-06 00:55:05.354558 | orchestrator | Tuesday 06 January 2026 00:51:25 +0000 (0:00:01.305) 0:03:15.807 ******* 2026-01-06 00:55:05.354565 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.354578 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.354586 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.354593 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.354600 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option 
httpchk']}})  2026-01-06 00:55:05.354607 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.354614 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.354631 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.354638 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.354646 | orchestrator | 2026-01-06 00:55:05.354656 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL users config] ************* 2026-01-06 00:55:05.354732 | orchestrator | Tuesday 06 January 2026 00:51:26 +0000 (0:00:00.976) 0:03:16.784 ******* 2026-01-06 00:55:05.354743 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.354753 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.354765 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.354772 | orchestrator | 2026-01-06 00:55:05.354779 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL rules config] ************* 2026-01-06 00:55:05.354786 | orchestrator | Tuesday 06 January 2026 00:51:27 +0000 (0:00:01.349) 0:03:18.134 ******* 2026-01-06 00:55:05.354793 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.354799 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.354806 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.354813 | orchestrator | 2026-01-06 00:55:05.354819 | orchestrator | TASK [include_role : mariadb] ************************************************** 2026-01-06 00:55:05.354826 | orchestrator | Tuesday 06 January 2026 00:51:29 +0000 
(0:00:02.420) 0:03:20.554 ******* 2026-01-06 00:55:05.354892 | orchestrator | included: mariadb for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.354902 | orchestrator | 2026-01-06 00:55:05.354909 | orchestrator | TASK [mariadb : Ensure mysql monitor user exist] ******************************* 2026-01-06 00:55:05.354915 | orchestrator | Tuesday 06 January 2026 00:51:31 +0000 (0:00:01.710) 0:03:22.264 ******* 2026-01-06 00:55:05.354923 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2026-01-06 00:55:05.354929 | orchestrator | 2026-01-06 00:55:05.354936 | orchestrator | TASK [haproxy-config : Copying over mariadb haproxy config] ******************** 2026-01-06 00:55:05.354943 | orchestrator | Tuesday 06 January 2026 00:51:35 +0000 (0:00:03.359) 0:03:25.624 ******* 2026-01-06 00:55:05.354957 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 
rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:55:05.354973 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2025.1', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-01-06 00:55:05.354980 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.355039 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 
'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:55:05.355051 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2025.1', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-01-06 00:55:05.355058 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.355066 | orchestrator | skipping: 
[testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:55:05.355078 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': 
False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2025.1', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})
2026-01-06 00:55:05.355085 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.355093 | orchestrator |
2026-01-06 00:55:05.355105 | orchestrator | TASK [haproxy-config : Add configuration for mariadb when using single external frontend] ***
2026-01-06 00:55:05.355116 | orchestrator | Tuesday 06 January 2026 00:51:37 +0000 (0:00:01.983) 0:03:27.607 *******
2026-01-06 00:55:05.355281 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2026-01-06 00:55:05.355311 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2025.1', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})
2026-01-06 00:55:05.355331 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.355345 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2026-01-06 00:55:05.355423 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2025.1', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})
2026-01-06 00:55:05.355449 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.355457 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2026-01-06 00:55:05.355471 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2025.1', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})
2026-01-06 00:55:05.355478 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.355484 | orchestrator |
2026-01-06 00:55:05.355491 | orchestrator | TASK [haproxy-config : Configuring firewall for mariadb] ***********************
2026-01-06 00:55:05.355498 | orchestrator | Tuesday 06 January 2026 00:51:39 +0000 (0:00:02.144) 0:03:29.752 *******
2026-01-06 00:55:05.355506 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})
2026-01-06 00:55:05.355578 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})
2026-01-06 00:55:05.355595 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.355613 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})
2026-01-06 00:55:05.355625 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})
2026-01-06 00:55:05.355644 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.355651 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})
2026-01-06 00:55:05.355659 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})
2026-01-06 00:55:05.355666 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.355672 | orchestrator |
2026-01-06 00:55:05.355708 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL users config] ************
2026-01-06 00:55:05.355720 | orchestrator | Tuesday 06 January 2026 00:51:41 +0000 (0:00:02.705) 0:03:32.458 *******
2026-01-06 00:55:05.355731 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:55:05.355741 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:55:05.355751 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:55:05.355762 | orchestrator |
2026-01-06 00:55:05.355769 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL rules config] ************
2026-01-06 00:55:05.355776 | orchestrator | Tuesday 06 January 2026 00:51:43 +0000 (0:00:02.093) 0:03:34.552 *******
2026-01-06 00:55:05.355783 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.355789 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.355796 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.355803 | orchestrator |
2026-01-06 00:55:05.355810 | orchestrator | TASK [include_role : masakari] *************************************************
2026-01-06 00:55:05.355816 | orchestrator | Tuesday 06 January 2026 00:51:46 +0000 (0:00:02.055) 0:03:36.607 *******
2026-01-06 00:55:05.355823 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.355830 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.355837 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.355843 | orchestrator |
2026-01-06 00:55:05.355850 | orchestrator | TASK [include_role : memcached] ************************************************
2026-01-06 00:55:05.355857 | orchestrator | Tuesday 06 January 2026 00:51:46 +0000 (0:00:00.354) 0:03:36.961 *******
2026-01-06 00:55:05.355863 | orchestrator | included: memcached for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:55:05.355870 | orchestrator |
2026-01-06 00:55:05.355877 | orchestrator | TASK [haproxy-config : Copying over memcached haproxy config] ******************
2026-01-06 00:55:05.355884 | orchestrator | Tuesday 06 January 2026 00:51:48 +0000 (0:00:01.700) 0:03:38.662 *******
2026-01-06 00:55:05.355986 | orchestrator | changed: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2025.1', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-01-06 00:55:05.356008 | orchestrator | changed: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2025.1', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-01-06 00:55:05.356016 | orchestrator | changed: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2025.1', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-01-06 00:55:05.356023 | orchestrator |
2026-01-06 00:55:05.356030 | orchestrator | TASK [haproxy-config : Add configuration for memcached when using single external frontend] ***
2026-01-06 00:55:05.356037 | orchestrator | Tuesday 06 January 2026 00:51:49 +0000 (0:00:01.492) 0:03:40.155 *******
2026-01-06 00:55:05.356044 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2025.1', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-01-06 00:55:05.356051 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.356058 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2025.1', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-01-06 00:55:05.356065 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.356125 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2025.1', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-01-06 00:55:05.356140 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.356147 | orchestrator |
2026-01-06 00:55:05.356153 | orchestrator | TASK [haproxy-config : Configuring firewall for memcached] *********************
2026-01-06 00:55:05.356160 | orchestrator | Tuesday 06 January 2026 00:51:50 +0000 (0:00:00.455) 0:03:40.610 *******
2026-01-06 00:55:05.356171 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})
2026-01-06 00:55:05.356184 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.356196 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})
2026-01-06 00:55:05.356208 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.356219 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})
2026-01-06 00:55:05.356230 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.356240 | orchestrator |
2026-01-06 00:55:05.356252 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL users config] **********
2026-01-06 00:55:05.356263 | orchestrator | Tuesday 06 January 2026 00:51:51 +0000 (0:00:01.018) 0:03:41.629 *******
2026-01-06 00:55:05.356274 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.356285 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.356296 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.356307 | orchestrator |
2026-01-06 00:55:05.356319 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL rules config] **********
2026-01-06 00:55:05.356331 | orchestrator | Tuesday 06 January 2026 00:51:51 +0000 (0:00:00.466) 0:03:42.096 *******
2026-01-06 00:55:05.356342 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.356354 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.356366 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.356378 | orchestrator |
2026-01-06 00:55:05.356390 | orchestrator | TASK [include_role : mistral] **************************************************
2026-01-06 00:55:05.356401 | orchestrator | Tuesday 06 January 2026 00:51:52 +0000 (0:00:00.411) 0:03:43.571 *******
2026-01-06 00:55:05.356409 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.356420 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.356431 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.356442 | orchestrator |
2026-01-06 00:55:05.356453 | orchestrator | TASK [include_role : neutron] **************************************************
2026-01-06 00:55:05.356484 | orchestrator | Tuesday 06 January 2026 00:51:53 +0000 (0:00:00.411) 0:03:43.983 *******
2026-01-06 00:55:05.356495 | orchestrator | included: neutron for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:55:05.356506 | orchestrator |
2026-01-06 00:55:05.356519 | orchestrator | TASK [haproxy-config : Copying over neutron haproxy config] ********************
2026-01-06 00:55:05.356531 | orchestrator | Tuesday 06 January 2026 00:51:55 +0000 (0:00:01.678) 0:03:45.662 *******
2026-01-06 00:55:05.356544 | orchestrator | changed: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2025.1', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:55:05.356656 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2025.1', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})
2026-01-06 00:55:05.356694 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/neutron-dhcp-agent:2025.1', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})
2026-01-06 00:55:05.356706 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2025.1', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/neutron-l3-agent:2025.1', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})
2026-01-06 00:55:05.356720 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})
2026-01-06 00:55:05.356743 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2025.1', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})
2026-01-06 00:55:05.356829 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})
2026-01-06 00:55:05.356852 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})
2026-01-06 00:55:05.356865 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2025.1', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})
2026-01-06 00:55:05.356877 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})
2026-01-06 00:55:05.356889 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})
2026-01-06 00:55:05.356901 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})
2026-01-06 00:55:05.356925 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2025.1', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})
2026-01-06 00:55:05.357014 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2025.1', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})
2026-01-06 00:55:05.357032 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2025.1', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})
2026-01-06 00:55:05.357044 | orchestrator | changed: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2025.1', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:55:05.357056 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2025.1', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})
2026-01-06 00:55:05.357159 | orchestrator | changed: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2025.1', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:55:05.357185 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/neutron-dhcp-agent:2025.1', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})
2026-01-06 00:55:05.357197 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2025.1', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})
2026-01-06 00:55:05.357209 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2025.1', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/neutron-l3-agent:2025.1', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})
2026-01-06 00:55:05.357230 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/neutron-dhcp-agent:2025.1', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})
2026-01-06 00:55:05.357306 | orchestrator | skipping: [testbed-node-1] =>
(item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.357321 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2025.1', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/neutron-l3-agent:2025.1', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-01-06 00:55:05.357330 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2025.1', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.357338 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.357345 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.357357 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2025.1', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.357413 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 
'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-01-06 00:55:05.357427 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.357435 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2025.1', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-01-06 00:55:05.357442 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': 
{'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-01-06 00:55:05.357449 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.357462 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2025.1', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port 
neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-01-06 00:55:05.357484 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-01-06 00:55:05.357558 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.357580 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.357594 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2025.1', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.357606 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-01-06 00:55:05.357626 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.357637 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 
'registry.osism.tech/kolla/neutron-tls-proxy:2025.1', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-01-06 00:55:05.357747 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2025.1', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.357771 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2025.1', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-01-06 00:55:05.357784 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2025.1', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-01-06 00:55:05.357797 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2025.1', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-01-06 00:55:05.357817 | orchestrator | 2026-01-06 00:55:05.357829 | orchestrator | TASK [haproxy-config : Add 
configuration for neutron when using single external frontend] *** 2026-01-06 00:55:05.357836 | orchestrator | Tuesday 06 January 2026 00:51:59 +0000 (0:00:04.736) 0:03:50.398 ******* 2026-01-06 00:55:05.357845 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2025.1', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.357923 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2025.1', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port 
neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.357939 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/neutron-dhcp-agent:2025.1', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})  2026-01-06 00:55:05.357952 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2025.1', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/neutron-l3-agent:2025.1', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
"healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-01-06 00:55:05.357971 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.357983 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2025.1', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.357995 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.358343 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 
'registry.osism.tech/kolla/neutron-metadata-agent:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-01-06 00:55:05.358393 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2025.1', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-01-06 00:55:05.358406 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': 
'30'}}})  2026-01-06 00:55:05.358435 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-01-06 00:55:05.358445 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.358456 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2025.1', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.358558 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 
'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2025.1', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-01-06 00:55:05.358577 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2025.1', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-01-06 00:55:05.358588 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.358599 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2025.1', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.358620 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2025.1', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.358731 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/neutron-dhcp-agent:2025.1', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})  2026-01-06 00:55:05.358758 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2025.1', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/neutron-l3-agent:2025.1', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-01-06 00:55:05.358773 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.358796 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2025.1', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.358810 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.358824 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-01-06 00:55:05.358838 | orchestrator | skipping: [testbed-node-2] => 
(item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2025.1', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-01-06 00:55:05.358937 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2025.1', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.358960 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2025.1', 
'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.358981 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.358994 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/neutron-dhcp-agent:2025.1', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})  2026-01-06 00:55:05.359008 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-01-06 00:55:05.359114 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2025.1', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/neutron-l3-agent:2025.1', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-01-06 00:55:05.359135 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 
'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.359159 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.359172 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2025.1', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.359185 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 
'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2025.1', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.359291 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2025.1', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-01-06 00:55:05.359320 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.359336 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 
'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2025.1', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-01-06 00:55:05.359359 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2025.1', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-01-06 00:55:05.359373 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.359385 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2025.1', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': 
'3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-01-06 00:55:05.359397 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.359415 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-01-06 00:55:05.359513 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2025.1', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-01-06 00:55:05.359541 | 
orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2025.1', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.359564 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2025.1', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-01-06 00:55:05.359580 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 
'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2025.1', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-01-06 00:55:05.359595 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.359609 | orchestrator | 2026-01-06 00:55:05.359623 | orchestrator | TASK [haproxy-config : Configuring firewall for neutron] *********************** 2026-01-06 00:55:05.359638 | orchestrator | Tuesday 06 January 2026 00:52:02 +0000 (0:00:02.326) 0:03:52.725 ******* 2026-01-06 00:55:05.359653 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.359668 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.359733 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.359747 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.359760 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-01-06 
00:55:05.359774 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.359787 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.359893 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.359913 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.359945 | orchestrator | 2026-01-06 00:55:05.359959 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL users config] ************ 2026-01-06 00:55:05.359972 | orchestrator | Tuesday 06 January 2026 00:52:04 +0000 (0:00:02.770) 0:03:55.496 ******* 2026-01-06 00:55:05.359986 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.360005 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.360020 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.360032 | orchestrator | 2026-01-06 00:55:05.360045 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL rules config] ************ 2026-01-06 00:55:05.360056 | orchestrator | Tuesday 06 January 2026 00:52:06 +0000 (0:00:01.319) 0:03:56.816 ******* 2026-01-06 00:55:05.360069 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.360081 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.360094 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.360107 | orchestrator | 2026-01-06 00:55:05.360121 | orchestrator | TASK [include_role : placement] ************************************************ 2026-01-06 00:55:05.360134 | orchestrator | Tuesday 06 January 2026 00:52:08 +0000 (0:00:02.198) 0:03:59.014 ******* 2026-01-06 00:55:05.360147 | 
orchestrator | included: placement for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.360160 | orchestrator | 2026-01-06 00:55:05.360174 | orchestrator | TASK [haproxy-config : Copying over placement haproxy config] ****************** 2026-01-06 00:55:05.360188 | orchestrator | Tuesday 06 January 2026 00:52:10 +0000 (0:00:01.597) 0:04:00.612 ******* 2026-01-06 00:55:05.360203 | orchestrator | changed: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-01-06 00:55:05.360220 | orchestrator | changed: [testbed-node-1] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.11:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-01-06 00:55:05.360334 | orchestrator | changed: [testbed-node-2] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-01-06 00:55:05.360370 | orchestrator | 2026-01-06 00:55:05.360384 | orchestrator | TASK [haproxy-config : Add configuration for placement when using single external frontend] *** 2026-01-06 00:55:05.360398 | orchestrator | Tuesday 06 January 2026 00:52:14 +0000 (0:00:04.096) 0:04:04.708 ******* 2026-01-06 00:55:05.360419 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement-api', 'value': 
{'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-01-06 00:55:05.360434 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.360447 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': 
'8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-01-06 00:55:05.360459 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.360471 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-01-06 00:55:05.360492 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.360504 | orchestrator | 2026-01-06 00:55:05.360518 | orchestrator | TASK [haproxy-config : Configuring firewall for placement] ********************* 2026-01-06 00:55:05.360530 | orchestrator | Tuesday 06 January 2026 00:52:14 +0000 (0:00:00.503) 0:04:05.212 ******* 2026-01-06 00:55:05.360543 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-01-06 00:55:05.360650 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api_external', 
'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-01-06 00:55:05.360670 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.360717 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-01-06 00:55:05.360732 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-01-06 00:55:05.360746 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.360759 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-01-06 00:55:05.360770 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-01-06 00:55:05.360782 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.360794 | orchestrator | 2026-01-06 00:55:05.360807 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL users config] ********** 2026-01-06 00:55:05.360819 | orchestrator | Tuesday 06 January 2026 00:52:15 +0000 (0:00:01.233) 0:04:06.445 ******* 2026-01-06 00:55:05.360830 | orchestrator | changed: [testbed-node-0] 
2026-01-06 00:55:05.360842 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.360856 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.360868 | orchestrator | 2026-01-06 00:55:05.360881 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL rules config] ********** 2026-01-06 00:55:05.360894 | orchestrator | Tuesday 06 January 2026 00:52:17 +0000 (0:00:01.271) 0:04:07.717 ******* 2026-01-06 00:55:05.360907 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.360920 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.360932 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.360946 | orchestrator | 2026-01-06 00:55:05.360957 | orchestrator | TASK [include_role : nova] ***************************************************** 2026-01-06 00:55:05.360970 | orchestrator | Tuesday 06 January 2026 00:52:19 +0000 (0:00:01.892) 0:04:09.609 ******* 2026-01-06 00:55:05.360982 | orchestrator | included: nova for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.360994 | orchestrator | 2026-01-06 00:55:05.361006 | orchestrator | TASK [haproxy-config : Copying over nova haproxy config] *********************** 2026-01-06 00:55:05.361017 | orchestrator | Tuesday 06 January 2026 00:52:20 +0000 (0:00:01.197) 0:04:10.807 ******* 2026-01-06 00:55:05.361031 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'wsgi': 
'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.361135 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.361155 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', 
'/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.361165 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/nova-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.361180 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2025.1', 'enabled': True, 'volumes': 
['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.361188 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2025.1', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.361250 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/nova-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': 
'8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.361262 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.361270 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2025.1', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.361278 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/nova-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8775 '], 'timeout': '30'}, 'wsgi': 
'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.361292 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.361322 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2025.1', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.361331 | orchestrator | 2026-01-06 00:55:05.361339 | orchestrator | TASK [haproxy-config : Add configuration for nova when using single external frontend] *** 2026-01-06 00:55:05.361347 | orchestrator | Tuesday 06 January 2026 00:52:25 +0000 
(0:00:05.756) 0:04:16.564 ******* 2026-01-06 00:55:05.361359 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.361368 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': 
['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.361381 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/nova-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.361413 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/nova-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 
'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.361427 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.361435 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2025.1', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.361443 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.361451 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2025.1', 'enabled': True, 'volumes': 
['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.361464 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2025.1', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.361472 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.361480 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2025.1', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': 
{'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.361521 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/nova-api:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.361539 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2025.1', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.361553 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 
'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2025.1', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.361573 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.361585 | orchestrator | 2026-01-06 00:55:05.361596 | orchestrator | TASK [haproxy-config : Configuring firewall for nova] ************************** 2026-01-06 00:55:05.361607 | orchestrator | Tuesday 06 January 2026 00:52:27 +0000 (0:00:01.149) 0:04:17.713 ******* 2026-01-06 00:55:05.361621 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.361634 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.361647 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.361658 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 
'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.361670 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.361735 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.361750 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.361761 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.361817 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.361833 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.361845 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.361857 | orchestrator | skipping: [testbed-node-2] => 
(item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.361870 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.361923 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.362082 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.362102 | orchestrator | 2026-01-06 00:55:05.362115 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL users config] *************** 2026-01-06 00:55:05.362129 | orchestrator | Tuesday 06 January 2026 00:52:28 +0000 (0:00:01.264) 0:04:18.978 ******* 2026-01-06 00:55:05.362143 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.362155 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.362168 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.362179 | orchestrator | 2026-01-06 00:55:05.362192 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL rules config] *************** 2026-01-06 00:55:05.362205 | orchestrator | Tuesday 06 January 2026 00:52:30 +0000 (0:00:01.652) 0:04:20.630 ******* 2026-01-06 00:55:05.362217 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.362230 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.362241 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.362249 | orchestrator | 2026-01-06 00:55:05.362257 | orchestrator | TASK [include_role : nova-cell] ************************************************ 2026-01-06 00:55:05.362264 | orchestrator | Tuesday 06 January 2026 00:52:32 +0000 (0:00:02.220) 0:04:22.850 ******* 2026-01-06 00:55:05.362273 | orchestrator | included: 
nova-cell for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.362280 | orchestrator | 2026-01-06 00:55:05.362287 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-novncproxy] ****************** 2026-01-06 00:55:05.362295 | orchestrator | Tuesday 06 January 2026 00:52:33 +0000 (0:00:01.387) 0:04:24.238 ******* 2026-01-06 00:55:05.362303 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-novncproxy) 2026-01-06 00:55:05.362312 | orchestrator | 2026-01-06 00:55:05.362320 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config] *** 2026-01-06 00:55:05.362327 | orchestrator | Tuesday 06 January 2026 00:52:35 +0000 (0:00:01.636) 0:04:25.875 ******* 2026-01-06 00:55:05.362335 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2026-01-06 00:55:05.362345 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2026-01-06 00:55:05.362402 | orchestrator | changed: [testbed-node-1] => (item={'key': 
'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2026-01-06 00:55:05.362412 | orchestrator | 2026-01-06 00:55:05.362421 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-novncproxy when using single external frontend] *** 2026-01-06 00:55:05.362429 | orchestrator | Tuesday 06 January 2026 00:52:40 +0000 (0:00:04.912) 0:04:30.787 ******* 2026-01-06 00:55:05.362456 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-01-06 00:55:05.362464 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.362472 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-01-06 00:55:05.362479 | orchestrator | skipping: [testbed-node-1] 
2026-01-06 00:55:05.362487 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-01-06 00:55:05.362495 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.362502 | orchestrator | 2026-01-06 00:55:05.362510 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-novncproxy] ***** 2026-01-06 00:55:05.362517 | orchestrator | Tuesday 06 January 2026 00:52:42 +0000 (0:00:02.034) 0:04:32.821 ******* 2026-01-06 00:55:05.362525 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2026-01-06 00:55:05.362533 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2026-01-06 00:55:05.362541 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.362549 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2026-01-06 00:55:05.362557 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 
'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2026-01-06 00:55:05.362564 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.362572 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2026-01-06 00:55:05.362580 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2026-01-06 00:55:05.362587 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.362595 | orchestrator | 2026-01-06 00:55:05.362602 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2026-01-06 00:55:05.362614 | orchestrator | Tuesday 06 January 2026 00:52:44 +0000 (0:00:02.227) 0:04:35.049 ******* 2026-01-06 00:55:05.362621 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.362629 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.362636 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.362643 | orchestrator | 2026-01-06 00:55:05.362650 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2026-01-06 00:55:05.362708 | orchestrator | Tuesday 06 January 2026 00:52:47 +0000 (0:00:02.534) 0:04:37.583 ******* 2026-01-06 00:55:05.362720 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.362728 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.362735 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.362742 | orchestrator | 2026-01-06 00:55:05.362750 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-spicehtml5proxy] ************* 2026-01-06 
00:55:05.362756 | orchestrator | Tuesday 06 January 2026 00:52:49 +0000 (0:00:02.670) 0:04:40.254 ******* 2026-01-06 00:55:05.362769 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-spicehtml5proxy) 2026-01-06 00:55:05.362776 | orchestrator | 2026-01-06 00:55:05.362783 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-spicehtml5proxy haproxy config] *** 2026-01-06 00:55:05.362790 | orchestrator | Tuesday 06 January 2026 00:52:50 +0000 (0:00:00.755) 0:04:41.009 ******* 2026-01-06 00:55:05.362797 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-01-06 00:55:05.362805 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.362812 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-01-06 00:55:05.362819 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.362826 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 
'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-01-06 00:55:05.362833 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.362840 | orchestrator | 2026-01-06 00:55:05.362847 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-spicehtml5proxy when using single external frontend] *** 2026-01-06 00:55:05.362854 | orchestrator | Tuesday 06 January 2026 00:52:51 +0000 (0:00:01.396) 0:04:42.405 ******* 2026-01-06 00:55:05.362861 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-01-06 00:55:05.362873 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.362880 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': 
['timeout tunnel 1h']}}}})  2026-01-06 00:55:05.362887 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.362917 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2026-01-06 00:55:05.362926 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.362933 | orchestrator | 2026-01-06 00:55:05.362940 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-spicehtml5proxy] *** 2026-01-06 00:55:05.362947 | orchestrator | Tuesday 06 January 2026 00:52:53 +0000 (0:00:01.610) 0:04:44.016 ******* 2026-01-06 00:55:05.362953 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.362960 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.362967 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.362974 | orchestrator | 2026-01-06 00:55:05.362984 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2026-01-06 00:55:05.362991 | orchestrator | Tuesday 06 January 2026 00:52:55 +0000 (0:00:01.700) 0:04:45.716 ******* 2026-01-06 00:55:05.362998 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.363006 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.363012 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.363019 | orchestrator | 2026-01-06 00:55:05.363026 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2026-01-06 00:55:05.363033 | orchestrator | Tuesday 06 January 2026 00:52:57 +0000 (0:00:02.554) 0:04:48.271 
******* 2026-01-06 00:55:05.363039 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.363046 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.363053 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.363060 | orchestrator | 2026-01-06 00:55:05.363066 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-serialproxy] ***************** 2026-01-06 00:55:05.363073 | orchestrator | Tuesday 06 January 2026 00:53:00 +0000 (0:00:03.232) 0:04:51.503 ******* 2026-01-06 00:55:05.363080 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-serialproxy) 2026-01-06 00:55:05.363087 | orchestrator | 2026-01-06 00:55:05.363094 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-serialproxy haproxy config] *** 2026-01-06 00:55:05.363101 | orchestrator | Tuesday 06 January 2026 00:53:01 +0000 (0:00:00.877) 0:04:52.380 ******* 2026-01-06 00:55:05.363108 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-01-06 00:55:05.363115 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.363122 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': 
False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-01-06 00:55:05.363135 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.363142 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-01-06 00:55:05.363149 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.363156 | orchestrator | 2026-01-06 00:55:05.363209 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-serialproxy when using single external frontend] *** 2026-01-06 00:55:05.363217 | orchestrator | Tuesday 06 January 2026 00:53:03 +0000 (0:00:01.559) 0:04:53.940 ******* 2026-01-06 00:55:05.363224 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-01-06 00:55:05.363232 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.363263 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': 
False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-01-06 00:55:05.363271 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.363282 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-01-06 00:55:05.363290 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.363297 | orchestrator | 2026-01-06 00:55:05.363304 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-serialproxy] **** 2026-01-06 00:55:05.363311 | orchestrator | Tuesday 06 January 2026 00:53:04 +0000 (0:00:01.363) 0:04:55.303 ******* 2026-01-06 00:55:05.363318 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.363324 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.363331 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.363338 | orchestrator | 2026-01-06 00:55:05.363345 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2026-01-06 00:55:05.363353 | orchestrator | Tuesday 06 January 2026 00:53:06 +0000 (0:00:01.605) 0:04:56.909 ******* 2026-01-06 00:55:05.363360 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.363372 | 
orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.363380 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.363387 | orchestrator | 2026-01-06 00:55:05.363394 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2026-01-06 00:55:05.363401 | orchestrator | Tuesday 06 January 2026 00:53:09 +0000 (0:00:02.814) 0:04:59.723 ******* 2026-01-06 00:55:05.363407 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.363414 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.363421 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.363428 | orchestrator | 2026-01-06 00:55:05.363435 | orchestrator | TASK [include_role : octavia] ************************************************** 2026-01-06 00:55:05.363441 | orchestrator | Tuesday 06 January 2026 00:53:12 +0000 (0:00:02.983) 0:05:02.706 ******* 2026-01-06 00:55:05.363448 | orchestrator | included: octavia for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.363455 | orchestrator | 2026-01-06 00:55:05.363462 | orchestrator | TASK [haproxy-config : Copying over octavia haproxy config] ******************** 2026-01-06 00:55:05.363469 | orchestrator | Tuesday 06 January 2026 00:53:13 +0000 (0:00:01.820) 0:05:04.527 ******* 2026-01-06 00:55:05.363476 | orchestrator | changed: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2025.1', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': 
False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2026-01-06 00:55:05.363485 | orchestrator | changed: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2025.1', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2026-01-06 00:55:05.363514 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2025.1', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2026-01-06 00:55:05.363524 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': 
True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2025.1', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.363536 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2025.1', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2026-01-06 00:55:05.363543 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2025.1', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.363550 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2025.1', 'volumes': 
['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.363557 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2025.1', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.363583 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2025.1', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.363599 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2025.1', 'volumes': 
['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.363607 | orchestrator | changed: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2025.1', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2026-01-06 00:55:05.363619 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2025.1', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2026-01-06 00:55:05.363626 | orchestrator | skipping: [testbed-node-2] => (item={'key': 
'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2025.1', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.363633 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2025.1', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.363641 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2025.1', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.363648 | orchestrator | 2026-01-06 00:55:05.363673 | orchestrator | TASK [haproxy-config : Add 
configuration for octavia when using single external frontend] *** 2026-01-06 00:55:05.363702 | orchestrator | Tuesday 06 January 2026 00:53:17 +0000 (0:00:04.013) 0:05:08.540 ******* 2026-01-06 00:55:05.363713 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2025.1', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2026-01-06 00:55:05.363726 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2025.1', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2026-01-06 00:55:05.363733 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 
'registry.osism.tech/kolla/octavia-health-manager:2025.1', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.363740 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2025.1', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2026-01-06 00:55:05.363748 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2025.1', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2026-01-06 00:55:05.363755 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.363783 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 
'image': 'registry.osism.tech/kolla/octavia-api:2025.1', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})
2026-01-06 00:55:05.363800 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2025.1', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})
2026-01-06 00:55:05.363808 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2025.1', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})
2026-01-06 00:55:05.363815 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2025.1', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})
2026-01-06 00:55:05.363822 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2025.1', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})
2026-01-06 00:55:05.363829 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.363837 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2025.1', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})
2026-01-06 00:55:05.363862 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2025.1', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})
2026-01-06 00:55:05.363881 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2025.1', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})
2026-01-06 00:55:05.363889 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2025.1', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})
2026-01-06 00:55:05.363896 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2025.1', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})
2026-01-06 00:55:05.363903 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.363910 | orchestrator |
2026-01-06 00:55:05.363917 | orchestrator | TASK [haproxy-config : Configuring firewall for octavia] ***********************
2026-01-06 00:55:05.363924 | orchestrator | Tuesday 06 January 2026 00:53:19 +0000 (0:00:01.633) 0:05:10.174 *******
2026-01-06 00:55:05.363931 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-01-06 00:55:05.363938 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-01-06 00:55:05.363946 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.363953 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-01-06 00:55:05.363960 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-01-06 00:55:05.363967 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.363974 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-01-06 00:55:05.363981 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-01-06 00:55:05.363988 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.363995 | orchestrator |
2026-01-06 00:55:05.364001 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL users config] ************
2026-01-06 00:55:05.364014 | orchestrator | Tuesday 06 January 2026 00:53:20 +0000 (0:00:01.001) 0:05:11.175 *******
2026-01-06 00:55:05.364021 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:55:05.364028 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:55:05.364034 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:55:05.364041 | orchestrator |
2026-01-06 00:55:05.364048 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL rules config] ************
2026-01-06 00:55:05.364055 | orchestrator | Tuesday 06 January 2026 00:53:21 +0000 (0:00:01.290) 0:05:12.465 *******
2026-01-06 00:55:05.364082 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:55:05.364090 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:55:05.364097 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:55:05.364104 | orchestrator |
2026-01-06 00:55:05.364111 | orchestrator | TASK [include_role : opensearch] ***********************************************
2026-01-06 00:55:05.364117 | orchestrator | Tuesday 06 January 2026 00:53:24 +0000 (0:00:02.230) 0:05:14.696 *******
2026-01-06 00:55:05.364124 | orchestrator | included: opensearch for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:55:05.364131 | orchestrator |
2026-01-06 00:55:05.364138 | orchestrator | TASK [haproxy-config : Copying over opensearch haproxy config] *****************
2026-01-06 00:55:05.364149 | orchestrator | Tuesday 06 January 2026 00:53:25 +0000 (0:00:01.763) 0:05:16.459 *******
2026-01-06 00:55:05.364156 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:55:05.364165 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:55:05.364173 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:55:05.364220 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:55:05.364235 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:55:05.364243 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:55:05.364251 | orchestrator |
2026-01-06 00:55:05.364258 | orchestrator | TASK [haproxy-config : Add configuration for opensearch when using single external frontend] ***
2026-01-06 00:55:05.364264 | orchestrator | Tuesday 06 January 2026 00:53:31 +0000 (0:00:05.469) 0:05:21.929 *******
2026-01-06 00:55:05.364272 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:55:05.364306 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:55:05.364316 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.364323 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:55:05.364331 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:55:05.364343 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.364350 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:55:05.364383 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:55:05.364392 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.364399 | orchestrator |
2026-01-06 00:55:05.364406 | orchestrator | TASK [haproxy-config : Configuring firewall for opensearch] ********************
2026-01-06 00:55:05.364413 | orchestrator | Tuesday 06 January 2026 00:53:32 +0000 (0:00:00.671) 0:05:22.600 *******
2026-01-06 00:55:05.364420 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}})
2026-01-06 00:55:05.364428 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})
2026-01-06 00:55:05.364436 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})
2026-01-06 00:55:05.364443 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.364450 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}})
2026-01-06 00:55:05.364458 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})
2026-01-06 00:55:05.364471 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})
2026-01-06 00:55:05.364479 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.364486 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}})
2026-01-06 00:55:05.364493 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})
2026-01-06 00:55:05.364500 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})
2026-01-06 00:55:05.364507 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.364514 | orchestrator |
2026-01-06 00:55:05.364520 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL users config] *********
2026-01-06 00:55:05.364528 | orchestrator | Tuesday 06 January 2026 00:53:33 +0000 (0:00:01.752) 0:05:24.353 *******
2026-01-06 00:55:05.364534 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.364541 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.364548 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.364554 | orchestrator |
2026-01-06 00:55:05.364561 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL rules config] *********
2026-01-06 00:55:05.364568 | orchestrator | Tuesday 06 January 2026 00:53:34 +0000 (0:00:00.466) 0:05:24.820 *******
2026-01-06 00:55:05.364603 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:55:05.364612 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:55:05.364619 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:55:05.364626 | orchestrator |
2026-01-06 00:55:05.364633 | orchestrator | TASK [include_role : prometheus] ***********************************************
2026-01-06 00:55:05.364640 | orchestrator | Tuesday 06 January 2026 00:53:35 +0000 (0:00:01.373) 0:05:26.194 *******
2026-01-06 00:55:05.364646 | orchestrator | included: prometheus for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:55:05.364653 | orchestrator |
2026-01-06 00:55:05.364660 | orchestrator | TASK [haproxy-config : Copying over prometheus haproxy config] *****************
2026-01-06 00:55:05.364729 | orchestrator | Tuesday 06 January 2026 00:53:37 +0000 (0:00:01.988) 0:05:28.182 *******
2026-01-06 00:55:05.364740 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})
2026-01-06 00:55:05.364754 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-01-06 00:55:05.364763 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:55:05.364771 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:55:05.364778 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-01-06 00:55:05.364812 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})
2026-01-06 00:55:05.364822 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})
2026-01-06 00:55:05.364835 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-01-06 00:55:05.364842 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-01-06 00:55:05.364849 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:55:05.364857 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:55:05.364883 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:55:05.364896 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:55:05.364904 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-01-06 00:55:05.364916 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-01-06 00:55:05.364924 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:55:05.364932 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})
2026-01-06 00:55:05.364956 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 00:55:05.364969 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True,
'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.364976 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-01-06 00:55:05.364988 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.364996 | orchestrator | skipping: [testbed-node-2] => (item={'key': 
'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-01-06 00:55:05.365003 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365029 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365040 | orchestrator | 
skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-01-06 00:55:05.365048 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:55:05.365060 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2025.1', 'volumes': 
['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-01-06 00:55:05.365067 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365074 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365081 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': 
['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-01-06 00:55:05.365089 | orchestrator | 2026-01-06 00:55:05.365113 | orchestrator | TASK [haproxy-config : Add configuration for prometheus when using single external frontend] *** 2026-01-06 00:55:05.365122 | orchestrator | Tuesday 06 January 2026 00:53:41 +0000 (0:00:04.395) 0:05:32.578 ******* 2026-01-06 00:55:05.365144 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-01-06 00:55:05.365157 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': 
['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 00:55:05.365165 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365172 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365179 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 00:55:05.365208 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 
'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.365221 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-01-06 00:55:05.365236 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-01-06 00:55:05.365244 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365251 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': 
['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 00:55:05.365258 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365268 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365280 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-01-06 00:55:05.365291 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.365299 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': 
{'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365306 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-01-06 00:55:05.365314 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 00:55:05.365321 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 00:55:05.365328 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365346 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 
'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.365359 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365367 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-01-06 00:55:05.365374 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 
'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 00:55:05.365381 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365392 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 
'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:55:05.365407 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365414 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-01-06 00:55:05.365422 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-01-06 00:55:05.365429 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365436 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.365443 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 00:55:05.365451 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-01-06 00:55:05.365458 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.365469 | orchestrator | 2026-01-06 00:55:05.365476 | orchestrator | TASK [haproxy-config : Configuring firewall for prometheus] ******************** 2026-01-06 
00:55:05.365483 | orchestrator | Tuesday 06 January 2026 00:53:42 +0000 (0:00:00.851) 0:05:33.430 ******* 2026-01-06 00:55:05.365494 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-01-06 00:55:05.365505 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-01-06 00:55:05.365514 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.365522 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.365529 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.365537 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': 
['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-01-06 00:55:05.365544 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-01-06 00:55:05.365552 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.365559 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.365567 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.365574 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-01-06 00:55:05.365581 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 
'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-01-06 00:55:05.365592 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.365603 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-01-06 00:55:05.365611 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.365618 | orchestrator | 2026-01-06 00:55:05.365625 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL users config] ********* 2026-01-06 00:55:05.365632 | orchestrator | Tuesday 06 January 2026 00:53:44 +0000 (0:00:01.496) 0:05:34.926 ******* 2026-01-06 00:55:05.365643 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.365650 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.365657 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.365663 | orchestrator | 2026-01-06 00:55:05.365670 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL rules config] ********* 2026-01-06 00:55:05.365694 | orchestrator | Tuesday 06 January 2026 00:53:44 +0000 (0:00:00.500) 0:05:35.426 ******* 2026-01-06 00:55:05.365702 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.365709 | orchestrator | 
skipping: [testbed-node-1] 2026-01-06 00:55:05.365716 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.365723 | orchestrator | 2026-01-06 00:55:05.365730 | orchestrator | TASK [include_role : rabbitmq] ************************************************* 2026-01-06 00:55:05.365737 | orchestrator | Tuesday 06 January 2026 00:53:46 +0000 (0:00:01.417) 0:05:36.844 ******* 2026-01-06 00:55:05.365744 | orchestrator | included: rabbitmq for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.365751 | orchestrator | 2026-01-06 00:55:05.365758 | orchestrator | TASK [haproxy-config : Copying over rabbitmq haproxy config] ******************* 2026-01-06 00:55:05.365764 | orchestrator | Tuesday 06 January 2026 00:53:47 +0000 (0:00:01.522) 0:05:38.367 ******* 2026-01-06 00:55:05.365771 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-01-06 00:55:05.365780 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 
'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-01-06 00:55:05.365797 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-01-06 00:55:05.365805 | orchestrator | 2026-01-06 00:55:05.365812 | orchestrator | TASK [haproxy-config : Add configuration for rabbitmq when using single external 
frontend] *** 2026-01-06 00:55:05.365818 | orchestrator | Tuesday 06 January 2026 00:53:50 +0000 (0:00:03.128) 0:05:41.495 ******* 2026-01-06 00:55:05.365829 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-01-06 00:55:05.365837 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.365844 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-01-06 00:55:05.365852 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.365859 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2025.1', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-01-06 00:55:05.365870 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.365877 | orchestrator | 2026-01-06 00:55:05.365884 | orchestrator | TASK [haproxy-config : Configuring firewall for rabbitmq] ********************** 2026-01-06 00:55:05.365891 | orchestrator | Tuesday 06 January 2026 00:53:51 +0000 (0:00:00.478) 0:05:41.973 ******* 2026-01-06 00:55:05.365899 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2026-01-06 00:55:05.365907 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.365913 | orchestrator | skipping: [testbed-node-1] 
=> (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2026-01-06 00:55:05.365920 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.365927 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2026-01-06 00:55:05.365934 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.365941 | orchestrator | 2026-01-06 00:55:05.365951 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL users config] *********** 2026-01-06 00:55:05.365958 | orchestrator | Tuesday 06 January 2026 00:53:52 +0000 (0:00:00.678) 0:05:42.651 ******* 2026-01-06 00:55:05.365965 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.365972 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.365979 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.365986 | orchestrator | 2026-01-06 00:55:05.365993 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL rules config] *********** 2026-01-06 00:55:05.366000 | orchestrator | Tuesday 06 January 2026 00:53:53 +0000 (0:00:01.319) 0:05:43.970 ******* 2026-01-06 00:55:05.366007 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.366051 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.366061 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.366068 | orchestrator | 2026-01-06 00:55:05.366075 | orchestrator | TASK [include_role : skyline] ************************************************** 2026-01-06 00:55:05.366082 | orchestrator | Tuesday 06 January 2026 00:53:54 +0000 (0:00:01.066) 0:05:45.037 ******* 2026-01-06 00:55:05.366089 | orchestrator | included: skyline for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.366096 | orchestrator | 2026-01-06 00:55:05.366103 | orchestrator | TASK [haproxy-config : Copying over 
skyline haproxy config] ******************** 2026-01-06 00:55:05.366109 | orchestrator | Tuesday 06 January 2026 00:53:56 +0000 (0:00:01.930) 0:05:46.968 ******* 2026-01-06 00:55:05.366117 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2025.1', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}}) 2026-01-06 00:55:05.366132 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2025.1', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 
'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}}) 2026-01-06 00:55:05.366140 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2025.1', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}}) 2026-01-06 00:55:05.366156 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2025.1', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': 
{'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-01-06 00:55:05.366165 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2025.1', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-01-06 00:55:05.366178 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2025.1', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-01-06 00:55:05.366186 | orchestrator | 2026-01-06 00:55:05.366193 | orchestrator | TASK [haproxy-config : Add configuration for skyline when using single external frontend] *** 2026-01-06 00:55:05.366200 | orchestrator | Tuesday 06 January 2026 00:54:03 +0000 (0:00:06.756) 0:05:53.725 ******* 2026-01-06 00:55:05.366211 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2025.1', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}})  2026-01-06 00:55:05.366222 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-console', 'value': 
{'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2025.1', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-01-06 00:55:05.366229 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.366237 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2025.1', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 
'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}})  2026-01-06 00:55:05.366249 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2025.1', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-01-06 00:55:05.366256 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.366268 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2025.1', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': 
['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}})  2026-01-06 00:55:05.366280 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2025.1', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-01-06 00:55:05.366293 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.366300 | orchestrator | 2026-01-06 00:55:05.366312 | orchestrator | TASK [haproxy-config : Configuring firewall for skyline] *********************** 2026-01-06 00:55:05.366324 | orchestrator | Tuesday 06 January 2026 00:54:04 +0000 (0:00:01.718) 0:05:55.443 ******* 2026-01-06 00:55:05.366336 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})  2026-01-06 00:55:05.366348 | 
orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})  2026-01-06 00:55:05.366361 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})  2026-01-06 00:55:05.366374 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-01-06 00:55:05.366386 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})  2026-01-06 00:55:05.366394 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-01-06 00:55:05.366402 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-01-06 00:55:05.366409 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.366416 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-01-06 00:55:05.366423 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.366430 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})  2026-01-06 00:55:05.366441 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})  2026-01-06 00:55:05.366448 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-01-06 00:55:05.366459 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-01-06 00:55:05.366472 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.366479 | orchestrator | 2026-01-06 00:55:05.366486 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL users config] ************ 2026-01-06 00:55:05.366493 | orchestrator | Tuesday 06 January 2026 00:54:05 +0000 (0:00:01.020) 0:05:56.464 ******* 2026-01-06 00:55:05.366500 | orchestrator | changed: [testbed-node-0] 2026-01-06 
00:55:05.366507 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.366513 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.366520 | orchestrator | 2026-01-06 00:55:05.366527 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL rules config] ************ 2026-01-06 00:55:05.366534 | orchestrator | Tuesday 06 January 2026 00:54:07 +0000 (0:00:01.293) 0:05:57.757 ******* 2026-01-06 00:55:05.366540 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.366547 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.366554 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.366561 | orchestrator | 2026-01-06 00:55:05.366567 | orchestrator | TASK [include_role : tacker] *************************************************** 2026-01-06 00:55:05.366574 | orchestrator | Tuesday 06 January 2026 00:54:09 +0000 (0:00:02.243) 0:06:00.001 ******* 2026-01-06 00:55:05.366581 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.366588 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.366595 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.366602 | orchestrator | 2026-01-06 00:55:05.366609 | orchestrator | TASK [include_role : trove] **************************************************** 2026-01-06 00:55:05.366615 | orchestrator | Tuesday 06 January 2026 00:54:09 +0000 (0:00:00.344) 0:06:00.346 ******* 2026-01-06 00:55:05.366622 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.366629 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.366636 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.366643 | orchestrator | 2026-01-06 00:55:05.366649 | orchestrator | TASK [include_role : venus] **************************************************** 2026-01-06 00:55:05.366657 | orchestrator | Tuesday 06 January 2026 00:54:10 +0000 (0:00:00.726) 0:06:01.072 ******* 2026-01-06 00:55:05.366663 | orchestrator | skipping: [testbed-node-0] 2026-01-06 
00:55:05.366670 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.366723 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.366732 | orchestrator | 2026-01-06 00:55:05.366739 | orchestrator | TASK [include_role : watcher] ************************************************** 2026-01-06 00:55:05.366746 | orchestrator | Tuesday 06 January 2026 00:54:10 +0000 (0:00:00.338) 0:06:01.411 ******* 2026-01-06 00:55:05.366753 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.366760 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.366767 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.366774 | orchestrator | 2026-01-06 00:55:05.366780 | orchestrator | TASK [include_role : zun] ****************************************************** 2026-01-06 00:55:05.366789 | orchestrator | Tuesday 06 January 2026 00:54:11 +0000 (0:00:00.323) 0:06:01.734 ******* 2026-01-06 00:55:05.366795 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.366802 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.366809 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.366816 | orchestrator | 2026-01-06 00:55:05.366823 | orchestrator | TASK [include_role : loadbalancer] ********************************************* 2026-01-06 00:55:05.366830 | orchestrator | Tuesday 06 January 2026 00:54:11 +0000 (0:00:00.332) 0:06:02.067 ******* 2026-01-06 00:55:05.366837 | orchestrator | included: loadbalancer for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:55:05.366844 | orchestrator | 2026-01-06 00:55:05.366853 | orchestrator | TASK [service-check-containers : loadbalancer | Check containers] ************** 2026-01-06 00:55:05.366861 | orchestrator | Tuesday 06 January 2026 00:54:13 +0000 (0:00:01.949) 0:06:04.017 ******* 2026-01-06 00:55:05.366870 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 
'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.366893 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.366903 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-01-06 00:55:05.366912 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 
'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.366921 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.366930 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-01-06 00:55:05.366938 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 
'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.366979 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.366993 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-01-06 00:55:05.367002 | orchestrator | 2026-01-06 00:55:05.367010 | orchestrator | TASK [service-check-containers : loadbalancer | Notify handlers to restart containers] *** 2026-01-06 00:55:05.367019 | orchestrator | Tuesday 06 January 2026 00:54:15 +0000 (0:00:02.454) 0:06:06.471 ******* 2026-01-06 00:55:05.367027 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 00:55:05.367035 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:55:05.367047 | orchestrator | } 2026-01-06 00:55:05.367056 | orchestrator | changed: 
[testbed-node-1] => { 2026-01-06 00:55:05.367065 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:55:05.367073 | orchestrator | } 2026-01-06 00:55:05.367081 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 00:55:05.367089 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:55:05.367097 | orchestrator | } 2026-01-06 00:55:05.367106 | orchestrator | 2026-01-06 00:55:05.367114 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-01-06 00:55:05.367125 | orchestrator | Tuesday 06 January 2026 00:54:16 +0000 (0:00:00.834) 0:06:07.306 ******* 2026-01-06 00:55:05.367139 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.367157 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.367178 | orchestrator | 
skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.367203 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.367218 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.367231 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.367252 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.367266 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.367288 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2026-01-06 00:55:05.367303 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2025.1', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-01-06 00:55:05.367317 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 
'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2025.1', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-01-06 00:55:05.367331 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.367358 | orchestrator | 2026-01-06 00:55:05.367370 | orchestrator | RUNNING HANDLER [loadbalancer : Check IP addresses on the API interface] ******* 2026-01-06 00:55:05.367383 | orchestrator | Tuesday 06 January 2026 00:54:18 +0000 (0:00:01.887) 0:06:09.193 ******* 2026-01-06 00:55:05.367396 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.367410 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.367423 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.367436 | orchestrator | 2026-01-06 00:55:05.367449 | orchestrator | RUNNING HANDLER [loadbalancer : Group HA nodes by status] ********************** 2026-01-06 00:55:05.367462 | orchestrator | Tuesday 06 January 2026 00:54:19 +0000 (0:00:00.742) 0:06:09.936 ******* 2026-01-06 00:55:05.367476 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.367490 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.367503 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.367517 | orchestrator | 2026-01-06 00:55:05.367530 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup keepalived container] ************** 2026-01-06 00:55:05.367543 | orchestrator | Tuesday 06 January 2026 00:54:19 +0000 (0:00:00.352) 0:06:10.289 ******* 2026-01-06 00:55:05.367556 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.367571 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.367584 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.367598 | orchestrator | 2026-01-06 00:55:05.367610 | 
orchestrator | RUNNING HANDLER [loadbalancer : Stop backup haproxy container] ***************** 2026-01-06 00:55:05.367622 | orchestrator | Tuesday 06 January 2026 00:54:20 +0000 (0:00:00.875) 0:06:11.165 ******* 2026-01-06 00:55:05.367635 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.367648 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.367661 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.367673 | orchestrator | 2026-01-06 00:55:05.367708 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup proxysql container] **************** 2026-01-06 00:55:05.367723 | orchestrator | Tuesday 06 January 2026 00:54:21 +0000 (0:00:00.976) 0:06:12.141 ******* 2026-01-06 00:55:05.367737 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.367749 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.367763 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.367774 | orchestrator | 2026-01-06 00:55:05.367787 | orchestrator | RUNNING HANDLER [loadbalancer : Start backup haproxy container] **************** 2026-01-06 00:55:05.367801 | orchestrator | Tuesday 06 January 2026 00:54:22 +0000 (0:00:01.350) 0:06:13.492 ******* 2026-01-06 00:55:05.367814 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.367828 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.367842 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.367855 | orchestrator | 2026-01-06 00:55:05.367869 | orchestrator | RUNNING HANDLER [loadbalancer : Wait for backup haproxy to start] ************** 2026-01-06 00:55:05.367883 | orchestrator | Tuesday 06 January 2026 00:54:28 +0000 (0:00:05.198) 0:06:18.691 ******* 2026-01-06 00:55:05.367896 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.367911 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.367924 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.367937 | orchestrator | 2026-01-06 00:55:05.367964 | orchestrator | RUNNING HANDLER [loadbalancer : Start backup 
proxysql container] *************** 2026-01-06 00:55:05.367978 | orchestrator | Tuesday 06 January 2026 00:54:31 +0000 (0:00:03.785) 0:06:22.476 ******* 2026-01-06 00:55:05.367991 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.368004 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.368017 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.368029 | orchestrator | 2026-01-06 00:55:05.368043 | orchestrator | RUNNING HANDLER [loadbalancer : Wait for backup proxysql to start] ************* 2026-01-06 00:55:05.368057 | orchestrator | Tuesday 06 January 2026 00:54:41 +0000 (0:00:09.904) 0:06:32.380 ******* 2026-01-06 00:55:05.368070 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.368084 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.368098 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.368111 | orchestrator | 2026-01-06 00:55:05.368136 | orchestrator | RUNNING HANDLER [loadbalancer : Start backup keepalived container] ************* 2026-01-06 00:55:05.368165 | orchestrator | Tuesday 06 January 2026 00:54:46 +0000 (0:00:05.178) 0:06:37.559 ******* 2026-01-06 00:55:05.368180 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:55:05.368194 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:55:05.368206 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:55:05.368218 | orchestrator | 2026-01-06 00:55:05.368232 | orchestrator | RUNNING HANDLER [loadbalancer : Stop master haproxy container] ***************** 2026-01-06 00:55:05.368244 | orchestrator | Tuesday 06 January 2026 00:54:57 +0000 (0:00:10.120) 0:06:47.679 ******* 2026-01-06 00:55:05.368256 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.368269 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.368281 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.368295 | orchestrator | 2026-01-06 00:55:05.368306 | orchestrator | RUNNING HANDLER [loadbalancer : Stop master proxysql container] 
**************** 2026-01-06 00:55:05.368319 | orchestrator | Tuesday 06 January 2026 00:54:57 +0000 (0:00:00.388) 0:06:48.068 ******* 2026-01-06 00:55:05.368333 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.368346 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.368359 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.368370 | orchestrator | 2026-01-06 00:55:05.368382 | orchestrator | RUNNING HANDLER [loadbalancer : Stop master keepalived container] ************** 2026-01-06 00:55:05.368394 | orchestrator | Tuesday 06 January 2026 00:54:57 +0000 (0:00:00.445) 0:06:48.514 ******* 2026-01-06 00:55:05.368408 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.368421 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.368435 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.368449 | orchestrator | 2026-01-06 00:55:05.368462 | orchestrator | RUNNING HANDLER [loadbalancer : Start master haproxy container] **************** 2026-01-06 00:55:05.368474 | orchestrator | Tuesday 06 January 2026 00:54:58 +0000 (0:00:00.874) 0:06:49.388 ******* 2026-01-06 00:55:05.368486 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.368498 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.368511 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.368524 | orchestrator | 2026-01-06 00:55:05.368536 | orchestrator | RUNNING HANDLER [loadbalancer : Start master proxysql container] *************** 2026-01-06 00:55:05.368548 | orchestrator | Tuesday 06 January 2026 00:54:59 +0000 (0:00:00.389) 0:06:49.778 ******* 2026-01-06 00:55:05.368561 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.368573 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.368586 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.368598 | orchestrator | 2026-01-06 00:55:05.368611 | orchestrator | RUNNING HANDLER [loadbalancer : Start master keepalived container] 
************* 2026-01-06 00:55:05.368623 | orchestrator | Tuesday 06 January 2026 00:54:59 +0000 (0:00:00.389) 0:06:50.167 ******* 2026-01-06 00:55:05.368635 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:55:05.368647 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:55:05.368660 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:55:05.368672 | orchestrator | 2026-01-06 00:55:05.368717 | orchestrator | RUNNING HANDLER [loadbalancer : Wait for haproxy to listen on VIP] ************* 2026-01-06 00:55:05.368731 | orchestrator | Tuesday 06 January 2026 00:54:59 +0000 (0:00:00.385) 0:06:50.553 ******* 2026-01-06 00:55:05.368743 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.368757 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.368770 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.368782 | orchestrator | 2026-01-06 00:55:05.368795 | orchestrator | RUNNING HANDLER [loadbalancer : Wait for proxysql to listen on VIP] ************ 2026-01-06 00:55:05.368807 | orchestrator | Tuesday 06 January 2026 00:55:01 +0000 (0:00:01.462) 0:06:52.016 ******* 2026-01-06 00:55:05.368820 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:55:05.368833 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:55:05.368846 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:55:05.368859 | orchestrator | 2026-01-06 00:55:05.368871 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:55:05.368886 | orchestrator | testbed-node-0 : ok=127  changed=79  unreachable=0 failed=0 skipped=94  rescued=0 ignored=0 2026-01-06 00:55:05.368917 | orchestrator | testbed-node-1 : ok=126  changed=79  unreachable=0 failed=0 skipped=94  rescued=0 ignored=0 2026-01-06 00:55:05.368931 | orchestrator | testbed-node-2 : ok=126  changed=79  unreachable=0 failed=0 skipped=94  rescued=0 ignored=0 2026-01-06 00:55:05.368944 | orchestrator | 2026-01-06 00:55:05.368956 | orchestrator | 2026-01-06 00:55:05.368969 | 
orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:55:05.368982 | orchestrator | Tuesday 06 January 2026 00:55:02 +0000 (0:00:01.027) 0:06:53.043 ******* 2026-01-06 00:55:05.368997 | orchestrator | =============================================================================== 2026-01-06 00:55:05.369010 | orchestrator | loadbalancer : Start backup keepalived container ----------------------- 10.12s 2026-01-06 00:55:05.369023 | orchestrator | loadbalancer : Start backup proxysql container -------------------------- 9.90s 2026-01-06 00:55:05.369036 | orchestrator | haproxy-config : Copying over skyline haproxy config -------------------- 6.76s 2026-01-06 00:55:05.369050 | orchestrator | haproxy-config : Copying over horizon haproxy config -------------------- 6.05s 2026-01-06 00:55:05.369081 | orchestrator | haproxy-config : Copying over aodh haproxy config ----------------------- 5.82s 2026-01-06 00:55:05.369094 | orchestrator | haproxy-config : Copying over nova haproxy config ----------------------- 5.76s 2026-01-06 00:55:05.369107 | orchestrator | haproxy-config : Copying over opensearch haproxy config ----------------- 5.47s 2026-01-06 00:55:05.369119 | orchestrator | haproxy-config : Copying over barbican haproxy config ------------------- 5.43s 2026-01-06 00:55:05.369131 | orchestrator | haproxy-config : Copying over manila haproxy config --------------------- 5.22s 2026-01-06 00:55:05.369145 | orchestrator | loadbalancer : Start backup haproxy container --------------------------- 5.20s 2026-01-06 00:55:05.369170 | orchestrator | loadbalancer : Wait for backup proxysql to start ------------------------ 5.18s 2026-01-06 00:55:05.369184 | orchestrator | loadbalancer : Copying over proxysql config ----------------------------- 4.99s 2026-01-06 00:55:05.369197 | orchestrator | haproxy-config : Copying over ceph-rgw haproxy config ------------------- 4.98s 2026-01-06 00:55:05.369211 | orchestrator | 
haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config --- 4.91s 2026-01-06 00:55:05.369223 | orchestrator | haproxy-config : Copying over neutron haproxy config -------------------- 4.74s 2026-01-06 00:55:05.369236 | orchestrator | haproxy-config : Copying over glance haproxy config --------------------- 4.70s 2026-01-06 00:55:05.369249 | orchestrator | haproxy-config : Copying over prometheus haproxy config ----------------- 4.40s 2026-01-06 00:55:05.369262 | orchestrator | haproxy-config : Copying over keystone haproxy config ------------------- 4.27s 2026-01-06 00:55:05.369276 | orchestrator | haproxy-config : Copying over grafana haproxy config -------------------- 4.27s 2026-01-06 00:55:05.369290 | orchestrator | haproxy-config : Copying over placement haproxy config ------------------ 4.10s 2026-01-06 00:55:05.369303 | orchestrator | 2026-01-06 00:55:05 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:55:05.369315 | orchestrator | 2026-01-06 00:55:05 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:55:05.369326 | orchestrator | 2026-01-06 00:55:05 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:55:05.369339 | orchestrator | 2026-01-06 00:55:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:55:08.410858 | orchestrator | 2026-01-06 00:55:08 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:55:08.412177 | orchestrator | 2026-01-06 00:55:08 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:55:08.414348 | orchestrator | 2026-01-06 00:55:08 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:55:08.415335 | orchestrator | 2026-01-06 00:55:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:55:11.457726 | orchestrator | 2026-01-06 00:55:11 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state 
STARTED 2026-01-06 00:55:11.458124 | orchestrator | 2026-01-06 00:55:11 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:55:11.459090 | orchestrator | 2026-01-06 00:55:11 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:55:11.459133 | orchestrator | 2026-01-06 00:55:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:55:57.323139 | orchestrator | 2026-01-06 00:55:57 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:55:57.324946 | orchestrator | 2026-01-06 00:55:57 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:55:57.327219 | orchestrator | 
2026-01-06 00:55:57 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:55:57.327295 | orchestrator | 2026-01-06 00:55:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:00.375962 | orchestrator | 2026-01-06 00:56:00 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:00.376679 | orchestrator | 2026-01-06 00:56:00 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:00.377835 | orchestrator | 2026-01-06 00:56:00 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:00.377853 | orchestrator | 2026-01-06 00:56:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:03.412892 | orchestrator | 2026-01-06 00:56:03 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:03.415184 | orchestrator | 2026-01-06 00:56:03 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:03.417352 | orchestrator | 2026-01-06 00:56:03 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:03.417404 | orchestrator | 2026-01-06 00:56:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:06.464153 | orchestrator | 2026-01-06 00:56:06 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:06.466752 | orchestrator | 2026-01-06 00:56:06 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:06.468736 | orchestrator | 2026-01-06 00:56:06 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:06.468884 | orchestrator | 2026-01-06 00:56:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:09.516392 | orchestrator | 2026-01-06 00:56:09 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:09.518378 | orchestrator | 2026-01-06 00:56:09 | INFO  | Task 
62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:09.520976 | orchestrator | 2026-01-06 00:56:09 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:09.521463 | orchestrator | 2026-01-06 00:56:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:12.576657 | orchestrator | 2026-01-06 00:56:12 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:12.578831 | orchestrator | 2026-01-06 00:56:12 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:12.581765 | orchestrator | 2026-01-06 00:56:12 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:12.581936 | orchestrator | 2026-01-06 00:56:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:15.635158 | orchestrator | 2026-01-06 00:56:15 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:15.635547 | orchestrator | 2026-01-06 00:56:15 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:15.636571 | orchestrator | 2026-01-06 00:56:15 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:15.636666 | orchestrator | 2026-01-06 00:56:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:18.689424 | orchestrator | 2026-01-06 00:56:18 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:18.693688 | orchestrator | 2026-01-06 00:56:18 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:18.697212 | orchestrator | 2026-01-06 00:56:18 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:18.697291 | orchestrator | 2026-01-06 00:56:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:21.750687 | orchestrator | 2026-01-06 00:56:21 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state 
STARTED 2026-01-06 00:56:21.752139 | orchestrator | 2026-01-06 00:56:21 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:21.753928 | orchestrator | 2026-01-06 00:56:21 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:21.753983 | orchestrator | 2026-01-06 00:56:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:24.802520 | orchestrator | 2026-01-06 00:56:24 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:24.804353 | orchestrator | 2026-01-06 00:56:24 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:24.806301 | orchestrator | 2026-01-06 00:56:24 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:24.806356 | orchestrator | 2026-01-06 00:56:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:27.851653 | orchestrator | 2026-01-06 00:56:27 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:27.853097 | orchestrator | 2026-01-06 00:56:27 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:27.855097 | orchestrator | 2026-01-06 00:56:27 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:27.855700 | orchestrator | 2026-01-06 00:56:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:30.914791 | orchestrator | 2026-01-06 00:56:30 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:30.916960 | orchestrator | 2026-01-06 00:56:30 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:30.919466 | orchestrator | 2026-01-06 00:56:30 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:30.919623 | orchestrator | 2026-01-06 00:56:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:33.966360 | orchestrator | 
2026-01-06 00:56:33 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:33.966468 | orchestrator | 2026-01-06 00:56:33 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:33.969986 | orchestrator | 2026-01-06 00:56:33 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:33.970108 | orchestrator | 2026-01-06 00:56:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:37.012626 | orchestrator | 2026-01-06 00:56:37 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:37.013853 | orchestrator | 2026-01-06 00:56:37 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:37.014661 | orchestrator | 2026-01-06 00:56:37 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:37.015370 | orchestrator | 2026-01-06 00:56:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:40.060077 | orchestrator | 2026-01-06 00:56:40 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:40.060993 | orchestrator | 2026-01-06 00:56:40 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:40.061407 | orchestrator | 2026-01-06 00:56:40 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:40.061431 | orchestrator | 2026-01-06 00:56:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:43.111238 | orchestrator | 2026-01-06 00:56:43 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:43.112507 | orchestrator | 2026-01-06 00:56:43 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:43.115218 | orchestrator | 2026-01-06 00:56:43 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:43.115292 | orchestrator | 2026-01-06 00:56:43 | INFO  | 
Wait 1 second(s) until the next check 2026-01-06 00:56:46.153724 | orchestrator | 2026-01-06 00:56:46 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:46.158678 | orchestrator | 2026-01-06 00:56:46 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:46.162729 | orchestrator | 2026-01-06 00:56:46 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:46.162827 | orchestrator | 2026-01-06 00:56:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:49.220732 | orchestrator | 2026-01-06 00:56:49 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:49.222608 | orchestrator | 2026-01-06 00:56:49 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:49.224689 | orchestrator | 2026-01-06 00:56:49 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:49.224757 | orchestrator | 2026-01-06 00:56:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:52.285931 | orchestrator | 2026-01-06 00:56:52 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:52.288933 | orchestrator | 2026-01-06 00:56:52 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:52.291508 | orchestrator | 2026-01-06 00:56:52 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED 2026-01-06 00:56:52.291619 | orchestrator | 2026-01-06 00:56:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:56:55.334097 | orchestrator | 2026-01-06 00:56:55 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state STARTED 2026-01-06 00:56:55.336201 | orchestrator | 2026-01-06 00:56:55 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:56:55.337180 | orchestrator | 2026-01-06 00:56:55 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state 
STARTED
2026-01-06 00:56:55.337223 | orchestrator | 2026-01-06 00:56:55 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:56:58.393941 | orchestrator | 2026-01-06 00:56:58 | INFO  | Task 96ce40de-3365-41dd-b8e5-994d64bfaffc is in state SUCCESS
2026-01-06 00:56:58.397018 | orchestrator | [WARNING]: Collection community.general does not support Ansible version 2.16.14

PLAY [Prepare deployment of Ceph services] *************************************

TASK [ceph-facts : Include facts.yml] ******************************************
Tuesday 06 January 2026 00:45:18 +0000 (0:00:00.895) 0:00:00.895 *******
included: /ansible/roles/ceph-facts/tasks/facts.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2

TASK [ceph-facts : Check if it is atomic host] *********************************
Tuesday 06 January 2026 00:45:19 +0000 (0:00:01.370) 0:00:02.266 *******
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Set_fact is_atomic] *****************************************
Tuesday 06 January 2026 00:45:21 +0000 (0:00:01.908) 0:00:04.174 *******
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Check if podman binary is present] **************************
Tuesday 06 January 2026 00:45:22 +0000 (0:00:00.777) 0:00:04.951 *******
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Set_fact container_binary] **********************************
Tuesday 06 January 2026 00:45:23 +0000 (0:00:00.863) 0:00:05.814 *******
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Set_fact ceph_cmd] ******************************************
Tuesday 06 January 2026 00:45:23 +0000 (0:00:00.756) 0:00:06.571 *******
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Set_fact discovered_interpreter_python] *********************
Tuesday 06 January 2026 00:45:24 +0000 (0:00:00.606) 0:00:07.178 *******
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Set_fact discovered_interpreter_python if not previously set] ***
Tuesday 06 January 2026 00:45:25 +0000 (0:00:00.855) 0:00:08.033 *******
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Set_fact ceph_release ceph_stable_release] ******************
Tuesday 06 January 2026 00:45:26 +0000 (0:00:01.096) 0:00:09.130 *******
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Set_fact monitor_name ansible_facts['hostname']] ************
Tuesday 06 January 2026 00:45:27 +0000 (0:00:00.824) 0:00:10.421 *******
ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)

TASK [ceph-facts : Set_fact container_exec_cmd] ********************************
Tuesday 06 January 2026 00:45:28 +0000 (0:00:01.884) 0:00:11.245 *******
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Find a running mon container] *******************************
Tuesday 06 January 2026 00:45:30 +0000 (0:00:03.232) 0:00:13.130 *******
ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)

TASK [ceph-facts : Check for a ceph mon socket] ********************************
Tuesday 06 January 2026 00:45:33 +0000 (0:00:00.690) 0:00:16.363 *******
skipping: [testbed-node-3] => (item=testbed-node-0)
skipping: [testbed-node-3] => (item=testbed-node-1)
skipping: [testbed-node-3] => (item=testbed-node-2)
skipping: [testbed-node-3]

TASK [ceph-facts : Check if the ceph mon socket is in-use] *********************
Tuesday 06 January 2026 00:45:34 +0000 (0:00:01.249) 0:00:17.053 *******
skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
skipping: [testbed-node-3]

TASK [ceph-facts : Set_fact running_mon - non_container] ***********************
Tuesday 06 January 2026 00:45:35 +0000 (0:00:00.508) 0:00:18.303 *******
skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
skipping: [testbed-node-3]

TASK [ceph-facts : Set_fact running_mon - container] ***************************
Tuesday 06 January 2026 00:45:36 +0000 (0:00:00.508) 0:00:18.811 *******
skipping: [testbed-node-3] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-0'], 'start': '2026-01-06 00:45:31.083934', 'end': '2026-01-06 00:45:31.390050', 'delta': '0:00:00.306116', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-0', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
skipping: [testbed-node-3] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-1'], 'start': '2026-01-06 00:45:32.187996', 'end': '2026-01-06 00:45:32.524397', 'delta': '0:00:00.336401', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-1', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
skipping: [testbed-node-3] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-2'], 'start': '2026-01-06 00:45:33.032134', 'end': '2026-01-06 00:45:33.351855', 'delta': '0:00:00.319721', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-2', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
skipping: [testbed-node-3]

TASK [ceph-facts : Set_fact _container_exec_cmd] *******************************
Tuesday 06 January 2026 00:45:36 +0000 (0:00:00.270) 0:00:19.081 *******
ok: [testbed-node-3]
ok: [testbed-node-5]
ok: [testbed-node-4]
ok: [testbed-node-0]
ok: [testbed-node-2]
ok: [testbed-node-1]

TASK [ceph-facts : Get current fsid if cluster is already running] *************
Tuesday 06 January 2026 00:45:38 +0000 (0:00:02.343) 0:00:21.425 *******
ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)]

TASK [ceph-facts : Set_fact current_fsid rc 1] *********************************
Tuesday 06 January 2026 00:45:39 +0000 (0:00:01.057) 0:00:22.482 *******
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Get current fsid] *******************************************
Tuesday 06 January 2026 00:45:40 +0000 (0:00:01.199) 0:00:23.682 *******
skipping: [testbed-node-4]
skipping: [testbed-node-3]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Set_fact fsid] **********************************************
Tuesday 06 January 2026 00:45:42 +0000 (0:00:01.981) 0:00:25.663 *******
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Set_fact fsid from current_fsid] ****************************
Tuesday 06 January 2026 00:45:44 +0000 (0:00:01.353) 0:00:27.016 *******
skipping: [testbed-node-3]

TASK [ceph-facts : Generate cluster fsid] **************************************
Tuesday 06 January 2026 00:45:44 +0000 (0:00:00.174) 0:00:27.191 *******
skipping: [testbed-node-3]

TASK [ceph-facts : Set_fact fsid] **********************************************
Tuesday 06 January 2026 00:45:44 +0000 (0:00:00.251) 0:00:27.443 *******
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Resolve device link(s)] *************************************
Tuesday 06 January 2026 00:45:45 +0000 (0:00:00.674) 0:00:28.117 *******
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Set_fact build devices from resolved symlinks] **************
Tuesday 06 January 2026 00:45:46 +0000 (0:00:01.320) 0:00:29.438 *******
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Resolve dedicated_device link(s)] ***************************
Tuesday 06 January 2026 00:45:47 +0000 (0:00:01.018) 0:00:30.456 *******
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Set_fact build dedicated_devices from resolved symlinks] ****
Tuesday 06 January 2026 00:45:48 +0000 (0:00:00.864) 0:00:31.321 *******
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Resolve bluestore_wal_device link(s)] ***********************
Tuesday 06 January 2026 00:45:49 +0000 (0:00:00.611) 0:00:31.932 *******
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Set_fact build bluestore_wal_devices from resolved symlinks] ***
Tuesday 06 January 2026 00:45:50 +0000 (0:00:01.008) 0:00:32.940 *******
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Collect existed devices] ************************************
Tuesday 06 January 2026 00:45:50 +0000
(0:00:00.583) 0:00:33.523 ******* 2026-01-06 00:56:58.401133 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382-osd--block--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382', 'dm-uuid-LVM-leNR8e0LegQCMdL6ucMKdN07fh5N5SuCAUHpjmmFqkkv8cgfcG4OQCk1bATKEOxo'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401141 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--1f440738--8941--5354--ae19--38cd939f8e8b-osd--block--1f440738--8941--5354--ae19--38cd939f8e8b', 'dm-uuid-LVM-hKAA9ELaJ4PXB3FsxE7aWN0Ca65H3DNcDeRaQ8myegtafvn7obDSCCodWGTEd481'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401197 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401208 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 
'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401236 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401243 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401258 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401265 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': 
'0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401274 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401282 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401331 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part1', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part14', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part15', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part16', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401345 | 
orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--64d6825f--3ec1--5927--8c89--e441ee427e8a-osd--block--64d6825f--3ec1--5927--8c89--e441ee427e8a', 'dm-uuid-LVM-LDweexZgnixRPyaZEXyjMea8qEKMICtA7IzHB9qtV3AIAvWVWiM14y0g6id7UZYZ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401351 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'holders': ['ceph--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382-osd--block--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-oXsVdg-yMut-PRiK-dfGm-Pr3Q-1gzN-GvYndT', 'scsi-0QEMU_QEMU_HARDDISK_dc9d4d24-a01d-4baf-85b5-da8c88609604', 'scsi-SQEMU_QEMU_HARDDISK_dc9d4d24-a01d-4baf-85b5-da8c88609604'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401360 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'holders': ['ceph--1f440738--8941--5354--ae19--38cd939f8e8b-osd--block--1f440738--8941--5354--ae19--38cd939f8e8b'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-3zJeUG-r1bU-MUbW-4daS-IHQE-DfNT-ElqHh1', 'scsi-0QEMU_QEMU_HARDDISK_3d039a44-dced-4ba6-a79b-af7290a238ac', 'scsi-SQEMU_QEMU_HARDDISK_3d039a44-dced-4ba6-a79b-af7290a238ac'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401365 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e675238b--4f6c--5157--bfd7--95a1b3a689b7-osd--block--e675238b--4f6c--5157--bfd7--95a1b3a689b7', 'dm-uuid-LVM-Ke0ebcxjjDzRywv3R5obBtBuMmzv68aYQAzg56kueDNDYW1ZSJhWGfYNPDa8J2Ge'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401405 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d326b17f-2106-48eb-aaa2-fe8346fab088', 'scsi-SQEMU_QEMU_HARDDISK_d326b17f-2106-48eb-aaa2-fe8346fab088'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401413 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401422 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401428 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-03-02-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}) 
 2026-01-06 00:56:58.401432 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401437 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401445 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401450 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401455 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 
'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401504 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401514 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--0ba15c51--2e8d--5c95--884b--d45401cb60d9-osd--block--0ba15c51--2e8d--5c95--884b--d45401cb60d9', 'dm-uuid-LVM-lFNjrI9z6jGvFHezfUtduDKx9CNSXgEPFaHR8oR5ZfJBhMXXuDDrOG9EnSv6tdIs'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401556 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part1', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part14', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part15', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part16', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401566 | 
orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--588df21e--a0c0--57e7--8c43--2f77be274309-osd--block--588df21e--a0c0--57e7--8c43--2f77be274309', 'dm-uuid-LVM-WBEZ6WMsGhewarWIW3qNudyEuUl9274MP5F99LYKaEU18gOKabMHCbX9lpi9DDDw'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401622 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'holders': ['ceph--64d6825f--3ec1--5927--8c89--e441ee427e8a-osd--block--64d6825f--3ec1--5927--8c89--e441ee427e8a'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-5K7mse-fAuc-dbI5-SiaB-plhi-xXDs-vEQzBN', 'scsi-0QEMU_QEMU_HARDDISK_724a4878-ca4e-4a20-84cd-e8427809d585', 'scsi-SQEMU_QEMU_HARDDISK_724a4878-ca4e-4a20-84cd-e8427809d585'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401632 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401645 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop1', 'value': 
{'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401653 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401659 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401689 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'holders': ['ceph--e675238b--4f6c--5157--bfd7--95a1b3a689b7-osd--block--e675238b--4f6c--5157--bfd7--95a1b3a689b7'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-9U19bv-EwEy-Ks4f-MiiE-9ta0-FWks-EZUgCO', 'scsi-0QEMU_QEMU_HARDDISK_8cc5ffc1-09fb-4fde-a97f-bcebb46dacb6', 'scsi-SQEMU_QEMU_HARDDISK_8cc5ffc1-09fb-4fde-a97f-bcebb46dacb6'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401698 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.401707 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401715 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ea69e1b5-a504-41c3-bb3a-5961a07ea8a6', 'scsi-SQEMU_QEMU_HARDDISK_ea69e1b5-a504-41c3-bb3a-5961a07ea8a6'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401770 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401778 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-02-56-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401789 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}) 
 2026-01-06 00:56:58.401794 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401803 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part1', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part14', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part15', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': 
['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part16', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401843 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'holders': ['ceph--0ba15c51--2e8d--5c95--884b--d45401cb60d9-osd--block--0ba15c51--2e8d--5c95--884b--d45401cb60d9'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zBUffM-PitN-uGRi-WCUM-hCv5-dceE-VL6GDm', 'scsi-0QEMU_QEMU_HARDDISK_a9899c49-22e0-485a-be63-69bc9e218eb5', 'scsi-SQEMU_QEMU_HARDDISK_a9899c49-22e0-485a-be63-69bc9e218eb5'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401855 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'holders': ['ceph--588df21e--a0c0--57e7--8c43--2f77be274309-osd--block--588df21e--a0c0--57e7--8c43--2f77be274309'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-Kf4ZcO-WYqz-GeR6-hWC5-gIYh-YPXw-Qrj6Vh', 'scsi-0QEMU_QEMU_HARDDISK_2e071fd2-3317-4a54-af1f-e9b7971267a3', 'scsi-SQEMU_QEMU_HARDDISK_2e071fd2-3317-4a54-af1f-e9b7971267a3'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401860 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59', 'scsi-SQEMU_QEMU_HARDDISK_f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401865 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-03-04-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.401873 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 
'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401878 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401883 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401888 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401972 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': 
'512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401979 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401984 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401989 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.401998 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f', 'scsi-SQEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part1', 'scsi-SQEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part14', 'scsi-SQEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part15', 'scsi-SQEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part16', 'scsi-SQEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.402104 | 
orchestrator | skipping: [testbed-node-0] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-02-54-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.402122 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.402127 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402132 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402137 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 
'virtual': 1}})  2026-01-06 00:56:58.402142 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.402147 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.402152 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402157 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402166 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402171 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 
00:56:58.402176 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402227 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402235 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be', 'scsi-SQEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part1', 'scsi-SQEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part14', 'scsi-SQEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part15', 'scsi-SQEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part16', 'scsi-SQEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.402240 | 
orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402249 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402257 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-03-00-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.402270 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402277 | orchestrator | skipping: 
[testbed-node-1] 2026-01-06 00:56:58.402338 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402346 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402351 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402356 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:56:58.402365 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red 
Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f', 'scsi-SQEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part1', 'scsi-SQEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part14', 'scsi-SQEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part15', 'scsi-SQEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part16', 'scsi-SQEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 
00:56:58.402412 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-02-57-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:56:58.402419 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.402424 | orchestrator | 2026-01-06 00:56:58.402429 | orchestrator | TASK [ceph-facts : Set_fact devices generate device list when osd_auto_discovery] *** 2026-01-06 00:56:58.402434 | orchestrator | Tuesday 06 January 2026 00:45:52 +0000 (0:00:01.366) 0:00:34.889 ******* 2026-01-06 00:56:58.402440 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382-osd--block--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382', 'dm-uuid-LVM-leNR8e0LegQCMdL6ucMKdN07fh5N5SuCAUHpjmmFqkkv8cgfcG4OQCk1bATKEOxo'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:56:58.402446 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | 
default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--1f440738--8941--5354--ae19--38cd939f8e8b-osd--block--1f440738--8941--5354--ae19--38cd939f8e8b', 'dm-uuid-LVM-hKAA9ELaJ4PXB3FsxE7aWN0Ca65H3DNcDeRaQ8myegtafvn7obDSCCodWGTEd481'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:56:58.402451 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:56:58.402460 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:56:58.402469 | orchestrator | skipping: [testbed-node-3] => 
(item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:56:58.402508 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--64d6825f--3ec1--5927--8c89--e441ee427e8a-osd--block--64d6825f--3ec1--5927--8c89--e441ee427e8a', 'dm-uuid-LVM-LDweexZgnixRPyaZEXyjMea8qEKMICtA7IzHB9qtV3AIAvWVWiM14y0g6id7UZYZ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:56:58.402515 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e675238b--4f6c--5157--bfd7--95a1b3a689b7-osd--block--e675238b--4f6c--5157--bfd7--95a1b3a689b7', 'dm-uuid-LVM-Ke0ebcxjjDzRywv3R5obBtBuMmzv68aYQAzg56kueDNDYW1ZSJhWGfYNPDa8J2Ge'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 
'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:56:58.402520 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:56:58.402576 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:56:58.402588 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 
'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.402621 | orchestrator | skipping: [testbed-node-4] => (item=loop1)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402685 | orchestrator | skipping: [testbed-node-4] => (item=loop2)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402698 | orchestrator | skipping: [testbed-node-3] => (item=loop5)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402707 | orchestrator | skipping: [testbed-node-4] => (item=loop3)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402715 | orchestrator | skipping: [testbed-node-3] => (item=loop6)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402723 | orchestrator | skipping: [testbed-node-4] => (item=loop4)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402742 | orchestrator | skipping: [testbed-node-4] => (item=loop5)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402752 | orchestrator | skipping: [testbed-node-3] => (item=loop7)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402816 | orchestrator | skipping: [testbed-node-4] => (item=loop6)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402825 | orchestrator | skipping: [testbed-node-3] => (item=sda, QEMU HARDDISK, 80.00 GB)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402843 | orchestrator | skipping: [testbed-node-4] => (item=loop7)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402883 | orchestrator | skipping: [testbed-node-3] => (item=sdb, QEMU HARDDISK, 20.00 GB, ceph OSD LVM PV)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402891 | orchestrator | skipping: [testbed-node-4] => (item=sda, QEMU HARDDISK, 80.00 GB)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402904 | orchestrator | skipping: [testbed-node-3] => (item=sdc, QEMU HARDDISK, 20.00 GB, ceph OSD LVM PV)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402938 | orchestrator | skipping: [testbed-node-4] => (item=sdb, QEMU HARDDISK, 20.00 GB, ceph OSD LVM PV)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402945 | orchestrator | skipping: [testbed-node-3] => (item=sdd, QEMU HARDDISK, 20.00 GB)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402950 | orchestrator | skipping: [testbed-node-4] => (item=sdc, QEMU HARDDISK, 20.00 GB, ceph OSD LVM PV)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402958 | orchestrator | skipping: [testbed-node-3] => (item=sr0, QEMU DVD-ROM)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.402967 | orchestrator | skipping: [testbed-node-4] => (item=sdd, QEMU HARDDISK, 20.00 GB)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403015 | orchestrator | skipping: [testbed-node-4] => (item=sr0, QEMU DVD-ROM)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403022 | orchestrator | skipping: [testbed-node-5] => (item=dm-0, ceph OSD block LV, 20.00 GB)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403027 | orchestrator | skipping: [testbed-node-5] => (item=dm-1, ceph OSD block LV, 20.00 GB)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403032 | orchestrator | skipping: [testbed-node-5] => (item=loop0)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403040 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.403049 | orchestrator | skipping: [testbed-node-5] => (item=loop1)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403053 | orchestrator | skipping: [testbed-node-5] => (item=loop2)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403091 | orchestrator | skipping: [testbed-node-5] => (item=loop3)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403097 | orchestrator | skipping: [testbed-node-5] => (item=loop4)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403102 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.403107 | orchestrator | skipping: [testbed-node-5] => (item=loop5)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403111 | orchestrator | skipping: [testbed-node-5] => (item=loop6)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403123 | orchestrator | skipping: [testbed-node-5] => (item=loop7)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403128 | orchestrator | skipping: [testbed-node-0] => (item=loop0)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403164 | orchestrator | skipping: [testbed-node-5] => (item=sda, QEMU HARDDISK, 80.00 GB)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403172 | orchestrator | skipping: [testbed-node-0] => (item=loop1)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403184 | orchestrator | skipping: [testbed-node-1] => (item=loop0)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403188 | orchestrator | skipping: [testbed-node-5] => (item=sdb, QEMU HARDDISK, 20.00 GB, ceph OSD LVM PV)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403232 | orchestrator | skipping: [testbed-node-0] => (item=loop2)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403239 | orchestrator | skipping: [testbed-node-1] => (item=loop1)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403243 | orchestrator | skipping: [testbed-node-1] => (item=loop2)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403252 | orchestrator | skipping: [testbed-node-0] => (item=loop3)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403259 | orchestrator | skipping: [testbed-node-1] => (item=loop3)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403264 | orchestrator | skipping: [testbed-node-0] => (item=loop4)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403268 | orchestrator | skipping: [testbed-node-1] => (item=loop4)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403305 | orchestrator | skipping: [testbed-node-0] => (item=loop5)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403312 | orchestrator | skipping: [testbed-node-1] => (item=loop5)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403317 | orchestrator | skipping: [testbed-node-0] => (item=loop6)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403325 | orchestrator | skipping: [testbed-node-5] => (item=sdc, QEMU HARDDISK, 20.00 GB, ceph OSD LVM PV)  [false_condition: osd_auto_discovery | default(False) | bool]
2026-01-06 00:56:58.403335 | orchestrator | skipping: [testbed-node-1] => (item=loop6)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403340 | orchestrator | skipping: [testbed-node-0] => (item=loop7)  [false_condition: inventory_hostname in groups.get(osd_group_name, [])]
2026-01-06 00:56:58.403380 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True,
'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:56:58.403387 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59', 'scsi-SQEMU_QEMU_HARDDISK_f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:56:58.403399 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f', 'scsi-SQEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part1', 'scsi-SQEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part14', 'scsi-SQEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part15', 'scsi-SQEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part16', 'scsi-SQEMU_QEMU_HARDDISK_7779f290-0ef0-4a1b-8fc8-5ce02b31935f-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2026-01-06 00:56:58.403426 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be', 'scsi-SQEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part1', 'scsi-SQEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part14', 'scsi-SQEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part15', 'scsi-SQEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part16', 'scsi-SQEMU_QEMU_HARDDISK_3d7a3417-f72a-4d7a-b186-e6ea45f4b5be-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403436 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-02-54-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403444 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-03-04-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403448 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.403480 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-03-00-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403489 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.403496 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.403503 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403511 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403520 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403544 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403551 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403558 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403622 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403661 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403676 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f', 'scsi-SQEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part1', 'scsi-SQEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part14', 'scsi-SQEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part15', 'scsi-SQEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part16', 'scsi-SQEMU_QEMU_HARDDISK_9143344a-f7a0-4978-a962-686e689e6a1f-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403681 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-02-57-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:56:58.403685 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.403690 | orchestrator |
2026-01-06 00:56:58.403728 | orchestrator | TASK [ceph-facts : Check if the ceph conf exists] ******************************
2026-01-06 00:56:58.403734 | orchestrator | Tuesday 06 January 2026 00:45:53 +0000 (0:00:01.455) 0:00:36.345 *******
2026-01-06 00:56:58.403738 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.403743 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.403747 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.403751 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.403755 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:56:58.403791 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:56:58.403795 | orchestrator |
2026-01-06 00:56:58.403799 | orchestrator | TASK [ceph-facts : Set default osd_pool_default_crush_rule fact] ***************
2026-01-06 00:56:58.403808 | orchestrator | Tuesday 06 January 2026 00:45:55 +0000 (0:00:01.511) 0:00:37.856 *******
2026-01-06 00:56:58.403812 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.403816 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.403821 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.403825 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.403829 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:56:58.403833 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:56:58.403837 | orchestrator |
2026-01-06 00:56:58.403841 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2026-01-06 00:56:58.403845 | orchestrator | Tuesday 06 January 2026 00:45:56 +0000 (0:00:01.511) 0:00:39.560 *******
2026-01-06 00:56:58.403849 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.403860 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.403864 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.403868 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.403872 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.403876 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.403880 | orchestrator |
2026-01-06 00:56:58.403884 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2026-01-06 00:56:58.403888 | orchestrator | Tuesday 06 January 2026 00:45:57 +0000 (0:00:01.125) 0:00:40.686 *******
2026-01-06 00:56:58.403893 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.403897 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.403901 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.403905 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.403909 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.403913 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.403917 | orchestrator |
2026-01-06 00:56:58.403921 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2026-01-06 00:56:58.403925 | orchestrator | Tuesday 06 January 2026 00:45:59 +0000 (0:00:01.160) 0:00:41.847 *******
2026-01-06 00:56:58.403929 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.403934 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.403938 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.403942 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.403946 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.403950 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.403954 | orchestrator |
2026-01-06 00:56:58.403958 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2026-01-06 00:56:58.403962 | orchestrator | Tuesday 06 January 2026 00:46:00 +0000 (0:00:01.772) 0:00:43.619 *******
2026-01-06 00:56:58.403966 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.403970 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.403975 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.403979 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.403983 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.403987 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.403991 | orchestrator |
2026-01-06 00:56:58.403995 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv4] *************************
2026-01-06 00:56:58.403999 | orchestrator | Tuesday 06 January 2026 00:46:01 +0000 (0:00:00.888) 0:00:44.507 *******
2026-01-06 00:56:58.404003 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-0)
2026-01-06 00:56:58.404007 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-1)
2026-01-06 00:56:58.404011 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-0)
2026-01-06 00:56:58.404015 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0)
2026-01-06 00:56:58.404020 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-0)
2026-01-06 00:56:58.404024 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-1)
2026-01-06 00:56:58.404031 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-1)
2026-01-06 00:56:58.404035 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-2)
2026-01-06 00:56:58.404052 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-2)
2026-01-06 00:56:58.404056 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-1)
2026-01-06 00:56:58.404060 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-0)
2026-01-06 00:56:58.404064 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-2)
2026-01-06 00:56:58.404069 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-1)
2026-01-06 00:56:58.404073 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-2)
2026-01-06 00:56:58.404077 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-2)
2026-01-06 00:56:58.404081 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-0)
2026-01-06 00:56:58.404085 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-1)
2026-01-06 00:56:58.404089 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-2)
2026-01-06 00:56:58.404093 | orchestrator |
2026-01-06 00:56:58.404097 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv6] *************************
2026-01-06 00:56:58.404101 | orchestrator | Tuesday 06 January 2026 00:46:06 +0000 (0:00:04.505) 0:00:49.013 *******
2026-01-06 00:56:58.404106 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2026-01-06 00:56:58.404110 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2026-01-06 00:56:58.404114 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2026-01-06 00:56:58.404118 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.404122 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)
2026-01-06 00:56:58.404126 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)
2026-01-06 00:56:58.404130 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)
2026-01-06 00:56:58.404134 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.404138 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)
2026-01-06 00:56:58.404165 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)
2026-01-06 00:56:58.404173 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)
2026-01-06 00:56:58.404179 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.404184 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)
2026-01-06 00:56:58.404190 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)
2026-01-06 00:56:58.404197 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)
2026-01-06 00:56:58.404204 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.404210 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)
2026-01-06 00:56:58.404217 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)
2026-01-06 00:56:58.404223 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)
2026-01-06 00:56:58.404230 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.404236 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)
2026-01-06 00:56:58.404240 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)
2026-01-06 00:56:58.404244 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)
2026-01-06 00:56:58.404248 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.404252 | orchestrator |
2026-01-06 00:56:58.404257 | orchestrator | TASK [ceph-facts : Import_tasks set_radosgw_address.yml] ***********************
2026-01-06 00:56:58.404261 | orchestrator | Tuesday 06 January 2026 00:46:07 +0000 (0:00:00.953) 0:00:49.966 *******
2026-01-06 00:56:58.404265 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.404269 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.404273 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.404277 | orchestrator | included: /ansible/roles/ceph-facts/tasks/set_radosgw_address.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:56:58.404282 | orchestrator |
2026-01-06 00:56:58.404286 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] ***
2026-01-06 00:56:58.404291 | orchestrator | Tuesday 06 January 2026 00:46:08 +0000 (0:00:01.754) 0:00:51.721 *******
2026-01-06 00:56:58.404301 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.404305 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.404309 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.404313 | orchestrator |
2026-01-06 00:56:58.404318 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] ****
2026-01-06 00:56:58.404322 | orchestrator | Tuesday 06 January 2026 00:46:09 +0000 (0:00:00.435) 0:00:52.156 *******
2026-01-06 00:56:58.404326 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.404330 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.404334 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.404338 | orchestrator |
2026-01-06 00:56:58.404342 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] ****
2026-01-06 00:56:58.404347 | orchestrator | Tuesday 06 January 2026 00:46:09 +0000 (0:00:00.428) 0:00:52.584 *******
2026-01-06 00:56:58.404351 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.404355 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.404359 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.404363 | orchestrator |
2026-01-06 00:56:58.404368 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address] ***************
2026-01-06 00:56:58.404373 | orchestrator | Tuesday 06 January 2026 00:46:10 +0000 (0:00:00.881) 0:00:53.465 *******
2026-01-06 00:56:58.404378 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.404383 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.404388 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.404393 | orchestrator |
2026-01-06 00:56:58.404397 | orchestrator | TASK [ceph-facts : Set_fact _interface] ****************************************
2026-01-06 00:56:58.404402 | orchestrator | Tuesday 06 January 2026 00:46:11 +0000 (0:00:00.802) 0:00:54.268 *******
2026-01-06 00:56:58.404407 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-01-06 00:56:58.404412 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-01-06 00:56:58.404420 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-01-06 00:56:58.404425 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.404429 | orchestrator |
2026-01-06 00:56:58.404434 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ******
2026-01-06 00:56:58.404439 | orchestrator | Tuesday 06 January 2026 00:46:11 +0000 (0:00:00.340) 0:00:54.608 *******
2026-01-06 00:56:58.404444 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-01-06 00:56:58.404448 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-01-06 00:56:58.404453 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-01-06 00:56:58.404458 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.404463 | orchestrator |
2026-01-06 00:56:58.404468 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ******
2026-01-06 00:56:58.404473 | orchestrator | Tuesday 06 January 2026 00:46:12 +0000 (0:00:00.392) 0:00:55.000 *******
2026-01-06 00:56:58.404478 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-01-06 00:56:58.404483 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-01-06 00:56:58.404488 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-01-06 00:56:58.404493 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.404498 | orchestrator |
2026-01-06 00:56:58.404502 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] ***************************
2026-01-06 00:56:58.404507 | orchestrator | Tuesday 06 January 2026 00:46:12 +0000 (0:00:00.496) 0:00:55.497 *******
2026-01-06 00:56:58.404512 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.404517 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.404522 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.404579 | orchestrator |
2026-01-06 00:56:58.404586 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] *************************************
2026-01-06 00:56:58.404592 | orchestrator | Tuesday 06 January 2026 00:46:13 +0000 (0:00:00.636) 0:00:56.133 *******
2026-01-06 00:56:58.404597 | orchestrator | ok: [testbed-node-3] => (item=0)
2026-01-06 00:56:58.404602 | orchestrator | ok: [testbed-node-4] => (item=0)
2026-01-06 00:56:58.404635 | orchestrator | ok: [testbed-node-5] => (item=0)
2026-01-06 00:56:58.404640 | orchestrator |
2026-01-06 00:56:58.404645 | orchestrator | TASK [ceph-facts : Set_fact ceph_run_cmd] **************************************
2026-01-06 00:56:58.404650 | orchestrator | Tuesday 06 January 2026 00:46:14 +0000 (0:00:01.183) 0:00:57.317 *******
2026-01-06 00:56:58.404655 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-01-06 00:56:58.404660 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-01-06 00:56:58.404665 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-01-06 00:56:58.404670 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3)
2026-01-06 00:56:58.404675 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4)
2026-01-06 00:56:58.404680 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5)
2026-01-06 00:56:58.404685 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager)
2026-01-06 00:56:58.404689 | orchestrator |
2026-01-06 00:56:58.404694 | orchestrator | TASK [ceph-facts : Set_fact ceph_admin_command] ********************************
2026-01-06 00:56:58.404698 | orchestrator | Tuesday 06 January 2026 00:46:15 +0000 (0:00:00.718) 0:00:58.035 *******
2026-01-06 00:56:58.404703 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-01-06 00:56:58.404709 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-01-06 00:56:58.404714 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-01-06 00:56:58.404718 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3)
2026-01-06 00:56:58.404724 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4)
2026-01-06 00:56:58.404729 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5)
2026-01-06 00:56:58.404734 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager)
2026-01-06 00:56:58.404738 | orchestrator |
2026-01-06 00:56:58.404742 |
orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-01-06 00:56:58.404746 | orchestrator | Tuesday 06 January 2026 00:46:17 +0000 (0:00:02.689) 0:01:00.725 ******* 2026-01-06 00:56:58.404751 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.404758 | orchestrator | 2026-01-06 00:56:58.404762 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2026-01-06 00:56:58.404766 | orchestrator | Tuesday 06 January 2026 00:46:19 +0000 (0:00:01.400) 0:01:02.125 ******* 2026-01-06 00:56:58.404770 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.404774 | orchestrator | 2026-01-06 00:56:58.404778 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-01-06 00:56:58.404782 | orchestrator | Tuesday 06 January 2026 00:46:20 +0000 (0:00:01.534) 0:01:03.660 ******* 2026-01-06 00:56:58.404787 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.404791 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.404795 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.404799 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.404803 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.404807 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.404811 | orchestrator | 2026-01-06 00:56:58.404815 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2026-01-06 00:56:58.404825 | orchestrator | Tuesday 06 January 2026 00:46:22 +0000 (0:00:01.249) 0:01:04.909 ******* 2026-01-06 00:56:58.404829 | orchestrator | skipping: [testbed-node-0] 2026-01-06 
00:56:58.404836 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.404841 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.404845 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.404849 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.404853 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.404858 | orchestrator | 2026-01-06 00:56:58.404862 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-01-06 00:56:58.404866 | orchestrator | Tuesday 06 January 2026 00:46:23 +0000 (0:00:00.983) 0:01:05.892 ******* 2026-01-06 00:56:58.404870 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.404874 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.404878 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.404882 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.404886 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.404891 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.404895 | orchestrator | 2026-01-06 00:56:58.404899 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-01-06 00:56:58.404905 | orchestrator | Tuesday 06 January 2026 00:46:24 +0000 (0:00:00.988) 0:01:06.881 ******* 2026-01-06 00:56:58.404912 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.404918 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.404926 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.404933 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.404939 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.404946 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.404952 | orchestrator | 2026-01-06 00:56:58.404958 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-01-06 00:56:58.404966 | orchestrator | Tuesday 06 January 2026 00:46:24 +0000 (0:00:00.777) 0:01:07.659 ******* 
2026-01-06 00:56:58.404973 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.404979 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.404987 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.404994 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.405001 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.405033 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.405041 | orchestrator | 2026-01-06 00:56:58.405048 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2026-01-06 00:56:58.405056 | orchestrator | Tuesday 06 January 2026 00:46:26 +0000 (0:00:01.481) 0:01:09.140 ******* 2026-01-06 00:56:58.405064 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.405070 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.405077 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.405083 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.405089 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.405095 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.405101 | orchestrator | 2026-01-06 00:56:58.405107 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-01-06 00:56:58.405113 | orchestrator | Tuesday 06 January 2026 00:46:27 +0000 (0:00:00.884) 0:01:10.025 ******* 2026-01-06 00:56:58.405119 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.405125 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.405131 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.405137 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.405143 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.405150 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.405156 | orchestrator | 2026-01-06 00:56:58.405162 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] 
************************* 2026-01-06 00:56:58.405169 | orchestrator | Tuesday 06 January 2026 00:46:28 +0000 (0:00:01.025) 0:01:11.050 ******* 2026-01-06 00:56:58.405175 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.405181 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.405187 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.405194 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.405201 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.405213 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.405221 | orchestrator | 2026-01-06 00:56:58.405225 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-01-06 00:56:58.405229 | orchestrator | Tuesday 06 January 2026 00:46:29 +0000 (0:00:01.236) 0:01:12.286 ******* 2026-01-06 00:56:58.405233 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.405237 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.405241 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.405245 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.405248 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.405252 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.405256 | orchestrator | 2026-01-06 00:56:58.405260 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-01-06 00:56:58.405264 | orchestrator | Tuesday 06 January 2026 00:46:31 +0000 (0:00:02.147) 0:01:14.434 ******* 2026-01-06 00:56:58.405268 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.405272 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.405276 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.405280 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.405283 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.405287 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.405291 | orchestrator | 2026-01-06 00:56:58.405295 | 
orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-01-06 00:56:58.405299 | orchestrator | Tuesday 06 January 2026 00:46:34 +0000 (0:00:02.491) 0:01:16.926 ******* 2026-01-06 00:56:58.405302 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.405306 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.405310 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.405314 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.405317 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.405321 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.405325 | orchestrator | 2026-01-06 00:56:58.405329 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-01-06 00:56:58.405332 | orchestrator | Tuesday 06 January 2026 00:46:35 +0000 (0:00:01.193) 0:01:18.119 ******* 2026-01-06 00:56:58.405336 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.405340 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.405344 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.405348 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.405352 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.405355 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.405359 | orchestrator | 2026-01-06 00:56:58.405363 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-01-06 00:56:58.405370 | orchestrator | Tuesday 06 January 2026 00:46:36 +0000 (0:00:01.192) 0:01:19.311 ******* 2026-01-06 00:56:58.405374 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.405378 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.405382 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.405386 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.405389 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.405393 | orchestrator | skipping: 
[testbed-node-2] 2026-01-06 00:56:58.405397 | orchestrator | 2026-01-06 00:56:58.405401 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-01-06 00:56:58.405404 | orchestrator | Tuesday 06 January 2026 00:46:37 +0000 (0:00:00.719) 0:01:20.031 ******* 2026-01-06 00:56:58.405408 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.405412 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.405416 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.405420 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.405423 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.405427 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.405431 | orchestrator | 2026-01-06 00:56:58.405437 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-01-06 00:56:58.405442 | orchestrator | Tuesday 06 January 2026 00:46:37 +0000 (0:00:00.771) 0:01:20.803 ******* 2026-01-06 00:56:58.405453 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.405459 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.405465 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.405471 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.405476 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.405480 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.405484 | orchestrator | 2026-01-06 00:56:58.405488 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-01-06 00:56:58.405491 | orchestrator | Tuesday 06 January 2026 00:46:38 +0000 (0:00:00.990) 0:01:21.793 ******* 2026-01-06 00:56:58.405495 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.405499 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.405503 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.405506 | orchestrator | skipping: [testbed-node-0] 
2026-01-06 00:56:58.405547 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.405552 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.405556 | orchestrator | 2026-01-06 00:56:58.405560 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-01-06 00:56:58.405564 | orchestrator | Tuesday 06 January 2026 00:46:40 +0000 (0:00:01.126) 0:01:22.919 ******* 2026-01-06 00:56:58.405568 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.405571 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.405575 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.405579 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.405583 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.405587 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.405590 | orchestrator | 2026-01-06 00:56:58.405594 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-01-06 00:56:58.405598 | orchestrator | Tuesday 06 January 2026 00:46:41 +0000 (0:00:00.937) 0:01:23.857 ******* 2026-01-06 00:56:58.405602 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.405606 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.405610 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.405614 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.405617 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.405621 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.405625 | orchestrator | 2026-01-06 00:56:58.405629 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-01-06 00:56:58.405633 | orchestrator | Tuesday 06 January 2026 00:46:41 +0000 (0:00:00.826) 0:01:24.683 ******* 2026-01-06 00:56:58.405637 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.405641 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.405644 | orchestrator | ok: [testbed-node-5] 
2026-01-06 00:56:58.405648 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.405654 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.405660 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.405666 | orchestrator | 2026-01-06 00:56:58.405672 | orchestrator | TASK [ceph-container-common : Generate systemd ceph target file] *************** 2026-01-06 00:56:58.405678 | orchestrator | Tuesday 06 January 2026 00:46:43 +0000 (0:00:01.800) 0:01:26.484 ******* 2026-01-06 00:56:58.405684 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.405689 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.405695 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.405702 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.405708 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.405714 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.405720 | orchestrator | 2026-01-06 00:56:58.405724 | orchestrator | TASK [ceph-container-common : Enable ceph.target] ****************************** 2026-01-06 00:56:58.405728 | orchestrator | Tuesday 06 January 2026 00:46:45 +0000 (0:00:01.655) 0:01:28.140 ******* 2026-01-06 00:56:58.405732 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.405736 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.405739 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.405743 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.405751 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.405756 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.405759 | orchestrator | 2026-01-06 00:56:58.405763 | orchestrator | TASK [ceph-container-common : Include prerequisites.yml] *********************** 2026-01-06 00:56:58.405767 | orchestrator | Tuesday 06 January 2026 00:46:47 +0000 (0:00:02.502) 0:01:30.643 ******* 2026-01-06 00:56:58.405771 | orchestrator | included: 
/ansible/roles/ceph-container-common/tasks/prerequisites.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.405775 | orchestrator | 2026-01-06 00:56:58.405779 | orchestrator | TASK [ceph-container-common : Stop lvmetad] ************************************ 2026-01-06 00:56:58.405783 | orchestrator | Tuesday 06 January 2026 00:46:49 +0000 (0:00:01.684) 0:01:32.327 ******* 2026-01-06 00:56:58.405786 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.405790 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.405794 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.405798 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.405801 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.405805 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.405809 | orchestrator | 2026-01-06 00:56:58.405817 | orchestrator | TASK [ceph-container-common : Disable and mask lvmetad service] **************** 2026-01-06 00:56:58.405821 | orchestrator | Tuesday 06 January 2026 00:46:50 +0000 (0:00:00.779) 0:01:33.107 ******* 2026-01-06 00:56:58.405825 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.405828 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.405832 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.405836 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.405840 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.405844 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.405847 | orchestrator | 2026-01-06 00:56:58.405851 | orchestrator | TASK [ceph-container-common : Remove ceph udev rules] ************************** 2026-01-06 00:56:58.405855 | orchestrator | Tuesday 06 January 2026 00:46:51 +0000 (0:00:01.012) 0:01:34.120 ******* 2026-01-06 00:56:58.405858 | orchestrator | ok: [testbed-node-3] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2026-01-06 
00:56:58.405862 | orchestrator | ok: [testbed-node-4] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2026-01-06 00:56:58.405866 | orchestrator | ok: [testbed-node-5] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2026-01-06 00:56:58.405870 | orchestrator | ok: [testbed-node-0] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2026-01-06 00:56:58.405873 | orchestrator | ok: [testbed-node-1] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2026-01-06 00:56:58.405877 | orchestrator | ok: [testbed-node-2] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2026-01-06 00:56:58.405881 | orchestrator | ok: [testbed-node-3] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2026-01-06 00:56:58.405885 | orchestrator | ok: [testbed-node-4] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2026-01-06 00:56:58.405889 | orchestrator | ok: [testbed-node-5] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2026-01-06 00:56:58.405893 | orchestrator | ok: [testbed-node-0] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2026-01-06 00:56:58.405915 | orchestrator | ok: [testbed-node-1] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2026-01-06 00:56:58.405920 | orchestrator | ok: [testbed-node-2] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2026-01-06 00:56:58.405923 | orchestrator | 2026-01-06 00:56:58.405927 | orchestrator | TASK [ceph-container-common : Ensure tmpfiles.d is present] ******************** 2026-01-06 00:56:58.405931 | orchestrator | Tuesday 06 January 2026 00:46:52 +0000 (0:00:01.368) 0:01:35.488 ******* 2026-01-06 00:56:58.405935 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.405942 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.405948 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.405954 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.405960 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.405972 | 
orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.405979 | orchestrator | 2026-01-06 00:56:58.405986 | orchestrator | TASK [ceph-container-common : Restore certificates selinux context] ************ 2026-01-06 00:56:58.405993 | orchestrator | Tuesday 06 January 2026 00:46:54 +0000 (0:00:01.557) 0:01:37.046 ******* 2026-01-06 00:56:58.405999 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.406005 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.406012 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.406057 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.406063 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.406070 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.406077 | orchestrator | 2026-01-06 00:56:58.406083 | orchestrator | TASK [ceph-container-common : Install python3 on osd nodes] ******************** 2026-01-06 00:56:58.406089 | orchestrator | Tuesday 06 January 2026 00:46:54 +0000 (0:00:00.621) 0:01:37.668 ******* 2026-01-06 00:56:58.406096 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.406102 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.406109 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.406116 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.406120 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.406124 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.406128 | orchestrator | 2026-01-06 00:56:58.406132 | orchestrator | TASK [ceph-container-common : Include registry.yml] **************************** 2026-01-06 00:56:58.406136 | orchestrator | Tuesday 06 January 2026 00:46:56 +0000 (0:00:01.157) 0:01:38.825 ******* 2026-01-06 00:56:58.406139 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.406143 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.406147 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.406151 | 
orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.406155 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.406158 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.406162 | orchestrator | 2026-01-06 00:56:58.406166 | orchestrator | TASK [ceph-container-common : Include fetch_image.yml] ************************* 2026-01-06 00:56:58.406170 | orchestrator | Tuesday 06 January 2026 00:46:56 +0000 (0:00:00.632) 0:01:39.458 ******* 2026-01-06 00:56:58.406173 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/fetch_image.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.406177 | orchestrator | 2026-01-06 00:56:58.406181 | orchestrator | TASK [ceph-container-common : Pulling Ceph container image] ******************** 2026-01-06 00:56:58.406185 | orchestrator | Tuesday 06 January 2026 00:46:58 +0000 (0:00:01.358) 0:01:40.817 ******* 2026-01-06 00:56:58.406189 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.406193 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.406197 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.406200 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.406204 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.406208 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.406212 | orchestrator | 2026-01-06 00:56:58.406215 | orchestrator | TASK [ceph-container-common : Pulling alertmanager/prometheus/grafana container images] *** 2026-01-06 00:56:58.406219 | orchestrator | Tuesday 06 January 2026 00:47:55 +0000 (0:00:57.922) 0:02:38.739 ******* 2026-01-06 00:56:58.406224 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/prom/alertmanager:v0.16.2)  2026-01-06 00:56:58.406234 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/prom/prometheus:v2.7.2)  2026-01-06 00:56:58.406241 | orchestrator | skipping: [testbed-node-3] => 
(item=docker.io/grafana/grafana:6.7.4)  2026-01-06 00:56:58.406247 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.406252 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/prom/alertmanager:v0.16.2)  2026-01-06 00:56:58.406259 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/prom/prometheus:v2.7.2)  2026-01-06 00:56:58.406264 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/grafana/grafana:6.7.4)  2026-01-06 00:56:58.406276 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.406280 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/prom/alertmanager:v0.16.2)  2026-01-06 00:56:58.406284 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/prom/prometheus:v2.7.2)  2026-01-06 00:56:58.406287 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/grafana/grafana:6.7.4)  2026-01-06 00:56:58.406291 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.406295 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/prom/alertmanager:v0.16.2)  2026-01-06 00:56:58.406299 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/prom/prometheus:v2.7.2)  2026-01-06 00:56:58.406303 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/grafana/grafana:6.7.4)  2026-01-06 00:56:58.406306 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.406310 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/prom/alertmanager:v0.16.2)  2026-01-06 00:56:58.406314 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/prom/prometheus:v2.7.2)  2026-01-06 00:56:58.406318 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/grafana/grafana:6.7.4)  2026-01-06 00:56:58.406322 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.406347 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/prom/alertmanager:v0.16.2)  2026-01-06 00:56:58.406351 | orchestrator | skipping: [testbed-node-2] => 
(item=docker.io/prom/prometheus:v2.7.2)
2026-01-06 00:56:58.406355 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/grafana/grafana:6.7.4)
2026-01-06 00:56:58.406359 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.406363 | orchestrator |
2026-01-06 00:56:58.406366 | orchestrator | TASK [ceph-container-common : Pulling node-exporter container image] ***********
2026-01-06 00:56:58.406370 | orchestrator | Tuesday 06 January 2026 00:47:56 +0000 (0:00:00.674) 0:02:39.413 *******
2026-01-06 00:56:58.406374 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406378 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.406381 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.406385 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.406389 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.406393 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.406397 | orchestrator |
2026-01-06 00:56:58.406401 | orchestrator | TASK [ceph-container-common : Export local ceph dev image] *********************
2026-01-06 00:56:58.406404 | orchestrator | Tuesday 06 January 2026 00:47:57 +0000 (0:00:00.877) 0:02:40.291 *******
2026-01-06 00:56:58.406408 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406412 | orchestrator |
2026-01-06 00:56:58.406416 | orchestrator | TASK [ceph-container-common : Copy ceph dev image file] ************************
2026-01-06 00:56:58.406420 | orchestrator | Tuesday 06 January 2026 00:47:57 +0000 (0:00:00.171) 0:02:40.462 *******
2026-01-06 00:56:58.406423 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406427 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.406432 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.406436 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.406439 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.406443 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.406447 | orchestrator |
2026-01-06 00:56:58.406451 | orchestrator | TASK [ceph-container-common : Load ceph dev image] *****************************
2026-01-06 00:56:58.406455 | orchestrator | Tuesday 06 January 2026 00:47:58 +0000 (0:00:00.623) 0:02:41.085 *******
2026-01-06 00:56:58.406459 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406462 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.406466 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.406470 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.406474 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.406477 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.406481 | orchestrator |
2026-01-06 00:56:58.406485 | orchestrator | TASK [ceph-container-common : Remove tmp ceph dev image file] ******************
2026-01-06 00:56:58.406492 | orchestrator | Tuesday 06 January 2026 00:47:59 +0000 (0:00:00.876) 0:02:41.962 *******
2026-01-06 00:56:58.406496 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406500 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.406504 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.406508 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.406511 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.406515 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.406519 | orchestrator |
2026-01-06 00:56:58.406538 | orchestrator | TASK [ceph-container-common : Get ceph version] ********************************
2026-01-06 00:56:58.406546 | orchestrator | Tuesday 06 January 2026 00:47:59 +0000 (0:00:00.624) 0:02:42.587 *******
2026-01-06 00:56:58.406552 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.406557 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.406566 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.406576 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.406582 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:56:58.406587 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:56:58.406593 | orchestrator |
2026-01-06 00:56:58.406599 | orchestrator | TASK [ceph-container-common : Set_fact ceph_version ceph_version.stdout.split] ***
2026-01-06 00:56:58.406605 | orchestrator | Tuesday 06 January 2026 00:48:02 +0000 (0:00:02.726) 0:02:45.314 *******
2026-01-06 00:56:58.406610 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.406616 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.406622 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.406627 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.406633 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:56:58.406638 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:56:58.406644 | orchestrator |
2026-01-06 00:56:58.406656 | orchestrator | TASK [ceph-container-common : Include release.yml] *****************************
2026-01-06 00:56:58.406662 | orchestrator | Tuesday 06 January 2026 00:48:03 +0000 (0:00:00.695) 0:02:46.009 *******
2026-01-06 00:56:58.406668 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/release.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:56:58.406675 | orchestrator |
2026-01-06 00:56:58.406681 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release jewel] *********************
2026-01-06 00:56:58.406688 | orchestrator | Tuesday 06 January 2026 00:48:04 +0000 (0:00:01.332) 0:02:47.341 *******
2026-01-06 00:56:58.406694 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406701 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.406707 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.406713 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.406719 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.406725 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.406731 | orchestrator |
2026-01-06 00:56:58.406737 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release kraken] ********************
2026-01-06 00:56:58.406743 | orchestrator | Tuesday 06 January 2026 00:48:05 +0000 (0:00:00.895) 0:02:48.237 *******
2026-01-06 00:56:58.406749 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406755 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.406761 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.406767 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.406774 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.406778 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.406782 | orchestrator |
2026-01-06 00:56:58.406785 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release luminous] ******************
2026-01-06 00:56:58.406789 | orchestrator | Tuesday 06 January 2026 00:48:06 +0000 (0:00:00.631) 0:02:48.869 *******
2026-01-06 00:56:58.406793 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406797 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.406825 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.406830 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.406833 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.406843 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.406847 | orchestrator |
2026-01-06 00:56:58.406850 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release mimic] *********************
2026-01-06 00:56:58.406854 | orchestrator | Tuesday 06 January 2026 00:48:06 +0000 (0:00:00.884) 0:02:49.753 *******
2026-01-06 00:56:58.406858 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406862 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.406866 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.406869 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.406873 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.406877 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.406881 | orchestrator |
2026-01-06 00:56:58.406884 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release nautilus] ******************
2026-01-06 00:56:58.406888 | orchestrator | Tuesday 06 January 2026 00:48:07 +0000 (0:00:00.607) 0:02:50.361 *******
2026-01-06 00:56:58.406892 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406896 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.406900 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.406903 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.406907 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.406911 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.406915 | orchestrator |
2026-01-06 00:56:58.406919 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release octopus] *******************
2026-01-06 00:56:58.406923 | orchestrator | Tuesday 06 January 2026 00:48:08 +0000 (0:00:00.765) 0:02:51.126 *******
2026-01-06 00:56:58.406926 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406930 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.406934 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.406938 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.406941 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.406945 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.406949 | orchestrator |
2026-01-06 00:56:58.406952 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release pacific] *******************
2026-01-06 00:56:58.406956 | orchestrator | Tuesday 06 January 2026 00:48:08 +0000 (0:00:00.618) 0:02:51.745 *******
2026-01-06 00:56:58.406960 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406964 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.406968 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.406971 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.406975 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.406979 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.406983 | orchestrator |
2026-01-06 00:56:58.406987 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release quincy] ********************
2026-01-06 00:56:58.406990 | orchestrator | Tuesday 06 January 2026 00:48:09 +0000 (0:00:00.854) 0:02:52.599 *******
2026-01-06 00:56:58.406994 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.406998 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.407002 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.407005 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.407009 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.407013 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.407017 | orchestrator |
2026-01-06 00:56:58.407020 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release reef] **********************
2026-01-06 00:56:58.407024 | orchestrator | Tuesday 06 January 2026 00:48:10 +0000 (0:00:00.758) 0:02:53.358 *******
2026-01-06 00:56:58.407028 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.407032 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.407036 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.407040 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.407044 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:56:58.407047 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:56:58.407051 | orchestrator |
2026-01-06 00:56:58.407055 | orchestrator | TASK [ceph-config : Include create_ceph_initial_dirs.yml] **********************
2026-01-06 00:56:58.407063 | orchestrator | Tuesday 06 January 2026 00:48:11 +0000 (0:00:01.147) 0:02:54.505 *******
2026-01-06 00:56:58.407070 | orchestrator | included: /ansible/roles/ceph-config/tasks/create_ceph_initial_dirs.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:56:58.407075 | orchestrator |
2026-01-06 00:56:58.407079 | orchestrator | TASK [ceph-config : Create ceph initial directories] ***************************
2026-01-06 00:56:58.407082 | orchestrator | Tuesday 06 January 2026 00:48:12 +0000 (0:00:01.243) 0:02:55.749 *******
2026-01-06 00:56:58.407086 | orchestrator | changed: [testbed-node-4] => (item=/etc/ceph)
2026-01-06 00:56:58.407091 | orchestrator | changed: [testbed-node-3] => (item=/etc/ceph)
2026-01-06 00:56:58.407095 | orchestrator | changed: [testbed-node-5] => (item=/etc/ceph)
2026-01-06 00:56:58.407098 | orchestrator | changed: [testbed-node-1] => (item=/etc/ceph)
2026-01-06 00:56:58.407102 | orchestrator | changed: [testbed-node-0] => (item=/etc/ceph)
2026-01-06 00:56:58.407106 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/)
2026-01-06 00:56:58.407110 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/)
2026-01-06 00:56:58.407114 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mon)
2026-01-06 00:56:58.407118 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/)
2026-01-06 00:56:58.407121 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/)
2026-01-06 00:56:58.407125 | orchestrator | changed: [testbed-node-2] => (item=/etc/ceph)
2026-01-06 00:56:58.407129 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/)
2026-01-06 00:56:58.407133 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mon)
2026-01-06 00:56:58.407137 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/osd)
2026-01-06 00:56:58.407140 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mon)
2026-01-06 00:56:58.407144 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/mon)
2026-01-06 00:56:58.407148 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/)
2026-01-06 00:56:58.407172 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/osd)
2026-01-06 00:56:58.407176 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/mon)
2026-01-06 00:56:58.407180 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/osd)
2026-01-06 00:56:58.407184 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mds)
2026-01-06 00:56:58.407187 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/osd)
2026-01-06 00:56:58.407191 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/mon)
2026-01-06 00:56:58.407195 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mds)
2026-01-06 00:56:58.407199 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/osd)
2026-01-06 00:56:58.407202 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/tmp)
2026-01-06 00:56:58.407206 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mds)
2026-01-06 00:56:58.407210 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/mds)
2026-01-06 00:56:58.407214 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/osd)
2026-01-06 00:56:58.407217 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/tmp)
2026-01-06 00:56:58.407221 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/mds)
2026-01-06 00:56:58.407225 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/crash)
2026-01-06 00:56:58.407229 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/tmp)
2026-01-06 00:56:58.407232 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/tmp)
2026-01-06 00:56:58.407236 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/crash)
2026-01-06 00:56:58.407240 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/mds)
2026-01-06 00:56:58.407246 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/tmp)
2026-01-06 00:56:58.407250 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/radosgw)
2026-01-06 00:56:58.407260 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/crash)
2026-01-06 00:56:58.407264 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/crash)
2026-01-06 00:56:58.407268 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/crash)
2026-01-06 00:56:58.407271 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/radosgw)
2026-01-06 00:56:58.407276 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/tmp)
2026-01-06 00:56:58.407279 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rgw)
2026-01-06 00:56:58.407283 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/radosgw)
2026-01-06 00:56:58.407287 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/radosgw)
2026-01-06 00:56:58.407291 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/radosgw)
2026-01-06 00:56:58.407294 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rgw)
2026-01-06 00:56:58.407298 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/crash)
2026-01-06 00:56:58.407302 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mgr)
2026-01-06 00:56:58.407306 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rgw)
2026-01-06 00:56:58.407309 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rgw)
2026-01-06 00:56:58.407313 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mgr)
2026-01-06 00:56:58.407317 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rgw)
2026-01-06 00:56:58.407320 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/radosgw)
2026-01-06 00:56:58.407324 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mds)
2026-01-06 00:56:58.407328 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mds)
2026-01-06 00:56:58.407335 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-mgr)
2026-01-06 00:56:58.407339 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mgr)
2026-01-06 00:56:58.407342 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-mgr)
2026-01-06 00:56:58.407346 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rgw)
2026-01-06 00:56:58.407350 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-osd)
2026-01-06 00:56:58.407354 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-osd)
2026-01-06 00:56:58.407357 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mds)
2026-01-06 00:56:58.407361 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-mds)
2026-01-06 00:56:58.407365 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-mds)
2026-01-06 00:56:58.407368 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-mgr)
2026-01-06 00:56:58.407372 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rbd)
2026-01-06 00:56:58.407376 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rbd)
2026-01-06 00:56:58.407380 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-osd)
2026-01-06 00:56:58.407383 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-osd)
2026-01-06 00:56:58.407387 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-mds)
2026-01-06 00:56:58.407391 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-osd)
2026-01-06 00:56:58.407395 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-01-06 00:56:58.407399 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-01-06 00:56:58.407414 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rbd)
2026-01-06 00:56:58.407418 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rbd)
2026-01-06 00:56:58.407426 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-osd)
2026-01-06 00:56:58.407430 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rbd)
2026-01-06 00:56:58.407433 | orchestrator | changed: [testbed-node-3] => (item=/var/run/ceph)
2026-01-06 00:56:58.407437 | orchestrator | changed: [testbed-node-4] => (item=/var/run/ceph)
2026-01-06 00:56:58.407441 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-01-06 00:56:58.407444 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-01-06 00:56:58.407448 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rbd)
2026-01-06 00:56:58.407452 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-01-06 00:56:58.407456 | orchestrator | changed: [testbed-node-3] => (item=/var/log/ceph)
2026-01-06 00:56:58.407460 | orchestrator | changed: [testbed-node-4] => (item=/var/log/ceph)
2026-01-06 00:56:58.407463 | orchestrator | changed: [testbed-node-5] => (item=/var/run/ceph)
2026-01-06 00:56:58.407467 | orchestrator | changed: [testbed-node-0] => (item=/var/run/ceph)
2026-01-06 00:56:58.407471 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-01-06 00:56:58.407475 | orchestrator | changed: [testbed-node-1] => (item=/var/run/ceph)
2026-01-06 00:56:58.407478 | orchestrator | changed: [testbed-node-2] => (item=/var/run/ceph)
2026-01-06 00:56:58.407482 | orchestrator | changed: [testbed-node-5] => (item=/var/log/ceph)
2026-01-06 00:56:58.407486 | orchestrator | changed: [testbed-node-0] => (item=/var/log/ceph)
2026-01-06 00:56:58.407490 | orchestrator | changed: [testbed-node-1] => (item=/var/log/ceph)
2026-01-06 00:56:58.407493 | orchestrator | changed: [testbed-node-2] => (item=/var/log/ceph)
2026-01-06 00:56:58.407497 | orchestrator |
2026-01-06 00:56:58.407501 | orchestrator | TASK [ceph-config : Include_tasks rgw_systemd_environment_file.yml] ************
2026-01-06 00:56:58.407505 | orchestrator | Tuesday 06 January 2026 00:48:19 +0000 (0:00:06.742) 0:03:02.491 *******
2026-01-06 00:56:58.407509 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.407512 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.407516 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.407520 | orchestrator | included: /ansible/roles/ceph-config/tasks/rgw_systemd_environment_file.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:56:58.407564 | orchestrator |
2026-01-06 00:56:58.407568 | orchestrator | TASK [ceph-config : Create rados gateway instance directories] *****************
2026-01-06 00:56:58.407572 | orchestrator | Tuesday 06 January 2026 00:48:20 +0000 (0:00:01.067) 0:03:03.559 *******
2026-01-06 00:56:58.407577 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081})
2026-01-06 00:56:58.407581 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081})
2026-01-06 00:56:58.407585 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081})
2026-01-06 00:56:58.407588 | orchestrator |
2026-01-06 00:56:58.407592 | orchestrator | TASK [ceph-config : Generate environment file] *********************************
2026-01-06 00:56:58.407596 | orchestrator | Tuesday 06 January 2026 00:48:21 +0000 (0:00:00.902) 0:03:04.461 *******
2026-01-06 00:56:58.407603 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081})
2026-01-06 00:56:58.407607 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081})
2026-01-06 00:56:58.407611 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081})
2026-01-06 00:56:58.407615 | orchestrator |
2026-01-06 00:56:58.407618 | orchestrator | TASK [ceph-config : Reset num_osds] ********************************************
2026-01-06 00:56:58.407626 | orchestrator | Tuesday 06 January 2026 00:48:23 +0000 (0:00:01.521) 0:03:05.982 *******
2026-01-06 00:56:58.407630 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.407634 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.407637 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.407641 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.407645 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.407649 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.407653 | orchestrator |
2026-01-06 00:56:58.407657 | orchestrator | TASK [ceph-config : Count number of osds for lvm scenario] *********************
2026-01-06 00:56:58.407660 | orchestrator | Tuesday 06 January 2026 00:48:23 +0000 (0:00:00.630) 0:03:06.614 *******
2026-01-06 00:56:58.407664 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.407668 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.407672 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.407676 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.407679 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.407683 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.407687 | orchestrator |
2026-01-06 00:56:58.407691 | orchestrator | TASK [ceph-config : Look up for ceph-volume rejected devices] ******************
2026-01-06 00:56:58.407695 | orchestrator | Tuesday 06 January 2026 00:48:24 +0000 (0:00:00.779) 0:03:07.393 *******
2026-01-06 00:56:58.407699 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.407702 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.407706 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.407710 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.407714 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.407717 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.407737 | orchestrator |
2026-01-06 00:56:58.407742 | orchestrator | TASK [ceph-config : Set_fact rejected_devices] *********************************
2026-01-06 00:56:58.407745 | orchestrator | Tuesday 06 January 2026 00:48:25 +0000 (0:00:00.504) 0:03:07.898 *******
2026-01-06 00:56:58.407749 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.407753 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.407757 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.407760 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.407764 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.407768 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.407772 | orchestrator |
2026-01-06 00:56:58.407776 | orchestrator | TASK [ceph-config : Set_fact _devices] *****************************************
2026-01-06 00:56:58.407780 | orchestrator | Tuesday 06 January 2026 00:48:25 +0000 (0:00:00.772) 0:03:08.671 *******
2026-01-06 00:56:58.407783 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.407787 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.407791 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.407795 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.407799 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.407803 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.407806 | orchestrator |
2026-01-06 00:56:58.407810 | orchestrator | TASK [ceph-config : Run 'ceph-volume lvm batch --report' to see how many osds are to be created] ***
2026-01-06 00:56:58.407814 | orchestrator | Tuesday 06 January 2026 00:48:26 +0000 (0:00:00.677) 0:03:09.348 *******
2026-01-06 00:56:58.407818 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.407822 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.407826 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.407829 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.407833 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.407837 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.407841 | orchestrator |
2026-01-06 00:56:58.407845 | orchestrator | TASK [ceph-config : Set_fact num_osds from the output of 'ceph-volume lvm batch --report' (legacy report)] ***
2026-01-06 00:56:58.407849 | orchestrator | Tuesday 06 January 2026 00:48:27 +0000 (0:00:00.951) 0:03:10.299 *******
2026-01-06 00:56:58.407852 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.407856 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.407864 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.407868 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.407872 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.407876 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.407879 | orchestrator |
2026-01-06 00:56:58.407883 | orchestrator | TASK [ceph-config : Set_fact num_osds from the output of 'ceph-volume lvm batch --report' (new report)] ***
2026-01-06 00:56:58.407887 | orchestrator | Tuesday 06 January 2026 00:48:28 +0000 (0:00:00.676) 0:03:10.975 *******
2026-01-06 00:56:58.407891 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.407895 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.407899 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.407902 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.407906 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.407910 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.407914 | orchestrator |
2026-01-06 00:56:58.407917 | orchestrator | TASK [ceph-config : Run 'ceph-volume lvm list' to see how many osds have already been created] ***
2026-01-06 00:56:58.407921 | orchestrator | Tuesday 06 January 2026 00:48:28 +0000 (0:00:00.829) 0:03:11.805 *******
2026-01-06 00:56:58.407925 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.407929 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.407933 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.407936 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.407940 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.407944 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.407948 | orchestrator |
2026-01-06 00:56:58.407952 | orchestrator | TASK [ceph-config : Set_fact num_osds (add existing osds)] *********************
2026-01-06 00:56:58.407955 | orchestrator | Tuesday 06 January 2026 00:48:32 +0000 (0:00:03.207) 0:03:15.012 *******
2026-01-06 00:56:58.407959 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.407963 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.407967 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.407971 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.407974 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.407978 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.407986 | orchestrator |
2026-01-06 00:56:58.407990 | orchestrator | TASK [ceph-config : Set_fact _osd_memory_target] *******************************
2026-01-06 00:56:58.407994 | orchestrator | Tuesday 06 January 2026 00:48:32 +0000 (0:00:00.749) 0:03:15.761 *******
2026-01-06 00:56:58.407998 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.408001 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.408005 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.408009 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.408013 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.408017 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.408020 | orchestrator |
2026-01-06 00:56:58.408024 | orchestrator | TASK [ceph-config : Set osd_memory_target to cluster host config] **************
2026-01-06 00:56:58.408028 | orchestrator | Tuesday 06 January 2026 00:48:33 +0000 (0:00:00.762) 0:03:16.524 *******
2026-01-06 00:56:58.408032 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.408036 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.408039 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.408043 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.408047 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.408051 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.408054 | orchestrator |
2026-01-06 00:56:58.408058 | orchestrator | TASK [ceph-config : Render rgw configs] ****************************************
2026-01-06 00:56:58.408062 | orchestrator | Tuesday 06 January 2026 00:48:34 +0000 (0:00:00.993) 0:03:17.518 *******
2026-01-06 00:56:58.408066 | orchestrator | ok: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081})
2026-01-06 00:56:58.408070 | orchestrator | ok: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081})
2026-01-06 00:56:58.408107 | orchestrator | ok: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081})
2026-01-06 00:56:58.408132 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.408139 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.408144 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.408151 | orchestrator |
2026-01-06 00:56:58.408157 | orchestrator | TASK [ceph-config : Set config to cluster] *************************************
2026-01-06 00:56:58.408165 | orchestrator | Tuesday 06 January 2026 00:48:35 +0000 (0:00:00.552) 0:03:18.070 *******
2026-01-06 00:56:58.408174 | orchestrator | skipping: [testbed-node-3] => (item=[{'key': 'client.rgw.default.testbed-node-3.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.13:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log'}])
2026-01-06 00:56:58.408181 | orchestrator | skipping: [testbed-node-3] => (item=[{'key': 'client.rgw.default.testbed-node-3.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.13:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.13:8081'}])
2026-01-06 00:56:58.408186 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.408190 | orchestrator | skipping: [testbed-node-4] => (item=[{'key': 'client.rgw.default.testbed-node-4.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.14:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log'}])
2026-01-06 00:56:58.408194 | orchestrator | skipping: [testbed-node-4] => (item=[{'key': 'client.rgw.default.testbed-node-4.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.14:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.14:8081'}])
2026-01-06 00:56:58.408198 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.408202 | orchestrator | skipping: [testbed-node-5] => (item=[{'key': 'client.rgw.default.testbed-node-5.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.15:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log'}])
2026-01-06 00:56:58.408205 | orchestrator | skipping: [testbed-node-5] => (item=[{'key': 'client.rgw.default.testbed-node-5.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.15:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.15:8081'}])
2026-01-06 00:56:58.408209 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.408214 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.408218 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.408223 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.408229 | orchestrator |
2026-01-06 00:56:58.408233 | orchestrator | TASK [ceph-config : Set rgw configs to file] ***********************************
2026-01-06 00:56:58.408237 | orchestrator | Tuesday 06 January 2026 00:48:35 +0000 (0:00:00.736) 0:03:18.806 *******
2026-01-06 00:56:58.408241 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.408245 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.408248 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.408252 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.408259 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.408263 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.408267 | orchestrator |
2026-01-06 00:56:58.408271 | orchestrator | TASK [ceph-config : Create ceph conf directory] ********************************
2026-01-06 00:56:58.408275 | orchestrator | Tuesday 06 January 2026 00:48:36 +0000 (0:00:00.578) 0:03:19.385 *******
2026-01-06 00:56:58.408279 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.408287 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.408291 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.408295 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.408299 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.408303 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.408307 | orchestrator |
2026-01-06 00:56:58.408311 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] ***
2026-01-06 00:56:58.408316 | orchestrator | Tuesday 06 January 2026 00:48:37 +0000 (0:00:00.780) 0:03:20.165 *******
2026-01-06 00:56:58.408320 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.408324 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.408328 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.408331 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.408336 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.408340 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.408344 | orchestrator |
2026-01-06 00:56:58.408348 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] ****
2026-01-06 00:56:58.408351 | orchestrator | Tuesday 06 January 2026 00:48:37 +0000 (0:00:00.634) 0:03:20.800 *******
2026-01-06 00:56:58.408356 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.408360 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.408364 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.408368 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.408372 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.408376 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.408380 | orchestrator |
2026-01-06 00:56:58.408402 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] ****
2026-01-06 00:56:58.408407 | orchestrator | Tuesday 06 January 2026 00:48:38 +0000 (0:00:00.696) 0:03:21.496 *******
2026-01-06 00:56:58.408411 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.408416 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.408420 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.408424 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.408428 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.408435 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.408441 | orchestrator |
2026-01-06 00:56:58.408448 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address] ***************
2026-01-06 00:56:58.408455 | orchestrator | Tuesday 06 January 2026 00:48:39 +0000 (0:00:00.585) 0:03:22.082 *******
2026-01-06 00:56:58.408460 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.408467 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.408473 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.408480 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.408486 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.408492 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.408501 | orchestrator |
2026-01-06 00:56:58.408509 | orchestrator | TASK [ceph-facts : Set_fact _interface] ****************************************
2026-01-06 00:56:58.408516 | orchestrator | Tuesday 06 January 2026 00:48:39 +0000 (0:00:00.709) 0:03:22.792 *******
2026-01-06 00:56:58.408536 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-01-06 00:56:58.408549 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-01-06 00:56:58.408558 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-01-06 00:56:58.408567 | orchestrator | skipping: [testbed-node-3]
2026-01-06
00:56:58.408573 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ****** 2026-01-06 00:56:58.408579 | orchestrator | Tuesday 06 January 2026 00:48:40 +0000 (0:00:00.356) 0:03:23.148 ******* 2026-01-06 00:56:58.408585 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-01-06 00:56:58.408592 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-01-06 00:56:58.408598 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-01-06 00:56:58.408604 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.408634 | orchestrator | 2026-01-06 00:56:58.408643 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ****** 2026-01-06 00:56:58.408647 | orchestrator | Tuesday 06 January 2026 00:48:40 +0000 (0:00:00.407) 0:03:23.556 ******* 2026-01-06 00:56:58.408651 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-01-06 00:56:58.408655 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-01-06 00:56:58.408659 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-01-06 00:56:58.408662 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.408666 | orchestrator | 2026-01-06 00:56:58.408670 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] *************************** 2026-01-06 00:56:58.408674 | orchestrator | Tuesday 06 January 2026 00:48:41 +0000 (0:00:00.370) 0:03:23.926 ******* 2026-01-06 00:56:58.408678 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.408682 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.408685 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.408689 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.408693 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.408697 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.408701 | orchestrator | 2026-01-06 
00:56:58.408705 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] ************************************* 2026-01-06 00:56:58.408709 | orchestrator | Tuesday 06 January 2026 00:48:41 +0000 (0:00:00.687) 0:03:24.614 ******* 2026-01-06 00:56:58.408713 | orchestrator | ok: [testbed-node-3] => (item=0) 2026-01-06 00:56:58.408718 | orchestrator | ok: [testbed-node-4] => (item=0) 2026-01-06 00:56:58.408725 | orchestrator | skipping: [testbed-node-0] => (item=0)  2026-01-06 00:56:58.408732 | orchestrator | ok: [testbed-node-5] => (item=0) 2026-01-06 00:56:58.408742 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.408749 | orchestrator | skipping: [testbed-node-1] => (item=0)  2026-01-06 00:56:58.408755 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.408761 | orchestrator | skipping: [testbed-node-2] => (item=0)  2026-01-06 00:56:58.408772 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.408778 | orchestrator | 2026-01-06 00:56:58.408784 | orchestrator | TASK [ceph-config : Generate Ceph file] **************************************** 2026-01-06 00:56:58.408789 | orchestrator | Tuesday 06 January 2026 00:48:43 +0000 (0:00:01.963) 0:03:26.577 ******* 2026-01-06 00:56:58.408796 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.408802 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.408807 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.408813 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.408819 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.408825 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.408831 | orchestrator | 2026-01-06 00:56:58.408837 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2026-01-06 00:56:58.408843 | orchestrator | Tuesday 06 January 2026 00:48:47 +0000 (0:00:03.762) 0:03:30.340 ******* 2026-01-06 00:56:58.408850 | orchestrator | changed: [testbed-node-3] 2026-01-06 
00:56:58.408856 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.408861 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.408867 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.408874 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.408880 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.408884 | orchestrator | 2026-01-06 00:56:58.408888 | orchestrator | RUNNING HANDLER [ceph-handler : Mons handler] ********************************** 2026-01-06 00:56:58.408891 | orchestrator | Tuesday 06 January 2026 00:48:48 +0000 (0:00:01.077) 0:03:31.418 ******* 2026-01-06 00:56:58.408895 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.408899 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.408903 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.408907 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mons.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.408911 | orchestrator | 2026-01-06 00:56:58.408915 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called before restart] ******** 2026-01-06 00:56:58.408959 | orchestrator | Tuesday 06 January 2026 00:48:49 +0000 (0:00:00.996) 0:03:32.414 ******* 2026-01-06 00:56:58.408964 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.408968 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.408971 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.408976 | orchestrator | 2026-01-06 00:56:58.408980 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mon restart script] *********************** 2026-01-06 00:56:58.408983 | orchestrator | Tuesday 06 January 2026 00:48:49 +0000 (0:00:00.323) 0:03:32.738 ******* 2026-01-06 00:56:58.408987 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.408991 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.408995 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.408999 | 
orchestrator | 2026-01-06 00:56:58.409003 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mon daemon(s)] ******************** 2026-01-06 00:56:58.409007 | orchestrator | Tuesday 06 January 2026 00:48:51 +0000 (0:00:01.550) 0:03:34.288 ******* 2026-01-06 00:56:58.409011 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-01-06 00:56:58.409015 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-01-06 00:56:58.409019 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-01-06 00:56:58.409023 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.409027 | orchestrator | 2026-01-06 00:56:58.409031 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called after restart] ********* 2026-01-06 00:56:58.409035 | orchestrator | Tuesday 06 January 2026 00:48:52 +0000 (0:00:00.573) 0:03:34.862 ******* 2026-01-06 00:56:58.409039 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.409043 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.409047 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.409051 | orchestrator | 2026-01-06 00:56:58.409055 | orchestrator | RUNNING HANDLER [ceph-handler : Osds handler] ********************************** 2026-01-06 00:56:58.409058 | orchestrator | Tuesday 06 January 2026 00:48:52 +0000 (0:00:00.371) 0:03:35.234 ******* 2026-01-06 00:56:58.409062 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.409066 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.409070 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.409073 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.409077 | orchestrator | 2026-01-06 00:56:58.409081 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact trigger_restart] ********************** 2026-01-06 00:56:58.409085 | orchestrator | Tuesday 06 January 2026 
00:48:53 +0000 (0:00:00.930) 0:03:36.164 ******* 2026-01-06 00:56:58.409089 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-01-06 00:56:58.409093 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-01-06 00:56:58.409097 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-01-06 00:56:58.409184 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409189 | orchestrator | 2026-01-06 00:56:58.409193 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called before restart] ******** 2026-01-06 00:56:58.409197 | orchestrator | Tuesday 06 January 2026 00:48:53 +0000 (0:00:00.383) 0:03:36.547 ******* 2026-01-06 00:56:58.409200 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409204 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.409208 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.409212 | orchestrator | 2026-01-06 00:56:58.409216 | orchestrator | RUNNING HANDLER [ceph-handler : Unset noup flag] ******************************* 2026-01-06 00:56:58.409220 | orchestrator | Tuesday 06 January 2026 00:48:54 +0000 (0:00:00.393) 0:03:36.941 ******* 2026-01-06 00:56:58.409224 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409228 | orchestrator | 2026-01-06 00:56:58.409232 | orchestrator | RUNNING HANDLER [ceph-handler : Copy osd restart script] *********************** 2026-01-06 00:56:58.409236 | orchestrator | Tuesday 06 January 2026 00:48:54 +0000 (0:00:00.229) 0:03:37.171 ******* 2026-01-06 00:56:58.409240 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409248 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.409252 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.409256 | orchestrator | 2026-01-06 00:56:58.409260 | orchestrator | RUNNING HANDLER [ceph-handler : Get pool list] ********************************* 2026-01-06 00:56:58.409268 | orchestrator | Tuesday 06 January 
2026 00:48:54 +0000 (0:00:00.322) 0:03:37.493 ******* 2026-01-06 00:56:58.409273 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409276 | orchestrator | 2026-01-06 00:56:58.409280 | orchestrator | RUNNING HANDLER [ceph-handler : Get balancer module status] ******************** 2026-01-06 00:56:58.409284 | orchestrator | Tuesday 06 January 2026 00:48:54 +0000 (0:00:00.234) 0:03:37.728 ******* 2026-01-06 00:56:58.409288 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409292 | orchestrator | 2026-01-06 00:56:58.409296 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact pools_pgautoscaler_mode] ************** 2026-01-06 00:56:58.409300 | orchestrator | Tuesday 06 January 2026 00:48:55 +0000 (0:00:00.231) 0:03:37.959 ******* 2026-01-06 00:56:58.409304 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409308 | orchestrator | 2026-01-06 00:56:58.409311 | orchestrator | RUNNING HANDLER [ceph-handler : Disable balancer] ****************************** 2026-01-06 00:56:58.409315 | orchestrator | Tuesday 06 January 2026 00:48:55 +0000 (0:00:00.137) 0:03:38.096 ******* 2026-01-06 00:56:58.409319 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409323 | orchestrator | 2026-01-06 00:56:58.409327 | orchestrator | RUNNING HANDLER [ceph-handler : Disable pg autoscale on pools] ***************** 2026-01-06 00:56:58.409331 | orchestrator | Tuesday 06 January 2026 00:48:56 +0000 (0:00:00.857) 0:03:38.954 ******* 2026-01-06 00:56:58.409334 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409338 | orchestrator | 2026-01-06 00:56:58.409342 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph osds daemon(s)] ******************* 2026-01-06 00:56:58.409346 | orchestrator | Tuesday 06 January 2026 00:48:56 +0000 (0:00:00.229) 0:03:39.183 ******* 2026-01-06 00:56:58.409350 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-01-06 00:56:58.409354 | orchestrator | skipping: 
[testbed-node-3] => (item=testbed-node-5)  2026-01-06 00:56:58.409358 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-01-06 00:56:58.409362 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409365 | orchestrator | 2026-01-06 00:56:58.409369 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called after restart] ********* 2026-01-06 00:56:58.409394 | orchestrator | Tuesday 06 January 2026 00:48:56 +0000 (0:00:00.556) 0:03:39.740 ******* 2026-01-06 00:56:58.409398 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409402 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.409406 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.409410 | orchestrator | 2026-01-06 00:56:58.409414 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable pg autoscale on pools] *************** 2026-01-06 00:56:58.409418 | orchestrator | Tuesday 06 January 2026 00:48:57 +0000 (0:00:00.470) 0:03:40.211 ******* 2026-01-06 00:56:58.409421 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409425 | orchestrator | 2026-01-06 00:56:58.409429 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable balancer] **************************** 2026-01-06 00:56:58.409433 | orchestrator | Tuesday 06 January 2026 00:48:57 +0000 (0:00:00.231) 0:03:40.443 ******* 2026-01-06 00:56:58.409437 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409440 | orchestrator | 2026-01-06 00:56:58.409444 | orchestrator | RUNNING HANDLER [ceph-handler : Mdss handler] ********************************** 2026-01-06 00:56:58.409448 | orchestrator | Tuesday 06 January 2026 00:48:57 +0000 (0:00:00.223) 0:03:40.666 ******* 2026-01-06 00:56:58.409452 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.409456 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.409460 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.409464 | orchestrator | included: 
/ansible/roles/ceph-handler/tasks/handler_mdss.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.409468 | orchestrator | 2026-01-06 00:56:58.409471 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called before restart] ******** 2026-01-06 00:56:58.409482 | orchestrator | Tuesday 06 January 2026 00:48:59 +0000 (0:00:01.259) 0:03:41.925 ******* 2026-01-06 00:56:58.409486 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.409490 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.409494 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.409498 | orchestrator | 2026-01-06 00:56:58.409502 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mds restart script] *********************** 2026-01-06 00:56:58.409506 | orchestrator | Tuesday 06 January 2026 00:48:59 +0000 (0:00:00.370) 0:03:42.296 ******* 2026-01-06 00:56:58.409509 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.409513 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.409517 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.409521 | orchestrator | 2026-01-06 00:56:58.409554 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mds daemon(s)] ******************** 2026-01-06 00:56:58.409558 | orchestrator | Tuesday 06 January 2026 00:49:00 +0000 (0:00:01.379) 0:03:43.675 ******* 2026-01-06 00:56:58.409562 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-01-06 00:56:58.409566 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-01-06 00:56:58.409570 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-01-06 00:56:58.409574 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409578 | orchestrator | 2026-01-06 00:56:58.409582 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called after restart] ********* 2026-01-06 00:56:58.409585 | orchestrator | Tuesday 06 January 2026 00:49:01 +0000 (0:00:00.996) 
0:03:44.672 ******* 2026-01-06 00:56:58.409589 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.409593 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.409597 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.409601 | orchestrator | 2026-01-06 00:56:58.409605 | orchestrator | RUNNING HANDLER [ceph-handler : Rgws handler] ********************************** 2026-01-06 00:56:58.409609 | orchestrator | Tuesday 06 January 2026 00:49:02 +0000 (0:00:00.727) 0:03:45.399 ******* 2026-01-06 00:56:58.409612 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.409616 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.409620 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.409625 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_rgws.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.409629 | orchestrator | 2026-01-06 00:56:58.409661 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called before restart] ******** 2026-01-06 00:56:58.409665 | orchestrator | Tuesday 06 January 2026 00:49:03 +0000 (0:00:00.943) 0:03:46.342 ******* 2026-01-06 00:56:58.409673 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.409677 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.409681 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.409685 | orchestrator | 2026-01-06 00:56:58.409689 | orchestrator | RUNNING HANDLER [ceph-handler : Copy rgw restart script] *********************** 2026-01-06 00:56:58.409693 | orchestrator | Tuesday 06 January 2026 00:49:04 +0000 (0:00:00.662) 0:03:47.005 ******* 2026-01-06 00:56:58.409697 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.409701 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.409705 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.409708 | orchestrator | 2026-01-06 00:56:58.409712 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph rgw daemon(s)] 
******************** 2026-01-06 00:56:58.409716 | orchestrator | Tuesday 06 January 2026 00:49:05 +0000 (0:00:01.414) 0:03:48.419 ******* 2026-01-06 00:56:58.409721 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-01-06 00:56:58.409742 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-01-06 00:56:58.409748 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-01-06 00:56:58.409752 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409756 | orchestrator | 2026-01-06 00:56:58.409760 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called after restart] ********* 2026-01-06 00:56:58.409768 | orchestrator | Tuesday 06 January 2026 00:49:06 +0000 (0:00:00.674) 0:03:49.094 ******* 2026-01-06 00:56:58.409773 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.409777 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.409781 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.409785 | orchestrator | 2026-01-06 00:56:58.409789 | orchestrator | RUNNING HANDLER [ceph-handler : Rbdmirrors handler] **************************** 2026-01-06 00:56:58.409793 | orchestrator | Tuesday 06 January 2026 00:49:06 +0000 (0:00:00.408) 0:03:49.502 ******* 2026-01-06 00:56:58.409797 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409800 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.409804 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.409809 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.409813 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.409839 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.409844 | orchestrator | 2026-01-06 00:56:58.409848 | orchestrator | RUNNING HANDLER [ceph-handler : Mgrs handler] ********************************** 2026-01-06 00:56:58.409852 | orchestrator | Tuesday 06 January 2026 00:49:07 +0000 (0:00:00.976) 0:03:50.479 ******* 2026-01-06 
00:56:58.409856 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.409859 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.409864 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.409868 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mgrs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.409872 | orchestrator | 2026-01-06 00:56:58.409876 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called before restart] ******** 2026-01-06 00:56:58.409879 | orchestrator | Tuesday 06 January 2026 00:49:08 +0000 (0:00:00.835) 0:03:51.315 ******* 2026-01-06 00:56:58.409883 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.409887 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.409891 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.409895 | orchestrator | 2026-01-06 00:56:58.409899 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mgr restart script] *********************** 2026-01-06 00:56:58.409903 | orchestrator | Tuesday 06 January 2026 00:49:09 +0000 (0:00:00.777) 0:03:52.092 ******* 2026-01-06 00:56:58.409906 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.409910 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.409914 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.409918 | orchestrator | 2026-01-06 00:56:58.409922 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mgr daemon(s)] ******************** 2026-01-06 00:56:58.409925 | orchestrator | Tuesday 06 January 2026 00:49:10 +0000 (0:00:01.465) 0:03:53.558 ******* 2026-01-06 00:56:58.409930 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-01-06 00:56:58.409933 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-01-06 00:56:58.409937 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-01-06 00:56:58.409941 | orchestrator | skipping: [testbed-node-0] 2026-01-06 
00:56:58.409945 | orchestrator | 2026-01-06 00:56:58.409949 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called after restart] ********* 2026-01-06 00:56:58.409952 | orchestrator | Tuesday 06 January 2026 00:49:11 +0000 (0:00:00.662) 0:03:54.221 ******* 2026-01-06 00:56:58.409956 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.409960 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.409964 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.409968 | orchestrator | 2026-01-06 00:56:58.409971 | orchestrator | PLAY [Apply role ceph-mon] ***************************************************** 2026-01-06 00:56:58.409975 | orchestrator | 2026-01-06 00:56:58.409979 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-01-06 00:56:58.409983 | orchestrator | Tuesday 06 January 2026 00:49:11 +0000 (0:00:00.567) 0:03:54.788 ******* 2026-01-06 00:56:58.409988 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.409992 | orchestrator | 2026-01-06 00:56:58.409996 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2026-01-06 00:56:58.410003 | orchestrator | Tuesday 06 January 2026 00:49:13 +0000 (0:00:01.048) 0:03:55.837 ******* 2026-01-06 00:56:58.410007 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.410036 | orchestrator | 2026-01-06 00:56:58.410042 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-01-06 00:56:58.410046 | orchestrator | Tuesday 06 January 2026 00:49:13 +0000 (0:00:00.572) 0:03:56.410 ******* 2026-01-06 00:56:58.410050 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.410054 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.410058 | 
orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.410062 | orchestrator | 2026-01-06 00:56:58.410066 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2026-01-06 00:56:58.410070 | orchestrator | Tuesday 06 January 2026 00:49:14 +0000 (0:00:01.221) 0:03:57.631 ******* 2026-01-06 00:56:58.410087 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.410092 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.410096 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.410102 | orchestrator | 2026-01-06 00:56:58.410109 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-01-06 00:56:58.410115 | orchestrator | Tuesday 06 January 2026 00:49:15 +0000 (0:00:00.369) 0:03:58.000 ******* 2026-01-06 00:56:58.410120 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.410125 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.410131 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.410137 | orchestrator | 2026-01-06 00:56:58.410143 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-01-06 00:56:58.410149 | orchestrator | Tuesday 06 January 2026 00:49:15 +0000 (0:00:00.472) 0:03:58.473 ******* 2026-01-06 00:56:58.410154 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.410160 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.410165 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.410171 | orchestrator | 2026-01-06 00:56:58.410176 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-01-06 00:56:58.410183 | orchestrator | Tuesday 06 January 2026 00:49:16 +0000 (0:00:00.379) 0:03:58.852 ******* 2026-01-06 00:56:58.410189 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.410194 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.410201 | orchestrator | ok: 
[testbed-node-2] 2026-01-06 00:56:58.410222 | orchestrator | 2026-01-06 00:56:58.410228 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2026-01-06 00:56:58.410261 | orchestrator | Tuesday 06 January 2026 00:49:17 +0000 (0:00:01.443) 0:04:00.296 ******* 2026-01-06 00:56:58.410267 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.410274 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.410278 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.410282 | orchestrator | 2026-01-06 00:56:58.410286 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-01-06 00:56:58.410290 | orchestrator | Tuesday 06 January 2026 00:49:17 +0000 (0:00:00.419) 0:04:00.715 ******* 2026-01-06 00:56:58.410359 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.410396 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.410403 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.410410 | orchestrator | 2026-01-06 00:56:58.410414 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-01-06 00:56:58.410418 | orchestrator | Tuesday 06 January 2026 00:49:18 +0000 (0:00:00.443) 0:04:01.158 ******* 2026-01-06 00:56:58.410423 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.410426 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.410431 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.410434 | orchestrator | 2026-01-06 00:56:58.410438 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-01-06 00:56:58.410442 | orchestrator | Tuesday 06 January 2026 00:49:19 +0000 (0:00:00.909) 0:04:02.068 ******* 2026-01-06 00:56:58.410453 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.410457 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.410461 | orchestrator | ok: [testbed-node-2] 2026-01-06 
00:56:58.410465 | orchestrator | 2026-01-06 00:56:58.410470 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-01-06 00:56:58.410474 | orchestrator | Tuesday 06 January 2026 00:49:20 +0000 (0:00:01.246) 0:04:03.314 ******* 2026-01-06 00:56:58.410480 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.410483 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.410487 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.410491 | orchestrator | 2026-01-06 00:56:58.410495 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-01-06 00:56:58.410499 | orchestrator | Tuesday 06 January 2026 00:49:21 +0000 (0:00:00.661) 0:04:03.975 ******* 2026-01-06 00:56:58.410503 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.410507 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.410511 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.410516 | orchestrator | 2026-01-06 00:56:58.410522 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-01-06 00:56:58.410541 | orchestrator | Tuesday 06 January 2026 00:49:21 +0000 (0:00:00.422) 0:04:04.398 ******* 2026-01-06 00:56:58.410548 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.410554 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.410560 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.410566 | orchestrator | 2026-01-06 00:56:58.410572 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-01-06 00:56:58.410578 | orchestrator | Tuesday 06 January 2026 00:49:22 +0000 (0:00:00.421) 0:04:04.820 ******* 2026-01-06 00:56:58.410584 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.410590 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.410596 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.410601 | 
orchestrator | 2026-01-06 00:56:58.410608 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-01-06 00:56:58.410613 | orchestrator | Tuesday 06 January 2026 00:49:22 +0000 (0:00:00.325) 0:04:05.145 ******* 2026-01-06 00:56:58.410618 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.410621 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.410626 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.410630 | orchestrator | 2026-01-06 00:56:58.410634 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-01-06 00:56:58.410638 | orchestrator | Tuesday 06 January 2026 00:49:23 +0000 (0:00:00.746) 0:04:05.892 ******* 2026-01-06 00:56:58.410642 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.410645 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.410649 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.410653 | orchestrator | 2026-01-06 00:56:58.410657 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-01-06 00:56:58.410661 | orchestrator | Tuesday 06 January 2026 00:49:23 +0000 (0:00:00.337) 0:04:06.229 ******* 2026-01-06 00:56:58.410665 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.410668 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.410672 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.410676 | orchestrator | 2026-01-06 00:56:58.410680 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-01-06 00:56:58.410684 | orchestrator | Tuesday 06 January 2026 00:49:23 +0000 (0:00:00.462) 0:04:06.692 ******* 2026-01-06 00:56:58.410688 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.410697 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.410702 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.410705 | orchestrator | 
2026-01-06 00:56:58.410709 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-01-06 00:56:58.410713 | orchestrator | Tuesday 06 January 2026 00:49:24 +0000 (0:00:00.393) 0:04:07.085 ******* 2026-01-06 00:56:58.410717 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.410725 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.410729 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.410732 | orchestrator | 2026-01-06 00:56:58.410736 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-01-06 00:56:58.410740 | orchestrator | Tuesday 06 January 2026 00:49:24 +0000 (0:00:00.615) 0:04:07.701 ******* 2026-01-06 00:56:58.410744 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.410748 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.410752 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.410778 | orchestrator | 2026-01-06 00:56:58.410784 | orchestrator | TASK [ceph-mon : Set_fact container_exec_cmd] ********************************** 2026-01-06 00:56:58.410788 | orchestrator | Tuesday 06 January 2026 00:49:25 +0000 (0:00:00.537) 0:04:08.238 ******* 2026-01-06 00:56:58.410792 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.410796 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.410800 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.410804 | orchestrator | 2026-01-06 00:56:58.410808 | orchestrator | TASK [ceph-mon : Include deploy_monitors.yml] ********************************** 2026-01-06 00:56:58.410812 | orchestrator | Tuesday 06 January 2026 00:49:25 +0000 (0:00:00.326) 0:04:08.565 ******* 2026-01-06 00:56:58.410816 | orchestrator | included: /ansible/roles/ceph-mon/tasks/deploy_monitors.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.410820 | orchestrator | 2026-01-06 00:56:58.410824 | orchestrator | TASK [ceph-mon : Check if monitor initial keyring already exists] 
************** 2026-01-06 00:56:58.410827 | orchestrator | Tuesday 06 January 2026 00:49:26 +0000 (0:00:00.761) 0:04:09.326 ******* 2026-01-06 00:56:58.410831 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.410835 | orchestrator | 2026-01-06 00:56:58.410861 | orchestrator | TASK [ceph-mon : Generate monitor initial keyring] ***************************** 2026-01-06 00:56:58.410866 | orchestrator | Tuesday 06 January 2026 00:49:26 +0000 (0:00:00.142) 0:04:09.469 ******* 2026-01-06 00:56:58.410870 | orchestrator | changed: [testbed-node-0 -> localhost] 2026-01-06 00:56:58.410874 | orchestrator | 2026-01-06 00:56:58.410877 | orchestrator | TASK [ceph-mon : Set_fact _initial_mon_key_success] **************************** 2026-01-06 00:56:58.410882 | orchestrator | Tuesday 06 January 2026 00:49:27 +0000 (0:00:00.940) 0:04:10.409 ******* 2026-01-06 00:56:58.410885 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.410889 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.410893 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.410897 | orchestrator | 2026-01-06 00:56:58.410901 | orchestrator | TASK [ceph-mon : Get initial keyring when it already exists] ******************* 2026-01-06 00:56:58.410904 | orchestrator | Tuesday 06 January 2026 00:49:27 +0000 (0:00:00.375) 0:04:10.785 ******* 2026-01-06 00:56:58.410908 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.410912 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.410916 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.410919 | orchestrator | 2026-01-06 00:56:58.410923 | orchestrator | TASK [ceph-mon : Create monitor initial keyring] ******************************* 2026-01-06 00:56:58.410927 | orchestrator | Tuesday 06 January 2026 00:49:28 +0000 (0:00:00.590) 0:04:11.376 ******* 2026-01-06 00:56:58.410931 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.410935 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.410938 | 
orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.410942 | orchestrator | 2026-01-06 00:56:58.410946 | orchestrator | TASK [ceph-mon : Copy the initial key in /etc/ceph (for containers)] *********** 2026-01-06 00:56:58.410950 | orchestrator | Tuesday 06 January 2026 00:49:29 +0000 (0:00:01.259) 0:04:12.635 ******* 2026-01-06 00:56:58.410954 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.410958 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.410962 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.410966 | orchestrator | 2026-01-06 00:56:58.410969 | orchestrator | TASK [ceph-mon : Create monitor directory] ************************************* 2026-01-06 00:56:58.410973 | orchestrator | Tuesday 06 January 2026 00:49:31 +0000 (0:00:01.205) 0:04:13.840 ******* 2026-01-06 00:56:58.410977 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.410986 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.410990 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.410993 | orchestrator | 2026-01-06 00:56:58.410997 | orchestrator | TASK [ceph-mon : Recursively fix ownership of monitor directory] *************** 2026-01-06 00:56:58.411001 | orchestrator | Tuesday 06 January 2026 00:49:31 +0000 (0:00:00.829) 0:04:14.670 ******* 2026-01-06 00:56:58.411005 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.411009 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.411013 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.411017 | orchestrator | 2026-01-06 00:56:58.411021 | orchestrator | TASK [ceph-mon : Create admin keyring] ***************************************** 2026-01-06 00:56:58.411025 | orchestrator | Tuesday 06 January 2026 00:49:33 +0000 (0:00:01.148) 0:04:15.818 ******* 2026-01-06 00:56:58.411029 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.411033 | orchestrator | 2026-01-06 00:56:58.411037 | orchestrator | TASK [ceph-mon : Slurp admin keyring] 
****************************************** 2026-01-06 00:56:58.411041 | orchestrator | Tuesday 06 January 2026 00:49:34 +0000 (0:00:01.744) 0:04:17.562 ******* 2026-01-06 00:56:58.411045 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.411048 | orchestrator | 2026-01-06 00:56:58.411052 | orchestrator | TASK [ceph-mon : Copy admin keyring over to mons] ****************************** 2026-01-06 00:56:58.411056 | orchestrator | Tuesday 06 January 2026 00:49:35 +0000 (0:00:00.647) 0:04:18.209 ******* 2026-01-06 00:56:58.411060 | orchestrator | changed: [testbed-node-1 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:56:58.411063 | orchestrator | ok: [testbed-node-0] => (item=None) 2026-01-06 00:56:58.411068 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:56:58.411072 | orchestrator | changed: [testbed-node-1] => (item=None) 2026-01-06 00:56:58.411077 | orchestrator | ok: [testbed-node-2 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-01-06 00:56:58.411083 | orchestrator | ok: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-01-06 00:56:58.411088 | orchestrator | changed: [testbed-node-1 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-01-06 00:56:58.411092 | orchestrator | changed: [testbed-node-1 -> {{ item }}] 2026-01-06 00:56:58.411095 | orchestrator | ok: [testbed-node-2] => (item=None) 2026-01-06 00:56:58.411100 | orchestrator | ok: [testbed-node-2 -> {{ item }}] 2026-01-06 00:56:58.411104 | orchestrator | ok: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-01-06 00:56:58.411108 | orchestrator | ok: [testbed-node-0 -> {{ item }}] 2026-01-06 00:56:58.411112 | orchestrator | 2026-01-06 00:56:58.411117 | orchestrator | TASK [ceph-mon : Import admin keyring into mon keyring] ************************ 2026-01-06 00:56:58.411124 | orchestrator | Tuesday 06 January 2026 00:49:39 +0000 (0:00:04.038) 0:04:22.248 ******* 2026-01-06 00:56:58.411130 
| orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.411136 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.411142 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.411148 | orchestrator | 2026-01-06 00:56:58.411155 | orchestrator | TASK [ceph-mon : Set_fact ceph-mon container command] ************************** 2026-01-06 00:56:58.411172 | orchestrator | Tuesday 06 January 2026 00:49:40 +0000 (0:00:01.251) 0:04:23.500 ******* 2026-01-06 00:56:58.411211 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.411225 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.411233 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.411237 | orchestrator | 2026-01-06 00:56:58.411241 | orchestrator | TASK [ceph-mon : Set_fact monmaptool container command] ************************ 2026-01-06 00:56:58.411245 | orchestrator | Tuesday 06 January 2026 00:49:41 +0000 (0:00:00.319) 0:04:23.819 ******* 2026-01-06 00:56:58.411249 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.411252 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.411256 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.411260 | orchestrator | 2026-01-06 00:56:58.411264 | orchestrator | TASK [ceph-mon : Generate initial monmap] ************************************** 2026-01-06 00:56:58.411268 | orchestrator | Tuesday 06 January 2026 00:49:41 +0000 (0:00:00.503) 0:04:24.323 ******* 2026-01-06 00:56:58.411302 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.411330 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.411335 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.411339 | orchestrator | 2026-01-06 00:56:58.411342 | orchestrator | TASK [ceph-mon : Ceph monitor mkfs with keyring] ******************************* 2026-01-06 00:56:58.411346 | orchestrator | Tuesday 06 January 2026 00:49:43 +0000 (0:00:01.625) 0:04:25.948 ******* 2026-01-06 00:56:58.411350 | orchestrator | changed: [testbed-node-0] 
2026-01-06 00:56:58.411354 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.411358 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.411362 | orchestrator | 2026-01-06 00:56:58.411366 | orchestrator | TASK [ceph-mon : Ceph monitor mkfs without keyring] **************************** 2026-01-06 00:56:58.411370 | orchestrator | Tuesday 06 January 2026 00:49:44 +0000 (0:00:01.312) 0:04:27.261 ******* 2026-01-06 00:56:58.411374 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.411378 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.411382 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.411386 | orchestrator | 2026-01-06 00:56:58.411390 | orchestrator | TASK [ceph-mon : Include start_monitor.yml] ************************************ 2026-01-06 00:56:58.411394 | orchestrator | Tuesday 06 January 2026 00:49:44 +0000 (0:00:00.474) 0:04:27.735 ******* 2026-01-06 00:56:58.411398 | orchestrator | included: /ansible/roles/ceph-mon/tasks/start_monitor.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.411402 | orchestrator | 2026-01-06 00:56:58.411406 | orchestrator | TASK [ceph-mon : Ensure systemd service override directory exists] ************* 2026-01-06 00:56:58.411410 | orchestrator | Tuesday 06 January 2026 00:49:45 +0000 (0:00:00.760) 0:04:28.495 ******* 2026-01-06 00:56:58.411413 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.411417 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.411421 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.411425 | orchestrator | 2026-01-06 00:56:58.411429 | orchestrator | TASK [ceph-mon : Add ceph-mon systemd service overrides] *********************** 2026-01-06 00:56:58.411433 | orchestrator | Tuesday 06 January 2026 00:49:46 +0000 (0:00:00.337) 0:04:28.833 ******* 2026-01-06 00:56:58.411437 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.411441 | orchestrator | skipping: 
[testbed-node-1] 2026-01-06 00:56:58.411445 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.411449 | orchestrator | 2026-01-06 00:56:58.411453 | orchestrator | TASK [ceph-mon : Include_tasks systemd.yml] ************************************ 2026-01-06 00:56:58.411457 | orchestrator | Tuesday 06 January 2026 00:49:46 +0000 (0:00:00.341) 0:04:29.175 ******* 2026-01-06 00:56:58.411461 | orchestrator | included: /ansible/roles/ceph-mon/tasks/systemd.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.411466 | orchestrator | 2026-01-06 00:56:58.411470 | orchestrator | TASK [ceph-mon : Generate systemd unit file for mon container] ***************** 2026-01-06 00:56:58.411474 | orchestrator | Tuesday 06 January 2026 00:49:47 +0000 (0:00:00.891) 0:04:30.066 ******* 2026-01-06 00:56:58.411478 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.411481 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.411485 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.411489 | orchestrator | 2026-01-06 00:56:58.411493 | orchestrator | TASK [ceph-mon : Generate systemd ceph-mon target file] ************************ 2026-01-06 00:56:58.411497 | orchestrator | Tuesday 06 January 2026 00:49:49 +0000 (0:00:01.850) 0:04:31.916 ******* 2026-01-06 00:56:58.411500 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.411504 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.411508 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.411512 | orchestrator | 2026-01-06 00:56:58.411516 | orchestrator | TASK [ceph-mon : Enable ceph-mon.target] *************************************** 2026-01-06 00:56:58.411520 | orchestrator | Tuesday 06 January 2026 00:49:50 +0000 (0:00:01.429) 0:04:33.346 ******* 2026-01-06 00:56:58.411566 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.411571 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.411582 | orchestrator | changed: 
[testbed-node-1] 2026-01-06 00:56:58.411586 | orchestrator | 2026-01-06 00:56:58.411590 | orchestrator | TASK [ceph-mon : Start the monitor service] ************************************ 2026-01-06 00:56:58.411594 | orchestrator | Tuesday 06 January 2026 00:49:52 +0000 (0:00:01.934) 0:04:35.280 ******* 2026-01-06 00:56:58.411602 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.411606 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.411610 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.411613 | orchestrator | 2026-01-06 00:56:58.411617 | orchestrator | TASK [ceph-mon : Include_tasks ceph_keys.yml] ********************************** 2026-01-06 00:56:58.411621 | orchestrator | Tuesday 06 January 2026 00:49:54 +0000 (0:00:02.111) 0:04:37.392 ******* 2026-01-06 00:56:58.411625 | orchestrator | included: /ansible/roles/ceph-mon/tasks/ceph_keys.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.411629 | orchestrator | 2026-01-06 00:56:58.411632 | orchestrator | TASK [ceph-mon : Waiting for the monitor(s) to form the quorum...] ************* 2026-01-06 00:56:58.411636 | orchestrator | Tuesday 06 January 2026 00:49:55 +0000 (0:00:00.513) 0:04:37.906 ******* 2026-01-06 00:56:58.411640 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Waiting for the monitor(s) to form the quorum... (10 retries left). 
2026-01-06 00:56:58.411644 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.411648 | orchestrator | 2026-01-06 00:56:58.411651 | orchestrator | TASK [ceph-mon : Fetch ceph initial keys] ************************************** 2026-01-06 00:56:58.411655 | orchestrator | Tuesday 06 January 2026 00:50:17 +0000 (0:00:22.055) 0:04:59.961 ******* 2026-01-06 00:56:58.411659 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.411663 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.411667 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.411672 | orchestrator | 2026-01-06 00:56:58.411675 | orchestrator | TASK [ceph-mon : Include secure_cluster.yml] *********************************** 2026-01-06 00:56:58.411679 | orchestrator | Tuesday 06 January 2026 00:50:26 +0000 (0:00:09.690) 0:05:09.652 ******* 2026-01-06 00:56:58.411683 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.411687 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.411691 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.411694 | orchestrator | 2026-01-06 00:56:58.411698 | orchestrator | TASK [ceph-mon : Set cluster configs] ****************************************** 2026-01-06 00:56:58.411722 | orchestrator | Tuesday 06 January 2026 00:50:27 +0000 (0:00:00.619) 0:05:10.272 ******* 2026-01-06 00:56:58.411729 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__2cf7ada09558bb28b1613c1ac05cc1e5bc05d39d'}}, {'key': 'public_network', 'value': '192.168.16.0/20'}]) 2026-01-06 00:56:58.411737 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 
'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__2cf7ada09558bb28b1613c1ac05cc1e5bc05d39d'}}, {'key': 'cluster_network', 'value': '192.168.16.0/20'}]) 2026-01-06 00:56:58.411744 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__2cf7ada09558bb28b1613c1ac05cc1e5bc05d39d'}}, {'key': 'osd_pool_default_crush_rule', 'value': -1}]) 2026-01-06 00:56:58.411752 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__2cf7ada09558bb28b1613c1ac05cc1e5bc05d39d'}}, {'key': 'ms_bind_ipv6', 'value': 'False'}]) 2026-01-06 00:56:58.411774 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__2cf7ada09558bb28b1613c1ac05cc1e5bc05d39d'}}, {'key': 'ms_bind_ipv4', 'value': 'True'}]) 2026-01-06 00:56:58.411781 | orchestrator | skipping: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__2cf7ada09558bb28b1613c1ac05cc1e5bc05d39d'}}, {'key': 'osd_crush_chooseleaf_type', 'value': '__omit_place_holder__2cf7ada09558bb28b1613c1ac05cc1e5bc05d39d'}])  2026-01-06 00:56:58.411789 | orchestrator | 2026-01-06 00:56:58.411795 | orchestrator | RUNNING HANDLER 
[ceph-handler : Make tempdir for scripts] ********************** 2026-01-06 00:56:58.411801 | orchestrator | Tuesday 06 January 2026 00:50:42 +0000 (0:00:14.769) 0:05:25.041 ******* 2026-01-06 00:56:58.411807 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.411813 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.411818 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.411824 | orchestrator | 2026-01-06 00:56:58.411834 | orchestrator | RUNNING HANDLER [ceph-handler : Mons handler] ********************************** 2026-01-06 00:56:58.411840 | orchestrator | Tuesday 06 January 2026 00:50:42 +0000 (0:00:00.392) 0:05:25.434 ******* 2026-01-06 00:56:58.411845 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mons.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.411852 | orchestrator | 2026-01-06 00:56:58.411857 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called before restart] ******** 2026-01-06 00:56:58.411863 | orchestrator | Tuesday 06 January 2026 00:50:43 +0000 (0:00:00.880) 0:05:26.314 ******* 2026-01-06 00:56:58.411869 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.411875 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.411881 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.411888 | orchestrator | 2026-01-06 00:56:58.411894 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mon restart script] *********************** 2026-01-06 00:56:58.411901 | orchestrator | Tuesday 06 January 2026 00:50:44 +0000 (0:00:00.550) 0:05:26.864 ******* 2026-01-06 00:56:58.411907 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.411914 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.411918 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.411922 | orchestrator | 2026-01-06 00:56:58.411926 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mon daemon(s)] ******************** 2026-01-06 
00:56:58.411929 | orchestrator | Tuesday 06 January 2026 00:50:44 +0000 (0:00:00.360) 0:05:27.225 ******* 2026-01-06 00:56:58.411934 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-01-06 00:56:58.411937 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-01-06 00:56:58.411941 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-01-06 00:56:58.411945 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.411949 | orchestrator | 2026-01-06 00:56:58.411953 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called after restart] ********* 2026-01-06 00:56:58.411957 | orchestrator | Tuesday 06 January 2026 00:50:45 +0000 (0:00:01.006) 0:05:28.231 ******* 2026-01-06 00:56:58.411961 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.411965 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.411988 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.411993 | orchestrator | 2026-01-06 00:56:58.411997 | orchestrator | PLAY [Apply role ceph-mgr] ***************************************************** 2026-01-06 00:56:58.412001 | orchestrator | 2026-01-06 00:56:58.412005 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-01-06 00:56:58.412039 | orchestrator | Tuesday 06 January 2026 00:50:46 +0000 (0:00:01.047) 0:05:29.279 ******* 2026-01-06 00:56:58.412063 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.412068 | orchestrator | 2026-01-06 00:56:58.412072 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2026-01-06 00:56:58.412076 | orchestrator | Tuesday 06 January 2026 00:50:47 +0000 (0:00:00.596) 0:05:29.875 ******* 2026-01-06 00:56:58.412080 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-0, 
testbed-node-1, testbed-node-2 2026-01-06 00:56:58.412084 | orchestrator | 2026-01-06 00:56:58.412088 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-01-06 00:56:58.412092 | orchestrator | Tuesday 06 January 2026 00:50:47 +0000 (0:00:00.905) 0:05:30.781 ******* 2026-01-06 00:56:58.412096 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.412100 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.412104 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.412113 | orchestrator | 2026-01-06 00:56:58.412118 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2026-01-06 00:56:58.412121 | orchestrator | Tuesday 06 January 2026 00:50:48 +0000 (0:00:00.765) 0:05:31.546 ******* 2026-01-06 00:56:58.412126 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412130 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412134 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.412140 | orchestrator | 2026-01-06 00:56:58.412146 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-01-06 00:56:58.412152 | orchestrator | Tuesday 06 January 2026 00:50:49 +0000 (0:00:00.379) 0:05:31.925 ******* 2026-01-06 00:56:58.412157 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412163 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412169 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.412174 | orchestrator | 2026-01-06 00:56:58.412180 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-01-06 00:56:58.412186 | orchestrator | Tuesday 06 January 2026 00:50:49 +0000 (0:00:00.661) 0:05:32.586 ******* 2026-01-06 00:56:58.412191 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412196 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412201 | orchestrator | skipping: 
[testbed-node-2] 2026-01-06 00:56:58.412206 | orchestrator | 2026-01-06 00:56:58.412212 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-01-06 00:56:58.412218 | orchestrator | Tuesday 06 January 2026 00:50:50 +0000 (0:00:00.403) 0:05:32.990 ******* 2026-01-06 00:56:58.412225 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.412231 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.412236 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.412243 | orchestrator | 2026-01-06 00:56:58.412249 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2026-01-06 00:56:58.412255 | orchestrator | Tuesday 06 January 2026 00:50:50 +0000 (0:00:00.812) 0:05:33.802 ******* 2026-01-06 00:56:58.412262 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412268 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412274 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.412280 | orchestrator | 2026-01-06 00:56:58.412286 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-01-06 00:56:58.412293 | orchestrator | Tuesday 06 January 2026 00:50:51 +0000 (0:00:00.363) 0:05:34.166 ******* 2026-01-06 00:56:58.412298 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412304 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412311 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.412315 | orchestrator | 2026-01-06 00:56:58.412319 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-01-06 00:56:58.412327 | orchestrator | Tuesday 06 January 2026 00:50:52 +0000 (0:00:00.690) 0:05:34.856 ******* 2026-01-06 00:56:58.412331 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.412335 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.412339 | orchestrator | ok: [testbed-node-2] 2026-01-06 
00:56:58.412348 | orchestrator | 2026-01-06 00:56:58.412352 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-01-06 00:56:58.412356 | orchestrator | Tuesday 06 January 2026 00:50:52 +0000 (0:00:00.747) 0:05:35.604 ******* 2026-01-06 00:56:58.412360 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.412363 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.412367 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.412371 | orchestrator | 2026-01-06 00:56:58.412375 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-01-06 00:56:58.412378 | orchestrator | Tuesday 06 January 2026 00:50:53 +0000 (0:00:00.723) 0:05:36.327 ******* 2026-01-06 00:56:58.412382 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412386 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412390 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.412393 | orchestrator | 2026-01-06 00:56:58.412397 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-01-06 00:56:58.412401 | orchestrator | Tuesday 06 January 2026 00:50:53 +0000 (0:00:00.326) 0:05:36.654 ******* 2026-01-06 00:56:58.412405 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.412408 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.412412 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.412416 | orchestrator | 2026-01-06 00:56:58.412420 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-01-06 00:56:58.412424 | orchestrator | Tuesday 06 January 2026 00:50:54 +0000 (0:00:00.692) 0:05:37.347 ******* 2026-01-06 00:56:58.412427 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412431 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412435 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.412439 | orchestrator | 
2026-01-06 00:56:58.412443 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-01-06 00:56:58.412473 | orchestrator | Tuesday 06 January 2026 00:50:54 +0000 (0:00:00.390) 0:05:37.737 ******* 2026-01-06 00:56:58.412478 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412482 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.412486 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412489 | orchestrator | 2026-01-06 00:56:58.412517 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-01-06 00:56:58.412522 | orchestrator | Tuesday 06 January 2026 00:50:55 +0000 (0:00:00.444) 0:05:38.182 ******* 2026-01-06 00:56:58.412546 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412550 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412553 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.412557 | orchestrator | 2026-01-06 00:56:58.412561 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-01-06 00:56:58.412565 | orchestrator | Tuesday 06 January 2026 00:50:55 +0000 (0:00:00.330) 0:05:38.512 ******* 2026-01-06 00:56:58.412568 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412572 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412576 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.412580 | orchestrator | 2026-01-06 00:56:58.412584 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-01-06 00:56:58.412587 | orchestrator | Tuesday 06 January 2026 00:50:56 +0000 (0:00:00.300) 0:05:38.812 ******* 2026-01-06 00:56:58.412591 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412595 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412599 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.412602 | orchestrator | 
2026-01-06 00:56:58.412606 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-01-06 00:56:58.412610 | orchestrator | Tuesday 06 January 2026 00:50:56 +0000 (0:00:00.637) 0:05:39.450 ******* 2026-01-06 00:56:58.412614 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.412617 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.412621 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.412625 | orchestrator | 2026-01-06 00:56:58.412629 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-01-06 00:56:58.412639 | orchestrator | Tuesday 06 January 2026 00:50:57 +0000 (0:00:00.398) 0:05:39.849 ******* 2026-01-06 00:56:58.412643 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.412646 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.412650 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.412654 | orchestrator | 2026-01-06 00:56:58.412658 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-01-06 00:56:58.412662 | orchestrator | Tuesday 06 January 2026 00:50:57 +0000 (0:00:00.402) 0:05:40.252 ******* 2026-01-06 00:56:58.412665 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.412669 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.412673 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.412677 | orchestrator | 2026-01-06 00:56:58.412681 | orchestrator | TASK [ceph-mgr : Set_fact container_exec_cmd] ********************************** 2026-01-06 00:56:58.412684 | orchestrator | Tuesday 06 January 2026 00:50:58 +0000 (0:00:00.748) 0:05:41.000 ******* 2026-01-06 00:56:58.412688 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2026-01-06 00:56:58.412692 | orchestrator | ok: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-01-06 00:56:58.412696 | orchestrator | ok: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => 
(item=testbed-node-2) 2026-01-06 00:56:58.412700 | orchestrator | 2026-01-06 00:56:58.412704 | orchestrator | TASK [ceph-mgr : Include common.yml] ******************************************* 2026-01-06 00:56:58.412708 | orchestrator | Tuesday 06 January 2026 00:50:58 +0000 (0:00:00.630) 0:05:41.630 ******* 2026-01-06 00:56:58.412712 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/common.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.412716 | orchestrator | 2026-01-06 00:56:58.412720 | orchestrator | TASK [ceph-mgr : Create mgr directory] ***************************************** 2026-01-06 00:56:58.412724 | orchestrator | Tuesday 06 January 2026 00:50:59 +0000 (0:00:00.410) 0:05:42.040 ******* 2026-01-06 00:56:58.412727 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.412731 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.412735 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.412738 | orchestrator | 2026-01-06 00:56:58.412745 | orchestrator | TASK [ceph-mgr : Fetch ceph mgr keyring] *************************************** 2026-01-06 00:56:58.412750 | orchestrator | Tuesday 06 January 2026 00:50:59 +0000 (0:00:00.608) 0:05:42.649 ******* 2026-01-06 00:56:58.412753 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412757 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412761 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.412764 | orchestrator | 2026-01-06 00:56:58.412768 | orchestrator | TASK [ceph-mgr : Create ceph mgr keyring(s) on a mon node] ********************* 2026-01-06 00:56:58.412772 | orchestrator | Tuesday 06 January 2026 00:51:00 +0000 (0:00:00.431) 0:05:43.081 ******* 2026-01-06 00:56:58.412776 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-01-06 00:56:58.412780 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-01-06 00:56:58.412784 | orchestrator | changed: [testbed-node-0] => (item=None) 
2026-01-06 00:56:58.412788 | orchestrator | changed: [testbed-node-0 -> {{ groups[mon_group_name][0] }}] 2026-01-06 00:56:58.412792 | orchestrator | 2026-01-06 00:56:58.412796 | orchestrator | TASK [ceph-mgr : Set_fact _mgr_keys] ******************************************* 2026-01-06 00:56:58.412799 | orchestrator | Tuesday 06 January 2026 00:51:10 +0000 (0:00:10.679) 0:05:53.761 ******* 2026-01-06 00:56:58.412803 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.412807 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.412811 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.412814 | orchestrator | 2026-01-06 00:56:58.412818 | orchestrator | TASK [ceph-mgr : Get keys from monitors] *************************************** 2026-01-06 00:56:58.412822 | orchestrator | Tuesday 06 January 2026 00:51:11 +0000 (0:00:00.315) 0:05:54.076 ******* 2026-01-06 00:56:58.412826 | orchestrator | skipping: [testbed-node-0] => (item=None)  2026-01-06 00:56:58.412830 | orchestrator | skipping: [testbed-node-1] => (item=None)  2026-01-06 00:56:58.412839 | orchestrator | skipping: [testbed-node-2] => (item=None)  2026-01-06 00:56:58.412843 | orchestrator | ok: [testbed-node-0] => (item=None) 2026-01-06 00:56:58.412847 | orchestrator | ok: [testbed-node-1 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:56:58.412874 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:56:58.412879 | orchestrator | 2026-01-06 00:56:58.412883 | orchestrator | TASK [ceph-mgr : Copy ceph key(s) if needed] *********************************** 2026-01-06 00:56:58.412887 | orchestrator | Tuesday 06 January 2026 00:51:13 +0000 (0:00:02.289) 0:05:56.366 ******* 2026-01-06 00:56:58.412891 | orchestrator | skipping: [testbed-node-0] => (item=None)  2026-01-06 00:56:58.412895 | orchestrator | skipping: [testbed-node-1] => (item=None)  2026-01-06 00:56:58.412898 | orchestrator | skipping: [testbed-node-2] => (item=None)  2026-01-06 
00:56:58.412902 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-01-06 00:56:58.412909 | orchestrator | changed: [testbed-node-1] => (item=None) 2026-01-06 00:56:58.412915 | orchestrator | changed: [testbed-node-2] => (item=None) 2026-01-06 00:56:58.412921 | orchestrator | 2026-01-06 00:56:58.412926 | orchestrator | TASK [ceph-mgr : Set mgr key permissions] ************************************** 2026-01-06 00:56:58.412933 | orchestrator | Tuesday 06 January 2026 00:51:14 +0000 (0:00:01.176) 0:05:57.542 ******* 2026-01-06 00:56:58.412938 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.412944 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.412950 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.412956 | orchestrator | 2026-01-06 00:56:58.412961 | orchestrator | TASK [ceph-mgr : Append dashboard modules to ceph_mgr_modules] ***************** 2026-01-06 00:56:58.412967 | orchestrator | Tuesday 06 January 2026 00:51:15 +0000 (0:00:00.947) 0:05:58.489 ******* 2026-01-06 00:56:58.412973 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.412978 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.412984 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.412989 | orchestrator | 2026-01-06 00:56:58.412996 | orchestrator | TASK [ceph-mgr : Include pre_requisite.yml] ************************************ 2026-01-06 00:56:58.413002 | orchestrator | Tuesday 06 January 2026 00:51:15 +0000 (0:00:00.269) 0:05:58.759 ******* 2026-01-06 00:56:58.413008 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.413013 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.413019 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.413025 | orchestrator | 2026-01-06 00:56:58.413030 | orchestrator | TASK [ceph-mgr : Include start_mgr.yml] **************************************** 2026-01-06 00:56:58.413037 | orchestrator | Tuesday 06 January 2026 00:51:16 +0000 (0:00:00.316) 
0:05:59.076 ******* 2026-01-06 00:56:58.413043 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/start_mgr.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.413050 | orchestrator | 2026-01-06 00:56:58.413055 | orchestrator | TASK [ceph-mgr : Ensure systemd service override directory exists] ************* 2026-01-06 00:56:58.413061 | orchestrator | Tuesday 06 January 2026 00:51:17 +0000 (0:00:00.892) 0:05:59.968 ******* 2026-01-06 00:56:58.413067 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.413072 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.413078 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.413084 | orchestrator | 2026-01-06 00:56:58.413091 | orchestrator | TASK [ceph-mgr : Add ceph-mgr systemd service overrides] *********************** 2026-01-06 00:56:58.413097 | orchestrator | Tuesday 06 January 2026 00:51:17 +0000 (0:00:00.428) 0:06:00.397 ******* 2026-01-06 00:56:58.413103 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.413109 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.413115 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.413121 | orchestrator | 2026-01-06 00:56:58.413125 | orchestrator | TASK [ceph-mgr : Include_tasks systemd.yml] ************************************ 2026-01-06 00:56:58.413129 | orchestrator | Tuesday 06 January 2026 00:51:17 +0000 (0:00:00.388) 0:06:00.785 ******* 2026-01-06 00:56:58.413133 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/systemd.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.413143 | orchestrator | 2026-01-06 00:56:58.413147 | orchestrator | TASK [ceph-mgr : Generate systemd unit file] *********************************** 2026-01-06 00:56:58.413151 | orchestrator | Tuesday 06 January 2026 00:51:18 +0000 (0:00:00.932) 0:06:01.718 ******* 2026-01-06 00:56:58.413154 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.413158 | orchestrator | 
changed: [testbed-node-1] 2026-01-06 00:56:58.413162 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.413166 | orchestrator | 2026-01-06 00:56:58.413175 | orchestrator | TASK [ceph-mgr : Generate systemd ceph-mgr target file] ************************ 2026-01-06 00:56:58.413178 | orchestrator | Tuesday 06 January 2026 00:51:20 +0000 (0:00:01.690) 0:06:03.408 ******* 2026-01-06 00:56:58.413183 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.413186 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.413190 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.413194 | orchestrator | 2026-01-06 00:56:58.413198 | orchestrator | TASK [ceph-mgr : Enable ceph-mgr.target] *************************************** 2026-01-06 00:56:58.413201 | orchestrator | Tuesday 06 January 2026 00:51:22 +0000 (0:00:01.567) 0:06:04.975 ******* 2026-01-06 00:56:58.413205 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.413209 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.413213 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.413217 | orchestrator | 2026-01-06 00:56:58.413221 | orchestrator | TASK [ceph-mgr : Systemd start mgr] ******************************************** 2026-01-06 00:56:58.413225 | orchestrator | Tuesday 06 January 2026 00:51:24 +0000 (0:00:01.878) 0:06:06.853 ******* 2026-01-06 00:56:58.413228 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.413232 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.413236 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.413240 | orchestrator | 2026-01-06 00:56:58.413244 | orchestrator | TASK [ceph-mgr : Include mgr_modules.yml] ************************************** 2026-01-06 00:56:58.413248 | orchestrator | Tuesday 06 January 2026 00:51:26 +0000 (0:00:02.305) 0:06:09.159 ******* 2026-01-06 00:56:58.413252 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.413256 | orchestrator | skipping: 
[testbed-node-1] 2026-01-06 00:56:58.413259 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/mgr_modules.yml for testbed-node-2 2026-01-06 00:56:58.413263 | orchestrator | 2026-01-06 00:56:58.413267 | orchestrator | TASK [ceph-mgr : Wait for all mgr to be up] ************************************ 2026-01-06 00:56:58.413271 | orchestrator | Tuesday 06 January 2026 00:51:26 +0000 (0:00:00.420) 0:06:09.579 ******* 2026-01-06 00:56:58.413299 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (30 retries left). 2026-01-06 00:56:58.413304 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (29 retries left). 2026-01-06 00:56:58.413308 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (28 retries left). 2026-01-06 00:56:58.413312 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (27 retries left). 2026-01-06 00:56:58.413316 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (26 retries left). 
2026-01-06 00:56:58.413320 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] 2026-01-06 00:56:58.413324 | orchestrator | 2026-01-06 00:56:58.413328 | orchestrator | TASK [ceph-mgr : Get enabled modules from ceph-mgr] **************************** 2026-01-06 00:56:58.413331 | orchestrator | Tuesday 06 January 2026 00:51:57 +0000 (0:00:30.340) 0:06:39.919 ******* 2026-01-06 00:56:58.413335 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] 2026-01-06 00:56:58.413339 | orchestrator | 2026-01-06 00:56:58.413343 | orchestrator | TASK [ceph-mgr : Set _ceph_mgr_modules fact (convert _ceph_mgr_modules.stdout to a dict)] *** 2026-01-06 00:56:58.413346 | orchestrator | Tuesday 06 January 2026 00:51:58 +0000 (0:00:01.351) 0:06:41.271 ******* 2026-01-06 00:56:58.413350 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.413354 | orchestrator | 2026-01-06 00:56:58.413362 | orchestrator | TASK [ceph-mgr : Set _disabled_ceph_mgr_modules fact] ************************** 2026-01-06 00:56:58.413366 | orchestrator | Tuesday 06 January 2026 00:51:58 +0000 (0:00:00.336) 0:06:41.608 ******* 2026-01-06 00:56:58.413370 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.413373 | orchestrator | 2026-01-06 00:56:58.413377 | orchestrator | TASK [ceph-mgr : Disable ceph mgr enabled modules] ***************************** 2026-01-06 00:56:58.413381 | orchestrator | Tuesday 06 January 2026 00:51:58 +0000 (0:00:00.150) 0:06:41.759 ******* 2026-01-06 00:56:58.413385 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=iostat) 2026-01-06 00:56:58.413389 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=nfs) 2026-01-06 00:56:58.413393 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=restful) 2026-01-06 00:56:58.413396 | orchestrator | 2026-01-06 00:56:58.413400 | orchestrator | TASK [ceph-mgr : Add modules to ceph-mgr] 
************************************** 2026-01-06 00:56:58.413404 | orchestrator | Tuesday 06 January 2026 00:52:05 +0000 (0:00:06.741) 0:06:48.500 ******* 2026-01-06 00:56:58.413408 | orchestrator | skipping: [testbed-node-2] => (item=balancer)  2026-01-06 00:56:58.413412 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=dashboard) 2026-01-06 00:56:58.413415 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=prometheus) 2026-01-06 00:56:58.413421 | orchestrator | skipping: [testbed-node-2] => (item=status)  2026-01-06 00:56:58.413428 | orchestrator | 2026-01-06 00:56:58.413434 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2026-01-06 00:56:58.413440 | orchestrator | Tuesday 06 January 2026 00:52:11 +0000 (0:00:05.456) 0:06:53.957 ******* 2026-01-06 00:56:58.413445 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.413451 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.413457 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.413464 | orchestrator | 2026-01-06 00:56:58.413471 | orchestrator | RUNNING HANDLER [ceph-handler : Mgrs handler] ********************************** 2026-01-06 00:56:58.413477 | orchestrator | Tuesday 06 January 2026 00:52:11 +0000 (0:00:00.768) 0:06:54.725 ******* 2026-01-06 00:56:58.413483 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mgrs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.413490 | orchestrator | 2026-01-06 00:56:58.413495 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called before restart] ******** 2026-01-06 00:56:58.413504 | orchestrator | Tuesday 06 January 2026 00:52:12 +0000 (0:00:00.866) 0:06:55.592 ******* 2026-01-06 00:56:58.413508 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.413511 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.413515 | orchestrator | ok: 
[testbed-node-2] 2026-01-06 00:56:58.413519 | orchestrator | 2026-01-06 00:56:58.413543 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mgr restart script] *********************** 2026-01-06 00:56:58.413547 | orchestrator | Tuesday 06 January 2026 00:52:13 +0000 (0:00:00.356) 0:06:55.948 ******* 2026-01-06 00:56:58.413551 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:56:58.413555 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:56:58.413559 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:56:58.413563 | orchestrator | 2026-01-06 00:56:58.413566 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mgr daemon(s)] ******************** 2026-01-06 00:56:58.413570 | orchestrator | Tuesday 06 January 2026 00:52:14 +0000 (0:00:01.140) 0:06:57.089 ******* 2026-01-06 00:56:58.413574 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-01-06 00:56:58.413578 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-01-06 00:56:58.413582 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-01-06 00:56:58.413585 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.413589 | orchestrator | 2026-01-06 00:56:58.413593 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called after restart] ********* 2026-01-06 00:56:58.413597 | orchestrator | Tuesday 06 January 2026 00:52:14 +0000 (0:00:00.562) 0:06:57.651 ******* 2026-01-06 00:56:58.413607 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.413611 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.413615 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.413619 | orchestrator | 2026-01-06 00:56:58.413623 | orchestrator | PLAY [Apply role ceph-osd] ***************************************************** 2026-01-06 00:56:58.413626 | orchestrator | 2026-01-06 00:56:58.413630 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-01-06 
00:56:58.413634 | orchestrator | Tuesday 06 January 2026 00:52:15 +0000 (0:00:00.711) 0:06:58.363 ******* 2026-01-06 00:56:58.413659 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.413664 | orchestrator | 2026-01-06 00:56:58.413668 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2026-01-06 00:56:58.413672 | orchestrator | Tuesday 06 January 2026 00:52:16 +0000 (0:00:00.471) 0:06:58.835 ******* 2026-01-06 00:56:58.413676 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.413680 | orchestrator | 2026-01-06 00:56:58.413683 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-01-06 00:56:58.413687 | orchestrator | Tuesday 06 January 2026 00:52:16 +0000 (0:00:00.648) 0:06:59.483 ******* 2026-01-06 00:56:58.413691 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.413695 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.413701 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.413707 | orchestrator | 2026-01-06 00:56:58.413713 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2026-01-06 00:56:58.413719 | orchestrator | Tuesday 06 January 2026 00:52:16 +0000 (0:00:00.247) 0:06:59.730 ******* 2026-01-06 00:56:58.413726 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.413732 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.413739 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.413745 | orchestrator | 2026-01-06 00:56:58.413751 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-01-06 00:56:58.413754 | orchestrator | Tuesday 06 January 2026 00:52:17 +0000 (0:00:00.626) 0:07:00.356 ******* 
2026-01-06 00:56:58.413760 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.413766 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.413772 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.413778 | orchestrator | 2026-01-06 00:56:58.413784 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-01-06 00:56:58.413791 | orchestrator | Tuesday 06 January 2026 00:52:18 +0000 (0:00:00.654) 0:07:01.011 ******* 2026-01-06 00:56:58.413797 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.413803 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.413807 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.413811 | orchestrator | 2026-01-06 00:56:58.413815 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-01-06 00:56:58.413819 | orchestrator | Tuesday 06 January 2026 00:52:19 +0000 (0:00:01.210) 0:07:02.222 ******* 2026-01-06 00:56:58.413822 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.413826 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.413830 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.413834 | orchestrator | 2026-01-06 00:56:58.413837 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2026-01-06 00:56:58.413841 | orchestrator | Tuesday 06 January 2026 00:52:19 +0000 (0:00:00.291) 0:07:02.514 ******* 2026-01-06 00:56:58.413845 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.413849 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.413855 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.413861 | orchestrator | 2026-01-06 00:56:58.413866 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-01-06 00:56:58.413871 | orchestrator | Tuesday 06 January 2026 00:52:19 +0000 (0:00:00.267) 0:07:02.782 ******* 2026-01-06 00:56:58.413877 | 
orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.413887 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.413893 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.413899 | orchestrator | 2026-01-06 00:56:58.413904 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-01-06 00:56:58.413911 | orchestrator | Tuesday 06 January 2026 00:52:20 +0000 (0:00:00.277) 0:07:03.060 ******* 2026-01-06 00:56:58.413916 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.413922 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.413927 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.413933 | orchestrator | 2026-01-06 00:56:58.413939 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-01-06 00:56:58.413946 | orchestrator | Tuesday 06 January 2026 00:52:21 +0000 (0:00:01.013) 0:07:04.073 ******* 2026-01-06 00:56:58.413952 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.413964 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.413971 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.413976 | orchestrator | 2026-01-06 00:56:58.413982 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-01-06 00:56:58.413988 | orchestrator | Tuesday 06 January 2026 00:52:21 +0000 (0:00:00.732) 0:07:04.806 ******* 2026-01-06 00:56:58.413994 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.413999 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.414005 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.414012 | orchestrator | 2026-01-06 00:56:58.414063 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-01-06 00:56:58.414068 | orchestrator | Tuesday 06 January 2026 00:52:22 +0000 (0:00:00.361) 0:07:05.167 ******* 2026-01-06 00:56:58.414072 | orchestrator | skipping: 
[testbed-node-3] 2026-01-06 00:56:58.414075 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.414079 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.414084 | orchestrator | 2026-01-06 00:56:58.414090 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-01-06 00:56:58.414096 | orchestrator | Tuesday 06 January 2026 00:52:22 +0000 (0:00:00.327) 0:07:05.495 ******* 2026-01-06 00:56:58.414102 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.414108 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.414114 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.414120 | orchestrator | 2026-01-06 00:56:58.414126 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-01-06 00:56:58.414132 | orchestrator | Tuesday 06 January 2026 00:52:23 +0000 (0:00:00.719) 0:07:06.214 ******* 2026-01-06 00:56:58.414137 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.414143 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.414149 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.414154 | orchestrator | 2026-01-06 00:56:58.414161 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-01-06 00:56:58.414168 | orchestrator | Tuesday 06 January 2026 00:52:23 +0000 (0:00:00.424) 0:07:06.639 ******* 2026-01-06 00:56:58.414174 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.414180 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.414195 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.414201 | orchestrator | 2026-01-06 00:56:58.414207 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-01-06 00:56:58.414213 | orchestrator | Tuesday 06 January 2026 00:52:24 +0000 (0:00:00.401) 0:07:07.040 ******* 2026-01-06 00:56:58.414219 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.414225 | 
orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.414231 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.414237 | orchestrator | 2026-01-06 00:56:58.414248 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-01-06 00:56:58.414254 | orchestrator | Tuesday 06 January 2026 00:52:24 +0000 (0:00:00.329) 0:07:07.370 ******* 2026-01-06 00:56:58.414261 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.414266 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.414280 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.414284 | orchestrator | 2026-01-06 00:56:58.414288 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-01-06 00:56:58.414292 | orchestrator | Tuesday 06 January 2026 00:52:25 +0000 (0:00:00.640) 0:07:08.011 ******* 2026-01-06 00:56:58.414296 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.414299 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.414303 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.414307 | orchestrator | 2026-01-06 00:56:58.414311 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-01-06 00:56:58.414314 | orchestrator | Tuesday 06 January 2026 00:52:25 +0000 (0:00:00.337) 0:07:08.349 ******* 2026-01-06 00:56:58.414318 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.414322 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.414326 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.414330 | orchestrator | 2026-01-06 00:56:58.414333 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-01-06 00:56:58.414337 | orchestrator | Tuesday 06 January 2026 00:52:25 +0000 (0:00:00.363) 0:07:08.712 ******* 2026-01-06 00:56:58.414343 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.414349 | orchestrator | ok: 
[testbed-node-4] 2026-01-06 00:56:58.414355 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.414362 | orchestrator | 2026-01-06 00:56:58.414366 | orchestrator | TASK [ceph-osd : Set_fact add_osd] ********************************************* 2026-01-06 00:56:58.414370 | orchestrator | Tuesday 06 January 2026 00:52:26 +0000 (0:00:00.850) 0:07:09.562 ******* 2026-01-06 00:56:58.414373 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.414377 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.414381 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.414385 | orchestrator | 2026-01-06 00:56:58.414388 | orchestrator | TASK [ceph-osd : Set_fact container_exec_cmd] ********************************** 2026-01-06 00:56:58.414392 | orchestrator | Tuesday 06 January 2026 00:52:27 +0000 (0:00:00.426) 0:07:09.988 ******* 2026-01-06 00:56:58.414396 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2026-01-06 00:56:58.414400 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-01-06 00:56:58.414404 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-01-06 00:56:58.414407 | orchestrator | 2026-01-06 00:56:58.414411 | orchestrator | TASK [ceph-osd : Include_tasks system_tuning.yml] ****************************** 2026-01-06 00:56:58.414415 | orchestrator | Tuesday 06 January 2026 00:52:27 +0000 (0:00:00.645) 0:07:10.633 ******* 2026-01-06 00:56:58.414419 | orchestrator | included: /ansible/roles/ceph-osd/tasks/system_tuning.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.414423 | orchestrator | 2026-01-06 00:56:58.414426 | orchestrator | TASK [ceph-osd : Create tmpfiles.d directory] ********************************** 2026-01-06 00:56:58.414430 | orchestrator | Tuesday 06 January 2026 00:52:28 +0000 (0:00:00.439) 0:07:11.073 ******* 2026-01-06 00:56:58.414434 | orchestrator | skipping: 
[testbed-node-3] 2026-01-06 00:56:58.414438 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.414441 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.414445 | orchestrator | 2026-01-06 00:56:58.414449 | orchestrator | TASK [ceph-osd : Disable transparent hugepage] ********************************* 2026-01-06 00:56:58.414458 | orchestrator | Tuesday 06 January 2026 00:52:28 +0000 (0:00:00.396) 0:07:11.469 ******* 2026-01-06 00:56:58.414462 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.414468 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.414474 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.414481 | orchestrator | 2026-01-06 00:56:58.414485 | orchestrator | TASK [ceph-osd : Get default vm.min_free_kbytes] ******************************* 2026-01-06 00:56:58.414489 | orchestrator | Tuesday 06 January 2026 00:52:28 +0000 (0:00:00.230) 0:07:11.699 ******* 2026-01-06 00:56:58.414493 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.414496 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.414509 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.414512 | orchestrator | 2026-01-06 00:56:58.414516 | orchestrator | TASK [ceph-osd : Set_fact vm_min_free_kbytes] ********************************** 2026-01-06 00:56:58.414520 | orchestrator | Tuesday 06 January 2026 00:52:29 +0000 (0:00:00.714) 0:07:12.414 ******* 2026-01-06 00:56:58.414569 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.414573 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.414577 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.414581 | orchestrator | 2026-01-06 00:56:58.414585 | orchestrator | TASK [ceph-osd : Apply operating system tuning] ******************************** 2026-01-06 00:56:58.414592 | orchestrator | Tuesday 06 January 2026 00:52:29 +0000 (0:00:00.313) 0:07:12.728 ******* 2026-01-06 00:56:58.414596 | orchestrator | changed: [testbed-node-4] => (item={'name': 
'fs.aio-max-nr', 'value': '1048576', 'enable': True}) 2026-01-06 00:56:58.414600 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True}) 2026-01-06 00:56:58.414604 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True}) 2026-01-06 00:56:58.414608 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.file-max', 'value': 26234859}) 2026-01-06 00:56:58.414612 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.file-max', 'value': 26234859}) 2026-01-06 00:56:58.414624 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.file-max', 'value': 26234859}) 2026-01-06 00:56:58.414630 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0}) 2026-01-06 00:56:58.414636 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0}) 2026-01-06 00:56:58.414641 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0}) 2026-01-06 00:56:58.414647 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.swappiness', 'value': 10}) 2026-01-06 00:56:58.414653 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.swappiness', 'value': 10}) 2026-01-06 00:56:58.414659 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'}) 2026-01-06 00:56:58.414665 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'}) 2026-01-06 00:56:58.414671 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.swappiness', 'value': 10}) 2026-01-06 00:56:58.414677 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'}) 2026-01-06 00:56:58.414681 | orchestrator | 2026-01-06 00:56:58.414685 | orchestrator | TASK [ceph-osd : Install dependencies] ***************************************** 
2026-01-06 00:56:58.414688 | orchestrator | Tuesday 06 January 2026 00:52:33 +0000 (0:00:03.473) 0:07:16.201 ******* 2026-01-06 00:56:58.414692 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.414696 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.414700 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.414704 | orchestrator | 2026-01-06 00:56:58.414707 | orchestrator | TASK [ceph-osd : Include_tasks common.yml] ************************************* 2026-01-06 00:56:58.414711 | orchestrator | Tuesday 06 January 2026 00:52:33 +0000 (0:00:00.321) 0:07:16.523 ******* 2026-01-06 00:56:58.414715 | orchestrator | included: /ansible/roles/ceph-osd/tasks/common.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.414718 | orchestrator | 2026-01-06 00:56:58.414722 | orchestrator | TASK [ceph-osd : Create bootstrap-osd and osd directories] ********************* 2026-01-06 00:56:58.414726 | orchestrator | Tuesday 06 January 2026 00:52:34 +0000 (0:00:00.542) 0:07:17.065 ******* 2026-01-06 00:56:58.414730 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-osd/) 2026-01-06 00:56:58.414733 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-osd/) 2026-01-06 00:56:58.414738 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/osd/) 2026-01-06 00:56:58.414742 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/osd/) 2026-01-06 00:56:58.414746 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-osd/) 2026-01-06 00:56:58.414754 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/osd/) 2026-01-06 00:56:58.414758 | orchestrator | 2026-01-06 00:56:58.414761 | orchestrator | TASK [ceph-osd : Get keys from monitors] *************************************** 2026-01-06 00:56:58.414765 | orchestrator | Tuesday 06 January 2026 00:52:35 +0000 (0:00:01.304) 0:07:18.369 ******* 2026-01-06 00:56:58.414769 | orchestrator | ok: [testbed-node-3 -> 
testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:56:58.414773 | orchestrator | skipping: [testbed-node-3] => (item=None)  2026-01-06 00:56:58.414777 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2026-01-06 00:56:58.414781 | orchestrator | 2026-01-06 00:56:58.414784 | orchestrator | TASK [ceph-osd : Copy ceph key(s) if needed] *********************************** 2026-01-06 00:56:58.414788 | orchestrator | Tuesday 06 January 2026 00:52:37 +0000 (0:00:02.052) 0:07:20.422 ******* 2026-01-06 00:56:58.414792 | orchestrator | changed: [testbed-node-3] => (item=None) 2026-01-06 00:56:58.414796 | orchestrator | skipping: [testbed-node-3] => (item=None)  2026-01-06 00:56:58.414799 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.414807 | orchestrator | changed: [testbed-node-4] => (item=None) 2026-01-06 00:56:58.414811 | orchestrator | skipping: [testbed-node-4] => (item=None)  2026-01-06 00:56:58.414817 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.414822 | orchestrator | changed: [testbed-node-5] => (item=None) 2026-01-06 00:56:58.414828 | orchestrator | skipping: [testbed-node-5] => (item=None)  2026-01-06 00:56:58.414833 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.414839 | orchestrator | 2026-01-06 00:56:58.414844 | orchestrator | TASK [ceph-osd : Set noup flag] ************************************************ 2026-01-06 00:56:58.414850 | orchestrator | Tuesday 06 January 2026 00:52:38 +0000 (0:00:01.329) 0:07:21.751 ******* 2026-01-06 00:56:58.414857 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2026-01-06 00:56:58.414863 | orchestrator | 2026-01-06 00:56:58.414869 | orchestrator | TASK [ceph-osd : Include_tasks scenarios/lvm.yml] ****************************** 2026-01-06 00:56:58.414876 | orchestrator | Tuesday 06 January 2026 00:52:41 +0000 (0:00:02.184) 0:07:23.936 ******* 2026-01-06 00:56:58.414881 | orchestrator | included: 
/ansible/roles/ceph-osd/tasks/scenarios/lvm.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.414888 | orchestrator | 2026-01-06 00:56:58.414894 | orchestrator | TASK [ceph-osd : Use ceph-volume to create osds] ******************************* 2026-01-06 00:56:58.414900 | orchestrator | Tuesday 06 January 2026 00:52:41 +0000 (0:00:00.616) 0:07:24.552 ******* 2026-01-06 00:56:58.414907 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382', 'data_vg': 'ceph-d44b25a4-5c87-5b50-a8b5-4ed8c19ba382'}) 2026-01-06 00:56:58.414915 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-0ba15c51-2e8d-5c95-884b-d45401cb60d9', 'data_vg': 'ceph-0ba15c51-2e8d-5c95-884b-d45401cb60d9'}) 2026-01-06 00:56:58.414925 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-64d6825f-3ec1-5927-8c89-e441ee427e8a', 'data_vg': 'ceph-64d6825f-3ec1-5927-8c89-e441ee427e8a'}) 2026-01-06 00:56:58.414929 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-1f440738-8941-5354-ae19-38cd939f8e8b', 'data_vg': 'ceph-1f440738-8941-5354-ae19-38cd939f8e8b'}) 2026-01-06 00:56:58.414933 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-588df21e-a0c0-57e7-8c43-2f77be274309', 'data_vg': 'ceph-588df21e-a0c0-57e7-8c43-2f77be274309'}) 2026-01-06 00:56:58.414937 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-e675238b-4f6c-5157-bfd7-95a1b3a689b7', 'data_vg': 'ceph-e675238b-4f6c-5157-bfd7-95a1b3a689b7'}) 2026-01-06 00:56:58.414941 | orchestrator | 2026-01-06 00:56:58.414944 | orchestrator | TASK [ceph-osd : Include_tasks scenarios/lvm-batch.yml] ************************ 2026-01-06 00:56:58.414948 | orchestrator | Tuesday 06 January 2026 00:53:22 +0000 (0:00:41.005) 0:08:05.558 ******* 2026-01-06 00:56:58.414952 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.414961 | orchestrator | skipping: [testbed-node-4] 2026-01-06 
00:56:58.414965 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.414969 | orchestrator | 2026-01-06 00:56:58.414973 | orchestrator | TASK [ceph-osd : Include_tasks start_osds.yml] ********************************* 2026-01-06 00:56:58.414976 | orchestrator | Tuesday 06 January 2026 00:53:23 +0000 (0:00:00.346) 0:08:05.904 ******* 2026-01-06 00:56:58.414980 | orchestrator | included: /ansible/roles/ceph-osd/tasks/start_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.414984 | orchestrator | 2026-01-06 00:56:58.414988 | orchestrator | TASK [ceph-osd : Get osd ids] ************************************************** 2026-01-06 00:56:58.414991 | orchestrator | Tuesday 06 January 2026 00:53:23 +0000 (0:00:00.832) 0:08:06.736 ******* 2026-01-06 00:56:58.414995 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.414999 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.415003 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.415006 | orchestrator | 2026-01-06 00:56:58.415010 | orchestrator | TASK [ceph-osd : Collect osd ids] ********************************************** 2026-01-06 00:56:58.415014 | orchestrator | Tuesday 06 January 2026 00:53:24 +0000 (0:00:00.694) 0:08:07.431 ******* 2026-01-06 00:56:58.415018 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.415021 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.415025 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.415029 | orchestrator | 2026-01-06 00:56:58.415032 | orchestrator | TASK [ceph-osd : Include_tasks systemd.yml] ************************************ 2026-01-06 00:56:58.415036 | orchestrator | Tuesday 06 January 2026 00:53:27 +0000 (0:00:02.848) 0:08:10.279 ******* 2026-01-06 00:56:58.415040 | orchestrator | included: /ansible/roles/ceph-osd/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.415044 | orchestrator | 2026-01-06 00:56:58.415047 | orchestrator | TASK [ceph-osd : 
Generate systemd unit file] *********************************** 2026-01-06 00:56:58.415051 | orchestrator | Tuesday 06 January 2026 00:53:28 +0000 (0:00:00.926) 0:08:11.206 ******* 2026-01-06 00:56:58.415055 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.415058 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.415062 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.415066 | orchestrator | 2026-01-06 00:56:58.415070 | orchestrator | TASK [ceph-osd : Generate systemd ceph-osd target file] ************************ 2026-01-06 00:56:58.415074 | orchestrator | Tuesday 06 January 2026 00:53:29 +0000 (0:00:01.300) 0:08:12.506 ******* 2026-01-06 00:56:58.415078 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.415081 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.415085 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.415089 | orchestrator | 2026-01-06 00:56:58.415093 | orchestrator | TASK [ceph-osd : Enable ceph-osd.target] *************************************** 2026-01-06 00:56:58.415096 | orchestrator | Tuesday 06 January 2026 00:53:31 +0000 (0:00:01.422) 0:08:13.929 ******* 2026-01-06 00:56:58.415100 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.415104 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.415108 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.415111 | orchestrator | 2026-01-06 00:56:58.415118 | orchestrator | TASK [ceph-osd : Ensure systemd service override directory exists] ************* 2026-01-06 00:56:58.415122 | orchestrator | Tuesday 06 January 2026 00:53:32 +0000 (0:00:01.834) 0:08:15.763 ******* 2026-01-06 00:56:58.415126 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415130 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.415133 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.415137 | orchestrator | 2026-01-06 00:56:58.415141 | orchestrator | TASK [ceph-osd : Add ceph-osd 
systemd service overrides] *********************** 2026-01-06 00:56:58.415144 | orchestrator | Tuesday 06 January 2026 00:53:33 +0000 (0:00:00.631) 0:08:16.394 ******* 2026-01-06 00:56:58.415148 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415152 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.415156 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.415159 | orchestrator | 2026-01-06 00:56:58.415163 | orchestrator | TASK [ceph-osd : Ensure /var/lib/ceph/osd/<cluster>-<osd_id> is present] ********* 2026-01-06 00:56:58.415170 | orchestrator | Tuesday 06 January 2026 00:53:33 +0000 (0:00:00.336) 0:08:16.730 ******* 2026-01-06 00:56:58.415174 | orchestrator | ok: [testbed-node-3] => (item=0) 2026-01-06 00:56:58.415178 | orchestrator | ok: [testbed-node-4] => (item=1) 2026-01-06 00:56:58.415182 | orchestrator | ok: [testbed-node-5] => (item=4) 2026-01-06 00:56:58.415185 | orchestrator | ok: [testbed-node-3] => (item=3) 2026-01-06 00:56:58.415189 | orchestrator | ok: [testbed-node-4] => (item=5) 2026-01-06 00:56:58.415193 | orchestrator | ok: [testbed-node-5] => (item=2) 2026-01-06 00:56:58.415196 | orchestrator | 2026-01-06 00:56:58.415200 | orchestrator | TASK [ceph-osd : Write run file in /var/lib/ceph/osd/xxxx/run] ***************** 2026-01-06 00:56:58.415204 | orchestrator | Tuesday 06 January 2026 00:53:35 +0000 (0:00:01.159) 0:08:17.889 ******* 2026-01-06 00:56:58.415208 | orchestrator | changed: [testbed-node-3] => (item=0) 2026-01-06 00:56:58.415212 | orchestrator | changed: [testbed-node-4] => (item=1) 2026-01-06 00:56:58.415215 | orchestrator | changed: [testbed-node-5] => (item=4) 2026-01-06 00:56:58.415219 | orchestrator | changed: [testbed-node-3] => (item=3) 2026-01-06 00:56:58.415223 | orchestrator | changed: [testbed-node-4] => (item=5) 2026-01-06 00:56:58.415230 | orchestrator | changed: [testbed-node-5] => (item=2) 2026-01-06 00:56:58.415233 | orchestrator | 2026-01-06 00:56:58.415237 | orchestrator | TASK [ceph-osd : 
Systemd start osd] ******************************************** 2026-01-06 00:56:58.415243 | orchestrator | Tuesday 06 January 2026 00:53:37 +0000 (0:00:02.257) 0:08:20.147 ******* 2026-01-06 00:56:58.415249 | orchestrator | changed: [testbed-node-3] => (item=0) 2026-01-06 00:56:58.415255 | orchestrator | changed: [testbed-node-4] => (item=1) 2026-01-06 00:56:58.415261 | orchestrator | changed: [testbed-node-5] => (item=4) 2026-01-06 00:56:58.415268 | orchestrator | changed: [testbed-node-3] => (item=3) 2026-01-06 00:56:58.415274 | orchestrator | changed: [testbed-node-4] => (item=5) 2026-01-06 00:56:58.415279 | orchestrator | changed: [testbed-node-5] => (item=2) 2026-01-06 00:56:58.415285 | orchestrator | 2026-01-06 00:56:58.415291 | orchestrator | TASK [ceph-osd : Unset noup flag] ********************************************** 2026-01-06 00:56:58.415297 | orchestrator | Tuesday 06 January 2026 00:53:41 +0000 (0:00:04.168) 0:08:24.315 ******* 2026-01-06 00:56:58.415303 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415309 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.415315 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] 2026-01-06 00:56:58.415322 | orchestrator | 2026-01-06 00:56:58.415328 | orchestrator | TASK [ceph-osd : Wait for all osd to be up] ************************************ 2026-01-06 00:56:58.415334 | orchestrator | Tuesday 06 January 2026 00:53:43 +0000 (0:00:02.227) 0:08:26.543 ******* 2026-01-06 00:56:58.415340 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415346 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.415351 | orchestrator | FAILED - RETRYING: [testbed-node-5 -> testbed-node-0]: Wait for all osd to be up (60 retries left). 
2026-01-06 00:56:58.415355 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] 2026-01-06 00:56:58.415359 | orchestrator | 2026-01-06 00:56:58.415363 | orchestrator | TASK [ceph-osd : Include crush_rules.yml] ************************************** 2026-01-06 00:56:58.415366 | orchestrator | Tuesday 06 January 2026 00:53:56 +0000 (0:00:12.598) 0:08:39.141 ******* 2026-01-06 00:56:58.415371 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415376 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.415382 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.415389 | orchestrator | 2026-01-06 00:56:58.415395 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2026-01-06 00:56:58.415401 | orchestrator | Tuesday 06 January 2026 00:53:57 +0000 (0:00:01.225) 0:08:40.367 ******* 2026-01-06 00:56:58.415407 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415413 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.415417 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.415420 | orchestrator | 2026-01-06 00:56:58.415429 | orchestrator | RUNNING HANDLER [ceph-handler : Osds handler] ********************************** 2026-01-06 00:56:58.415432 | orchestrator | Tuesday 06 January 2026 00:53:57 +0000 (0:00:00.432) 0:08:40.800 ******* 2026-01-06 00:56:58.415436 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.415440 | orchestrator | 2026-01-06 00:56:58.415444 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact trigger_restart] ********************** 2026-01-06 00:56:58.415448 | orchestrator | Tuesday 06 January 2026 00:53:58 +0000 (0:00:00.596) 0:08:41.396 ******* 2026-01-06 00:56:58.415451 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-01-06 00:56:58.415455 | orchestrator | skipping: [testbed-node-3] => 
(item=testbed-node-4)  2026-01-06 00:56:58.415459 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-01-06 00:56:58.415463 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415466 | orchestrator | 2026-01-06 00:56:58.415470 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called before restart] ******** 2026-01-06 00:56:58.415474 | orchestrator | Tuesday 06 January 2026 00:53:59 +0000 (0:00:01.216) 0:08:42.612 ******* 2026-01-06 00:56:58.415478 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415481 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.415489 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.415493 | orchestrator | 2026-01-06 00:56:58.415497 | orchestrator | RUNNING HANDLER [ceph-handler : Unset noup flag] ******************************* 2026-01-06 00:56:58.415500 | orchestrator | Tuesday 06 January 2026 00:54:00 +0000 (0:00:00.343) 0:08:42.956 ******* 2026-01-06 00:56:58.415504 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415508 | orchestrator | 2026-01-06 00:56:58.415511 | orchestrator | RUNNING HANDLER [ceph-handler : Copy osd restart script] *********************** 2026-01-06 00:56:58.415515 | orchestrator | Tuesday 06 January 2026 00:54:00 +0000 (0:00:00.238) 0:08:43.194 ******* 2026-01-06 00:56:58.415519 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415545 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.415551 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.415554 | orchestrator | 2026-01-06 00:56:58.415558 | orchestrator | RUNNING HANDLER [ceph-handler : Get pool list] ********************************* 2026-01-06 00:56:58.415562 | orchestrator | Tuesday 06 January 2026 00:54:00 +0000 (0:00:00.388) 0:08:43.582 ******* 2026-01-06 00:56:58.415565 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415569 | orchestrator | 2026-01-06 00:56:58.415573 | orchestrator | RUNNING 
HANDLER [ceph-handler : Get balancer module status] ******************** 2026-01-06 00:56:58.415576 | orchestrator | Tuesday 06 January 2026 00:54:01 +0000 (0:00:00.236) 0:08:43.818 ******* 2026-01-06 00:56:58.415580 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415584 | orchestrator | 2026-01-06 00:56:58.415588 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact pools_pgautoscaler_mode] ************** 2026-01-06 00:56:58.415592 | orchestrator | Tuesday 06 January 2026 00:54:01 +0000 (0:00:00.246) 0:08:44.065 ******* 2026-01-06 00:56:58.415596 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415599 | orchestrator | 2026-01-06 00:56:58.415603 | orchestrator | RUNNING HANDLER [ceph-handler : Disable balancer] ****************************** 2026-01-06 00:56:58.415607 | orchestrator | Tuesday 06 January 2026 00:54:01 +0000 (0:00:00.139) 0:08:44.205 ******* 2026-01-06 00:56:58.415610 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415614 | orchestrator | 2026-01-06 00:56:58.415622 | orchestrator | RUNNING HANDLER [ceph-handler : Disable pg autoscale on pools] ***************** 2026-01-06 00:56:58.415626 | orchestrator | Tuesday 06 January 2026 00:54:01 +0000 (0:00:00.211) 0:08:44.416 ******* 2026-01-06 00:56:58.415629 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415633 | orchestrator | 2026-01-06 00:56:58.415637 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph osds daemon(s)] ******************* 2026-01-06 00:56:58.415641 | orchestrator | Tuesday 06 January 2026 00:54:02 +0000 (0:00:01.023) 0:08:45.440 ******* 2026-01-06 00:56:58.415644 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-01-06 00:56:58.415652 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-01-06 00:56:58.415656 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-01-06 00:56:58.415660 | orchestrator | skipping: [testbed-node-3] 2026-01-06 
00:56:58.415663 | orchestrator | 2026-01-06 00:56:58.415667 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called after restart] ********* 2026-01-06 00:56:58.415671 | orchestrator | Tuesday 06 January 2026 00:54:03 +0000 (0:00:00.481) 0:08:45.921 ******* 2026-01-06 00:56:58.415675 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415678 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.415682 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.415686 | orchestrator | 2026-01-06 00:56:58.415690 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable pg autoscale on pools] *************** 2026-01-06 00:56:58.415693 | orchestrator | Tuesday 06 January 2026 00:54:03 +0000 (0:00:00.345) 0:08:46.267 ******* 2026-01-06 00:56:58.415697 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415701 | orchestrator | 2026-01-06 00:56:58.415704 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable balancer] **************************** 2026-01-06 00:56:58.415708 | orchestrator | Tuesday 06 January 2026 00:54:03 +0000 (0:00:00.232) 0:08:46.500 ******* 2026-01-06 00:56:58.415712 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415716 | orchestrator | 2026-01-06 00:56:58.415720 | orchestrator | PLAY [Apply role ceph-crash] *************************************************** 2026-01-06 00:56:58.415723 | orchestrator | 2026-01-06 00:56:58.415727 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-01-06 00:56:58.415731 | orchestrator | Tuesday 06 January 2026 00:54:04 +0000 (0:00:01.076) 0:08:47.576 ******* 2026-01-06 00:56:58.415735 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.415740 | orchestrator | 2026-01-06 00:56:58.415744 | orchestrator | TASK [ceph-handler : Include 
check_running_containers.yml] ********************* 2026-01-06 00:56:58.415748 | orchestrator | Tuesday 06 January 2026 00:54:06 +0000 (0:00:01.306) 0:08:48.883 ******* 2026-01-06 00:56:58.415752 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:56:58.415756 | orchestrator | 2026-01-06 00:56:58.415760 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-01-06 00:56:58.415763 | orchestrator | Tuesday 06 January 2026 00:54:07 +0000 (0:00:01.133) 0:08:50.017 ******* 2026-01-06 00:56:58.415767 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415771 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.415775 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.415778 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.415782 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.415786 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.415791 | orchestrator | 2026-01-06 00:56:58.415797 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2026-01-06 00:56:58.415802 | orchestrator | Tuesday 06 January 2026 00:54:08 +0000 (0:00:01.394) 0:08:51.412 ******* 2026-01-06 00:56:58.415808 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.415813 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.415819 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.415826 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.415831 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.415837 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.415843 | orchestrator | 2026-01-06 00:56:58.415853 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-01-06 00:56:58.415860 | orchestrator | Tuesday 06 
January 2026 00:54:09 +0000 (0:00:00.751) 0:08:52.163 ******* 2026-01-06 00:56:58.415865 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.415872 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.415878 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.415890 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.415896 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.415901 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.415907 | orchestrator | 2026-01-06 00:56:58.415913 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-01-06 00:56:58.415919 | orchestrator | Tuesday 06 January 2026 00:54:10 +0000 (0:00:01.115) 0:08:53.279 ******* 2026-01-06 00:56:58.415925 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.415931 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.415936 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.415942 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.415948 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.415953 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.415959 | orchestrator | 2026-01-06 00:56:58.415965 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-01-06 00:56:58.415971 | orchestrator | Tuesday 06 January 2026 00:54:11 +0000 (0:00:00.770) 0:08:54.050 ******* 2026-01-06 00:56:58.415977 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.415983 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.415989 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.415995 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.416000 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.416006 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.416011 | orchestrator | 2026-01-06 00:56:58.416017 | orchestrator | TASK [ceph-handler : Check for a rbd mirror 
container] ************************* 2026-01-06 00:56:58.416023 | orchestrator | Tuesday 06 January 2026 00:54:12 +0000 (0:00:01.368) 0:08:55.419 ******* 2026-01-06 00:56:58.416029 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.416079 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.416098 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.416104 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.416109 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.416116 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.416121 | orchestrator | 2026-01-06 00:56:58.416127 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-01-06 00:56:58.416132 | orchestrator | Tuesday 06 January 2026 00:54:13 +0000 (0:00:00.643) 0:08:56.063 ******* 2026-01-06 00:56:58.416138 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.416143 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.416150 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.416155 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:56:58.416160 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:56:58.416165 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:56:58.416170 | orchestrator | 2026-01-06 00:56:58.416175 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-01-06 00:56:58.416181 | orchestrator | Tuesday 06 January 2026 00:54:14 +0000 (0:00:01.067) 0:08:57.130 ******* 2026-01-06 00:56:58.416187 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.416192 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.416198 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.416204 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:56:58.416210 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:56:58.416216 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:56:58.416222 | 
orchestrator |
2026-01-06 00:56:58.416228 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] **********************
2026-01-06 00:56:58.416234 | orchestrator | Tuesday 06 January 2026 00:54:15 +0000 (0:00:01.115) 0:08:58.246 *******
2026-01-06 00:56:58.416239 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.416246 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.416252 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.416257 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.416263 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:56:58.416269 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:56:58.416275 | orchestrator |
2026-01-06 00:56:58.416282 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] *******************
2026-01-06 00:56:58.416294 | orchestrator | Tuesday 06 January 2026 00:54:16 +0000 (0:00:01.472) 0:08:59.718 *******
2026-01-06 00:56:58.416300 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.416306 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.416312 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.416318 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.416323 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.416330 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.416335 | orchestrator |
2026-01-06 00:56:58.416341 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ******************************
2026-01-06 00:56:58.416347 | orchestrator | Tuesday 06 January 2026 00:54:17 +0000 (0:00:00.683) 0:09:00.401 *******
2026-01-06 00:56:58.416353 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.416359 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.416365 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.416371 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.416377 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:56:58.416383 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:56:58.416389 | orchestrator |
2026-01-06 00:56:58.416395 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ******************************
2026-01-06 00:56:58.416400 | orchestrator | Tuesday 06 January 2026 00:54:18 +0000 (0:00:00.997) 0:09:01.399 *******
2026-01-06 00:56:58.416405 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.416410 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.416416 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.416422 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.416429 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.416435 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.416441 | orchestrator |
2026-01-06 00:56:58.416447 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ******************************
2026-01-06 00:56:58.416452 | orchestrator | Tuesday 06 January 2026 00:54:19 +0000 (0:00:00.577) 0:09:01.977 *******
2026-01-06 00:56:58.416458 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.416464 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.416470 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.416475 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.416481 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.416492 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.416498 | orchestrator |
2026-01-06 00:56:58.416504 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ******************************
2026-01-06 00:56:58.416510 | orchestrator | Tuesday 06 January 2026 00:54:20 +0000 (0:00:00.937) 0:09:02.914 *******
2026-01-06 00:56:58.416517 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.416537 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.416543 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.416549 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.416555 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.416561 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.416567 | orchestrator |
2026-01-06 00:56:58.416575 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ******************************
2026-01-06 00:56:58.416579 | orchestrator | Tuesday 06 January 2026 00:54:20 +0000 (0:00:00.633) 0:09:03.547 *******
2026-01-06 00:56:58.416582 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.416586 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.416590 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.416594 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.416597 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.416601 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.416605 | orchestrator |
2026-01-06 00:56:58.416608 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ******************************
2026-01-06 00:56:58.416612 | orchestrator | Tuesday 06 January 2026 00:54:21 +0000 (0:00:00.879) 0:09:04.427 *******
2026-01-06 00:56:58.416616 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.416624 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.416628 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.416632 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:56:58.416635 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:56:58.416639 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:56:58.416643 | orchestrator |
2026-01-06 00:56:58.416646 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ******************************
2026-01-06 00:56:58.416650 | orchestrator | Tuesday 06 January 2026 00:54:22 +0000 (0:00:00.643) 0:09:05.070 *******
2026-01-06 00:56:58.416654 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.416662 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.416666 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.416670 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.416673 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:56:58.416677 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:56:58.416681 | orchestrator |
2026-01-06 00:56:58.416685 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] ****************************
2026-01-06 00:56:58.416689 | orchestrator | Tuesday 06 January 2026 00:54:23 +0000 (0:00:00.959) 0:09:06.030 *******
2026-01-06 00:56:58.416692 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.416696 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.416700 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.416704 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.416707 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:56:58.416711 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:56:58.416715 | orchestrator |
2026-01-06 00:56:58.416719 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] *************************
2026-01-06 00:56:58.416723 | orchestrator | Tuesday 06 January 2026 00:54:23 +0000 (0:00:00.682) 0:09:06.713 *******
2026-01-06 00:56:58.416726 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.416730 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.416734 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.416737 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.416741 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:56:58.416745 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:56:58.416749 | orchestrator |
2026-01-06 00:56:58.416752 | orchestrator | TASK [ceph-crash : Create client.crash keyring] ********************************
2026-01-06 00:56:58.416756 | orchestrator | Tuesday 06 January 2026 00:54:25 +0000 (0:00:01.464) 0:09:08.177 *******
2026-01-06 00:56:58.416760 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)]
2026-01-06 00:56:58.416764 | orchestrator |
2026-01-06 00:56:58.416768 | orchestrator | TASK [ceph-crash : Get keys from monitors] *************************************
2026-01-06 00:56:58.416771 | orchestrator | Tuesday 06 January 2026 00:54:29 +0000 (0:00:04.389) 0:09:12.567 *******
2026-01-06 00:56:58.416775 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)]
2026-01-06 00:56:58.416779 | orchestrator |
2026-01-06 00:56:58.416783 | orchestrator | TASK [ceph-crash : Copy ceph key(s) if needed] *********************************
2026-01-06 00:56:58.416786 | orchestrator | Tuesday 06 January 2026 00:54:31 +0000 (0:00:02.116) 0:09:14.684 *******
2026-01-06 00:56:58.416790 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.416794 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.416798 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.416801 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.416805 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:56:58.416809 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:56:58.416813 | orchestrator |
2026-01-06 00:56:58.416817 | orchestrator | TASK [ceph-crash : Create /var/lib/ceph/crash/posted] **************************
2026-01-06 00:56:58.416820 | orchestrator | Tuesday 06 January 2026 00:54:33 +0000 (0:00:02.031) 0:09:16.716 *******
2026-01-06 00:56:58.416824 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.416828 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.416832 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.416836 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:56:58.416839 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:56:58.416848 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:56:58.416851 | orchestrator |
2026-01-06 00:56:58.416855 | orchestrator | TASK [ceph-crash : Include_tasks systemd.yml] **********************************
2026-01-06 00:56:58.416859 | orchestrator | Tuesday 06 January 2026 00:54:34 +0000 (0:00:01.089) 0:09:17.805 *******
2026-01-06 00:56:58.416863 | orchestrator | included: /ansible/roles/ceph-crash/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:56:58.416869 | orchestrator |
2026-01-06 00:56:58.416872 | orchestrator | TASK [ceph-crash : Generate systemd unit file for ceph-crash container] ********
2026-01-06 00:56:58.416876 | orchestrator | Tuesday 06 January 2026 00:54:36 +0000 (0:00:01.525) 0:09:19.331 *******
2026-01-06 00:56:58.416880 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.416884 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.416888 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.416897 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:56:58.416901 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:56:58.416905 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:56:58.416909 | orchestrator |
2026-01-06 00:56:58.416913 | orchestrator | TASK [ceph-crash : Start the ceph-crash service] *******************************
2026-01-06 00:56:58.416917 | orchestrator | Tuesday 06 January 2026 00:54:38 +0000 (0:00:01.892) 0:09:21.224 *******
2026-01-06 00:56:58.416920 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.416924 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.416928 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:56:58.416931 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.416935 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:56:58.416939 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:56:58.416943 | orchestrator |
2026-01-06 00:56:58.416946 | orchestrator | RUNNING HANDLER [ceph-handler : Ceph crash handler] ****************************
2026-01-06 00:56:58.416950 | orchestrator | Tuesday 06 January 2026 00:54:41 +0000 (0:00:03.342) 0:09:24.566 *******
2026-01-06 00:56:58.416954 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_crash.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:56:58.416958 | orchestrator |
2026-01-06 00:56:58.416962 | orchestrator | RUNNING HANDLER [ceph-handler : Set _crash_handler_called before restart] ******
2026-01-06 00:56:58.416966 | orchestrator | Tuesday 06 January 2026 00:54:43 +0000 (0:00:01.306) 0:09:25.872 *******
2026-01-06 00:56:58.416970 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.416973 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.416977 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.416981 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.416985 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:56:58.416988 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:56:58.416992 | orchestrator |
2026-01-06 00:56:58.416996 | orchestrator | RUNNING HANDLER [ceph-handler : Restart the ceph-crash service] ****************
2026-01-06 00:56:58.417000 | orchestrator | Tuesday 06 January 2026 00:54:43 +0000 (0:00:00.727) 0:09:26.600 *******
2026-01-06 00:56:58.417004 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.417011 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.417014 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:56:58.417018 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:56:58.417022 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:56:58.417026 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.417029 | orchestrator |
2026-01-06 00:56:58.417033 | orchestrator | RUNNING HANDLER [ceph-handler : Set _crash_handler_called after restart] *******
2026-01-06 00:56:58.417037 | orchestrator | Tuesday 06 January 2026 00:54:46 +0000 (0:00:02.640) 0:09:29.240 *******
2026-01-06 00:56:58.417041 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.417045 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.417049 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.417052 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:56:58.417056 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:56:58.417067 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:56:58.417071 | orchestrator |
2026-01-06 00:56:58.417075 | orchestrator | PLAY [Apply role ceph-mds] *****************************************************
2026-01-06 00:56:58.417079 | orchestrator |
2026-01-06 00:56:58.417083 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************
2026-01-06 00:56:58.417086 | orchestrator | Tuesday 06 January 2026 00:54:47 +0000 (0:00:01.127) 0:09:30.368 *******
2026-01-06 00:56:58.417090 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:56:58.417094 | orchestrator |
2026-01-06 00:56:58.417098 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] *********************
2026-01-06 00:56:58.417102 | orchestrator | Tuesday 06 January 2026 00:54:48 +0000 (0:00:00.675) 0:09:31.043 *******
2026-01-06 00:56:58.417106 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:56:58.417109 | orchestrator |
2026-01-06 00:56:58.417113 | orchestrator | TASK [ceph-handler : Check for a mon container] ********************************
2026-01-06 00:56:58.417117 | orchestrator | Tuesday 06 January 2026 00:54:49 +0000 (0:00:01.090) 0:09:32.134 *******
2026-01-06 00:56:58.417121 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.417124 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.417128 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.417132 | orchestrator |
2026-01-06 00:56:58.417136 | orchestrator | TASK [ceph-handler : Check for an osd container] *******************************
2026-01-06 00:56:58.417140 | orchestrator | Tuesday 06 January 2026 00:54:49 +0000 (0:00:00.313) 0:09:32.448 *******
2026-01-06 00:56:58.417143 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.417147 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.417151 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.417155 | orchestrator |
2026-01-06 00:56:58.417158 | orchestrator | TASK [ceph-handler : Check for a mds container] ********************************
2026-01-06 00:56:58.417162 | orchestrator | Tuesday 06 January 2026 00:54:50 +0000 (0:00:00.787) 0:09:33.235 *******
2026-01-06 00:56:58.417166 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.417170 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.417174 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.417177 | orchestrator |
2026-01-06 00:56:58.417181 | orchestrator | TASK [ceph-handler : Check for a rgw container] ********************************
2026-01-06 00:56:58.417185 | orchestrator | Tuesday 06 January 2026 00:54:51 +0000 (0:00:01.143) 0:09:34.379 *******
2026-01-06 00:56:58.417189 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.417192 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.417196 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.417200 | orchestrator |
2026-01-06 00:56:58.417204 | orchestrator | TASK [ceph-handler : Check for a mgr container] ********************************
2026-01-06 00:56:58.417207 | orchestrator | Tuesday 06 January 2026 00:54:52 +0000 (0:00:00.814) 0:09:35.194 *******
2026-01-06 00:56:58.417211 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.417215 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.417219 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.417223 | orchestrator |
2026-01-06 00:56:58.417226 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] *************************
2026-01-06 00:56:58.417230 | orchestrator | Tuesday 06 January 2026 00:54:52 +0000 (0:00:00.347) 0:09:35.541 *******
2026-01-06 00:56:58.417237 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.417241 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.417245 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.417249 | orchestrator |
2026-01-06 00:56:58.417253 | orchestrator | TASK [ceph-handler : Check for a nfs container] ********************************
2026-01-06 00:56:58.417256 | orchestrator | Tuesday 06 January 2026 00:54:53 +0000 (0:00:00.341) 0:09:35.883 *******
2026-01-06 00:56:58.417260 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.417264 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.417268 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.417275 | orchestrator |
2026-01-06 00:56:58.417279 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] *************************
2026-01-06 00:56:58.417283 | orchestrator | Tuesday 06 January 2026 00:54:53 +0000 (0:00:00.637) 0:09:36.520 *******
2026-01-06 00:56:58.417287 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.417291 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.417294 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.417298 | orchestrator |
2026-01-06 00:56:58.417302 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] **********************
2026-01-06 00:56:58.417305 | orchestrator | Tuesday 06 January 2026 00:54:54 +0000 (0:00:00.797) 0:09:37.317 *******
2026-01-06 00:56:58.417309 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.417313 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.417317 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.417321 | orchestrator |
2026-01-06 00:56:58.417324 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] *******************
2026-01-06 00:56:58.417328 | orchestrator | Tuesday 06 January 2026 00:54:55 +0000 (0:00:00.813) 0:09:38.131 *******
2026-01-06 00:56:58.417332 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.417336 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.417339 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.417343 | orchestrator |
2026-01-06 00:56:58.417347 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ******************************
2026-01-06 00:56:58.417351 | orchestrator | Tuesday 06 January 2026 00:54:55 +0000 (0:00:00.321) 0:09:38.452 *******
2026-01-06 00:56:58.417354 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.417361 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.417366 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.417371 | orchestrator |
2026-01-06 00:56:58.417378 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ******************************
2026-01-06 00:56:58.417383 | orchestrator | Tuesday 06 January 2026 00:54:56 +0000 (0:00:00.638) 0:09:39.091 *******
2026-01-06 00:56:58.417388 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.417394 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.417400 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.417405 | orchestrator |
2026-01-06 00:56:58.417412 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ******************************
2026-01-06 00:56:58.417418 | orchestrator | Tuesday 06 January 2026 00:54:56 +0000 (0:00:00.357) 0:09:39.448 *******
2026-01-06 00:56:58.417424 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.417430 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.417436 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.417442 | orchestrator |
2026-01-06 00:56:58.417448 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ******************************
2026-01-06 00:56:58.417453 | orchestrator | Tuesday 06 January 2026 00:54:57 +0000 (0:00:00.367) 0:09:39.816 *******
2026-01-06 00:56:58.417459 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.417465 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.417471 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.417477 | orchestrator |
2026-01-06 00:56:58.417483 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ******************************
2026-01-06 00:56:58.417489 | orchestrator | Tuesday 06 January 2026 00:54:57 +0000 (0:00:00.344) 0:09:40.161 *******
2026-01-06 00:56:58.417496 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.417502 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.417508 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.417513 | orchestrator |
2026-01-06 00:56:58.417518 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ******************************
2026-01-06 00:56:58.417539 | orchestrator | Tuesday 06 January 2026 00:54:58 +0000 (0:00:00.706) 0:09:40.867 *******
2026-01-06 00:56:58.417545 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.417551 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.417557 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.417562 | orchestrator |
2026-01-06 00:56:58.417568 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ******************************
2026-01-06 00:56:58.417581 | orchestrator | Tuesday 06 January 2026 00:54:58 +0000 (0:00:00.356) 0:09:41.223 *******
2026-01-06 00:56:58.417588 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.417594 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.417600 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.417605 | orchestrator |
2026-01-06 00:56:58.417613 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] ****************************
2026-01-06 00:56:58.417618 | orchestrator | Tuesday 06 January 2026 00:54:58 +0000 (0:00:00.330) 0:09:41.554 *******
2026-01-06 00:56:58.417622 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.417625 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.417629 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.417633 | orchestrator |
2026-01-06 00:56:58.417637 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] *************************
2026-01-06 00:56:58.417640 | orchestrator | Tuesday 06 January 2026 00:54:59 +0000 (0:00:00.373) 0:09:41.927 *******
2026-01-06 00:56:58.417644 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.417648 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.417652 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.417656 | orchestrator |
2026-01-06 00:56:58.417659 | orchestrator | TASK [ceph-mds : Include create_mds_filesystems.yml] ***************************
2026-01-06 00:56:58.417663 | orchestrator | Tuesday 06 January 2026 00:55:00 +0000 (0:00:00.886) 0:09:42.814 *******
2026-01-06 00:56:58.417667 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.417671 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.417674 | orchestrator | included: /ansible/roles/ceph-mds/tasks/create_mds_filesystems.yml for testbed-node-3
2026-01-06 00:56:58.417678 | orchestrator |
2026-01-06 00:56:58.417682 | orchestrator | TASK [ceph-facts : Get current default crush rule details] *********************
2026-01-06 00:56:58.417685 | orchestrator | Tuesday 06 January 2026 00:55:00 +0000 (0:00:00.546) 0:09:43.360 *******
2026-01-06 00:56:58.417693 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)]
2026-01-06 00:56:58.417696 | orchestrator |
2026-01-06 00:56:58.417700 | orchestrator | TASK [ceph-facts : Get current default crush rule name] ************************
2026-01-06 00:56:58.417704 | orchestrator | Tuesday 06 January 2026 00:55:02 +0000 (0:00:02.309) 0:09:45.670 *******
2026-01-06 00:56:58.417709 | orchestrator | skipping: [testbed-node-3] => (item={'rule_id': 0, 'rule_name': 'replicated_rule', 'type': 1, 'steps': [{'op': 'take', 'item': -1, 'item_name': 'default'}, {'op': 'chooseleaf_firstn', 'num': 0, 'type': 'host'}, {'op': 'emit'}]})
2026-01-06 00:56:58.417716 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.417719 | orchestrator |
2026-01-06 00:56:58.417723 | orchestrator | TASK [ceph-mds : Create filesystem pools] **************************************
2026-01-06 00:56:58.417727 | orchestrator | Tuesday 06 January 2026 00:55:03 +0000 (0:00:00.216) 0:09:45.886 *******
2026-01-06 00:56:58.417732 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'application': 'cephfs', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'cephfs_data', 'pg_num': 16, 'pgp_num': 16, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1})
2026-01-06 00:56:58.417743 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'application': 'cephfs', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'cephfs_metadata', 'pg_num': 16, 'pgp_num': 16, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1})
2026-01-06 00:56:58.417748 | orchestrator |
2026-01-06 00:56:58.417754 | orchestrator | TASK [ceph-mds : Create ceph filesystem] ***************************************
2026-01-06 00:56:58.417760 | orchestrator | Tuesday 06 January 2026 00:55:12 +0000 (0:00:09.280) 0:09:55.167 *******
2026-01-06 00:56:58.417771 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)]
2026-01-06 00:56:58.417777 | orchestrator |
2026-01-06 00:56:58.417783 | orchestrator | TASK [ceph-mds : Include common.yml] *******************************************
2026-01-06 00:56:58.417789 | orchestrator | Tuesday 06 January 2026 00:55:15 +0000 (0:00:03.541) 0:09:58.708 *******
2026-01-06 00:56:58.417800 | orchestrator | included: /ansible/roles/ceph-mds/tasks/common.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:56:58.417806 | orchestrator |
2026-01-06 00:56:58.417813 | orchestrator | TASK [ceph-mds : Create bootstrap-mds and mds directories] *********************
2026-01-06 00:56:58.417819 | orchestrator | Tuesday 06 January 2026 00:55:16 +0000 (0:00:00.522) 0:09:59.230 *******
2026-01-06 00:56:58.417825 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mds/)
2026-01-06 00:56:58.417831 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mds/)
2026-01-06 00:56:58.417837 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mds/)
2026-01-06 00:56:58.417844 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mds/ceph-testbed-node-3)
2026-01-06 00:56:58.417849 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mds/ceph-testbed-node-4)
2026-01-06 00:56:58.417855 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mds/ceph-testbed-node-5)
2026-01-06 00:56:58.417861 | orchestrator |
2026-01-06 00:56:58.417867 | orchestrator | TASK [ceph-mds : Get keys from monitors] ***************************************
2026-01-06 00:56:58.417873 | orchestrator | Tuesday 06 January 2026 00:55:17 +0000 (0:00:01.201) 0:10:00.431 *******
2026-01-06 00:56:58.417878 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-01-06 00:56:58.417885 | orchestrator | skipping: [testbed-node-3] => (item=None)
2026-01-06 00:56:58.417891 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}]
2026-01-06 00:56:58.417897 | orchestrator |
2026-01-06 00:56:58.417903 | orchestrator | TASK [ceph-mds : Copy ceph key(s) if needed] ***********************************
2026-01-06 00:56:58.417909 | orchestrator | Tuesday 06 January 2026 00:55:20 +0000 (0:00:02.556) 0:10:02.988 *******
2026-01-06 00:56:58.417917 | orchestrator | changed: [testbed-node-3] => (item=None)
2026-01-06 00:56:58.417921 | orchestrator | skipping: [testbed-node-3] => (item=None)
2026-01-06 00:56:58.417925 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.417928 | orchestrator | changed: [testbed-node-4] => (item=None)
2026-01-06 00:56:58.417932 | orchestrator | skipping: [testbed-node-4] => (item=None)
2026-01-06 00:56:58.417936 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.417939 | orchestrator | changed: [testbed-node-5] => (item=None)
2026-01-06 00:56:58.417943 | orchestrator | skipping: [testbed-node-5] => (item=None)
2026-01-06 00:56:58.417947 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.417951 | orchestrator |
2026-01-06 00:56:58.417954 | orchestrator | TASK [ceph-mds : Create mds keyring] *******************************************
2026-01-06 00:56:58.417958 | orchestrator | Tuesday 06 January 2026 00:55:21 +0000 (0:00:01.818) 0:10:04.807 *******
2026-01-06 00:56:58.417962 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.417965 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.417969 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.417973 | orchestrator |
2026-01-06 00:56:58.417977 | orchestrator | TASK [ceph-mds : Non_containerized.yml] ****************************************
2026-01-06 00:56:58.417980 | orchestrator | Tuesday 06 January 2026 00:55:24 +0000 (0:00:02.940) 0:10:07.748 *******
2026-01-06 00:56:58.417984 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.417988 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.417992 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.417996 | orchestrator |
2026-01-06 00:56:58.417999 | orchestrator | TASK [ceph-mds : Containerized.yml] ********************************************
2026-01-06 00:56:58.418003 | orchestrator | Tuesday 06 January 2026 00:55:25 +0000 (0:00:00.447) 0:10:08.195 *******
2026-01-06 00:56:58.418011 | orchestrator | included: /ansible/roles/ceph-mds/tasks/containerized.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:56:58.418066 | orchestrator |
2026-01-06 00:56:58.418070 | orchestrator | TASK [ceph-mds : Include_tasks systemd.yml] ************************************
2026-01-06 00:56:58.418074 | orchestrator | Tuesday 06 January 2026 00:55:26 +0000 (0:00:00.979) 0:10:09.175 *******
2026-01-06 00:56:58.418083 | orchestrator | included: /ansible/roles/ceph-mds/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:56:58.418087 | orchestrator |
2026-01-06 00:56:58.418091 | orchestrator | TASK [ceph-mds : Generate systemd unit file] ***********************************
2026-01-06 00:56:58.418094 | orchestrator | Tuesday 06 January 2026 00:55:26 +0000 (0:00:00.544) 0:10:09.719 *******
2026-01-06 00:56:58.418098 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.418102 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.418106 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.418109 | orchestrator |
2026-01-06 00:56:58.418113 | orchestrator | TASK [ceph-mds : Generate systemd ceph-mds target file] ************************
2026-01-06 00:56:58.418117 | orchestrator | Tuesday 06 January 2026 00:55:28 +0000 (0:00:01.354) 0:10:11.074 *******
2026-01-06 00:56:58.418120 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.418124 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.418128 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.418132 | orchestrator |
2026-01-06 00:56:58.418135 | orchestrator | TASK [ceph-mds : Enable ceph-mds.target] ***************************************
2026-01-06 00:56:58.418139 | orchestrator | Tuesday 06 January 2026 00:55:30 +0000 (0:00:01.784) 0:10:12.859 *******
2026-01-06 00:56:58.418143 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.418146 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.418150 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.418154 | orchestrator |
2026-01-06 00:56:58.418158 | orchestrator | TASK [ceph-mds : Systemd start mds container] **********************************
2026-01-06 00:56:58.418161 | orchestrator | Tuesday 06 January 2026 00:55:32 +0000 (0:00:01.976) 0:10:14.835 *******
2026-01-06 00:56:58.418165 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.418175 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.418179 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.418182 | orchestrator |
2026-01-06 00:56:58.418186 | orchestrator | TASK [ceph-mds : Wait for mds socket to exist] *********************************
2026-01-06 00:56:58.418190 | orchestrator | Tuesday 06 January 2026 00:55:34 +0000 (0:00:02.159) 0:10:16.995 *******
2026-01-06 00:56:58.418194 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.418197 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.418201 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.418205 | orchestrator |
2026-01-06 00:56:58.418209 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] **********************
2026-01-06 00:56:58.418212 | orchestrator | Tuesday 06 January 2026 00:55:36 +0000 (0:00:01.916) 0:10:18.912 *******
2026-01-06 00:56:58.418216 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.418220 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.418224 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.418228 | orchestrator |
2026-01-06 00:56:58.418231 | orchestrator | RUNNING HANDLER [ceph-handler : Mdss handler] **********************************
2026-01-06 00:56:58.418235 | orchestrator | Tuesday 06 January 2026 00:55:37 +0000 (0:00:01.007) 0:10:19.920 *******
2026-01-06 00:56:58.418239 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mdss.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:56:58.418243 | orchestrator |
2026-01-06 00:56:58.418246 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called before restart] ********
2026-01-06 00:56:58.418250 | orchestrator | Tuesday 06 January 2026 00:55:37 +0000 (0:00:00.685) 0:10:20.606 *******
2026-01-06 00:56:58.418254 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.418258 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.418262 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.418265 | orchestrator |
2026-01-06 00:56:58.418269 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mds restart script] ***********************
2026-01-06 00:56:58.418273 | orchestrator | Tuesday 06 January 2026 00:55:38 +0000 (0:00:00.367) 0:10:20.973 *******
2026-01-06 00:56:58.418277 | orchestrator | changed: [testbed-node-4]
2026-01-06 00:56:58.418280 | orchestrator | changed: [testbed-node-3]
2026-01-06 00:56:58.418284 | orchestrator | changed: [testbed-node-5]
2026-01-06 00:56:58.418293 | orchestrator |
2026-01-06 00:56:58.418297 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mds daemon(s)] ********************
2026-01-06 00:56:58.418300 | orchestrator | Tuesday 06 January 2026 00:55:39 +0000 (0:00:01.341) 0:10:22.314 *******
2026-01-06 00:56:58.418304 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-01-06 00:56:58.418308 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-01-06 00:56:58.418312 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-01-06 00:56:58.418316 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.418320 | orchestrator |
2026-01-06 00:56:58.418324 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called after restart] *********
2026-01-06 00:56:58.418328 | orchestrator | Tuesday 06 January 2026 00:55:40 +0000 (0:00:01.207) 0:10:23.522 *******
2026-01-06 00:56:58.418331 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.418335 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.418339 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.418343 | orchestrator |
2026-01-06 00:56:58.418347 | orchestrator | PLAY [Apply role ceph-rgw] *****************************************************
2026-01-06 00:56:58.418350 | orchestrator |
2026-01-06 00:56:58.418354 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************
2026-01-06 00:56:58.418358 | orchestrator | Tuesday 06 January 2026 00:55:41 +0000 (0:00:01.073) 0:10:24.595 *******
2026-01-06 00:56:58.418362 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:56:58.418366 | orchestrator |
2026-01-06 00:56:58.418369 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] *********************
2026-01-06 00:56:58.418373 | orchestrator | Tuesday 06 January 2026 00:55:42 +0000 (0:00:00.590) 0:10:25.185 *******
2026-01-06 00:56:58.418377 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 00:56:58.418381 | orchestrator |
2026-01-06 00:56:58.418388 | orchestrator | TASK [ceph-handler : Check for a mon container] ********************************
2026-01-06 00:56:58.418392 | orchestrator | Tuesday 06 January 2026 00:55:43 +0000 (0:00:01.388) 0:10:26.574 *******
2026-01-06 00:56:58.418396 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.418400 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.418403 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.418407 | orchestrator |
2026-01-06 00:56:58.418411 | orchestrator | TASK [ceph-handler : Check for an osd container] *******************************
2026-01-06 00:56:58.418415 | orchestrator | Tuesday 06 January 2026 00:55:44 +0000 (0:00:00.832) 0:10:27.407 *******
2026-01-06 00:56:58.418418 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.418422 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.418426 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.418430 | orchestrator |
2026-01-06 00:56:58.418434 | orchestrator | TASK [ceph-handler : Check for a mds container] ********************************
2026-01-06 00:56:58.418438 | orchestrator | Tuesday 06 January 2026 00:55:45 +0000 (0:00:00.825) 0:10:28.233 *******
2026-01-06 00:56:58.418442 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.418445 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.418449 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.418453 | orchestrator |
2026-01-06 00:56:58.418457 | orchestrator | TASK [ceph-handler : Check for a rgw container] ********************************
2026-01-06 00:56:58.418460 | orchestrator | Tuesday 06 January 2026 00:55:46 +0000 (0:00:01.218) 0:10:29.451 *******
2026-01-06 00:56:58.418466 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:56:58.418472 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:56:58.418478 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:56:58.418484 | orchestrator |
2026-01-06 00:56:58.418490 | orchestrator | TASK [ceph-handler : Check for a mgr container] ********************************
2026-01-06 00:56:58.418496 | orchestrator | Tuesday 06 January 2026 00:55:47 +0000 (0:00:00.856) 0:10:30.307 *******
2026-01-06 00:56:58.418503 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.418510 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.418650 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:56:58.418675 | orchestrator |
2026-01-06 00:56:58.418688 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] *************************
2026-01-06 00:56:58.418692 | orchestrator | Tuesday 06 January 2026 00:55:47 +0000 (0:00:00.424) 0:10:30.732 *******
2026-01-06 00:56:58.418696 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:56:58.418700 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:56:58.418704 | orchestrator | skipping:
[testbed-node-5] 2026-01-06 00:56:58.418708 | orchestrator | 2026-01-06 00:56:58.418711 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-01-06 00:56:58.418715 | orchestrator | Tuesday 06 January 2026 00:55:48 +0000 (0:00:00.421) 0:10:31.153 ******* 2026-01-06 00:56:58.418719 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.418723 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.418727 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.418731 | orchestrator | 2026-01-06 00:56:58.418734 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-01-06 00:56:58.418738 | orchestrator | Tuesday 06 January 2026 00:55:49 +0000 (0:00:00.792) 0:10:31.946 ******* 2026-01-06 00:56:58.418742 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.418746 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.418749 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.418753 | orchestrator | 2026-01-06 00:56:58.418757 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-01-06 00:56:58.418761 | orchestrator | Tuesday 06 January 2026 00:55:49 +0000 (0:00:00.839) 0:10:32.786 ******* 2026-01-06 00:56:58.418765 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.418768 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.418772 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.418776 | orchestrator | 2026-01-06 00:56:58.418780 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-01-06 00:56:58.418783 | orchestrator | Tuesday 06 January 2026 00:55:50 +0000 (0:00:00.749) 0:10:33.535 ******* 2026-01-06 00:56:58.418787 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.418791 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.418795 | orchestrator | skipping: [testbed-node-5] 2026-01-06 
00:56:58.418798 | orchestrator | 2026-01-06 00:56:58.418802 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-01-06 00:56:58.418806 | orchestrator | Tuesday 06 January 2026 00:55:51 +0000 (0:00:00.343) 0:10:33.878 ******* 2026-01-06 00:56:58.418810 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.418813 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.418817 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.418821 | orchestrator | 2026-01-06 00:56:58.418825 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-01-06 00:56:58.418828 | orchestrator | Tuesday 06 January 2026 00:55:51 +0000 (0:00:00.682) 0:10:34.560 ******* 2026-01-06 00:56:58.418832 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.418836 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.418840 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.418844 | orchestrator | 2026-01-06 00:56:58.418848 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-01-06 00:56:58.418851 | orchestrator | Tuesday 06 January 2026 00:55:52 +0000 (0:00:00.358) 0:10:34.919 ******* 2026-01-06 00:56:58.418855 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.418859 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.418863 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.418866 | orchestrator | 2026-01-06 00:56:58.418870 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-01-06 00:56:58.418874 | orchestrator | Tuesday 06 January 2026 00:55:52 +0000 (0:00:00.364) 0:10:35.284 ******* 2026-01-06 00:56:58.418878 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.418881 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.418885 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.418889 | orchestrator | 2026-01-06 
00:56:58.418899 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-01-06 00:56:58.418903 | orchestrator | Tuesday 06 January 2026 00:55:52 +0000 (0:00:00.351) 0:10:35.636 ******* 2026-01-06 00:56:58.418907 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.418911 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.418915 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.418918 | orchestrator | 2026-01-06 00:56:58.418922 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-01-06 00:56:58.418931 | orchestrator | Tuesday 06 January 2026 00:55:53 +0000 (0:00:00.667) 0:10:36.303 ******* 2026-01-06 00:56:58.418935 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.418939 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.418943 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.418946 | orchestrator | 2026-01-06 00:56:58.418950 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-01-06 00:56:58.418954 | orchestrator | Tuesday 06 January 2026 00:55:53 +0000 (0:00:00.327) 0:10:36.630 ******* 2026-01-06 00:56:58.418958 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.418961 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.418965 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.418969 | orchestrator | 2026-01-06 00:56:58.418972 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-01-06 00:56:58.418976 | orchestrator | Tuesday 06 January 2026 00:55:54 +0000 (0:00:00.508) 0:10:37.138 ******* 2026-01-06 00:56:58.418980 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.418984 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.418988 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.418991 | orchestrator | 2026-01-06 00:56:58.418995 | 
orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-01-06 00:56:58.418999 | orchestrator | Tuesday 06 January 2026 00:55:54 +0000 (0:00:00.369) 0:10:37.508 ******* 2026-01-06 00:56:58.419003 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.419006 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.419010 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.419014 | orchestrator | 2026-01-06 00:56:58.419018 | orchestrator | TASK [ceph-rgw : Include common.yml] ******************************************* 2026-01-06 00:56:58.419021 | orchestrator | Tuesday 06 January 2026 00:55:55 +0000 (0:00:00.904) 0:10:38.412 ******* 2026-01-06 00:56:58.419025 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/common.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.419029 | orchestrator | 2026-01-06 00:56:58.419033 | orchestrator | TASK [ceph-rgw : Get keys from monitors] *************************************** 2026-01-06 00:56:58.419040 | orchestrator | Tuesday 06 January 2026 00:55:56 +0000 (0:00:00.548) 0:10:38.961 ******* 2026-01-06 00:56:58.419044 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:56:58.419048 | orchestrator | skipping: [testbed-node-3] => (item=None)  2026-01-06 00:56:58.419052 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2026-01-06 00:56:58.419056 | orchestrator | 2026-01-06 00:56:58.419059 | orchestrator | TASK [ceph-rgw : Copy ceph key(s) if needed] *********************************** 2026-01-06 00:56:58.419063 | orchestrator | Tuesday 06 January 2026 00:55:58 +0000 (0:00:01.947) 0:10:40.909 ******* 2026-01-06 00:56:58.419067 | orchestrator | changed: [testbed-node-3] => (item=None) 2026-01-06 00:56:58.419071 | orchestrator | skipping: [testbed-node-3] => (item=None)  2026-01-06 00:56:58.419074 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.419078 | orchestrator 
| changed: [testbed-node-4] => (item=None) 2026-01-06 00:56:58.419082 | orchestrator | skipping: [testbed-node-4] => (item=None)  2026-01-06 00:56:58.419086 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.419090 | orchestrator | changed: [testbed-node-5] => (item=None) 2026-01-06 00:56:58.419093 | orchestrator | skipping: [testbed-node-5] => (item=None)  2026-01-06 00:56:58.419097 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.419101 | orchestrator | 2026-01-06 00:56:58.419109 | orchestrator | TASK [ceph-rgw : Copy SSL certificate & key data to certificate path] ********** 2026-01-06 00:56:58.419113 | orchestrator | Tuesday 06 January 2026 00:55:59 +0000 (0:00:01.492) 0:10:42.401 ******* 2026-01-06 00:56:58.419117 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.419121 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.419124 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.419128 | orchestrator | 2026-01-06 00:56:58.419132 | orchestrator | TASK [ceph-rgw : Include_tasks pre_requisite.yml] ****************************** 2026-01-06 00:56:58.419136 | orchestrator | Tuesday 06 January 2026 00:55:59 +0000 (0:00:00.331) 0:10:42.732 ******* 2026-01-06 00:56:58.419140 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/pre_requisite.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.419143 | orchestrator | 2026-01-06 00:56:58.419147 | orchestrator | TASK [ceph-rgw : Create rados gateway directories] ***************************** 2026-01-06 00:56:58.419151 | orchestrator | Tuesday 06 January 2026 00:56:00 +0000 (0:00:00.571) 0:10:43.304 ******* 2026-01-06 00:56:58.419155 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2026-01-06 00:56:58.419160 | orchestrator | changed: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => 
(item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2026-01-06 00:56:58.419164 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2026-01-06 00:56:58.419168 | orchestrator | 2026-01-06 00:56:58.419171 | orchestrator | TASK [ceph-rgw : Create rgw keyrings] ****************************************** 2026-01-06 00:56:58.419175 | orchestrator | Tuesday 06 January 2026 00:56:01 +0000 (0:00:01.423) 0:10:44.727 ******* 2026-01-06 00:56:58.419179 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:56:58.419183 | orchestrator | changed: [testbed-node-3 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}] 2026-01-06 00:56:58.419187 | orchestrator | changed: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:56:58.419190 | orchestrator | changed: [testbed-node-4 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}] 2026-01-06 00:56:58.419200 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:56:58.419204 | orchestrator | changed: [testbed-node-5 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}] 2026-01-06 00:56:58.419208 | orchestrator | 2026-01-06 00:56:58.419211 | orchestrator | TASK [ceph-rgw : Get keys from monitors] *************************************** 2026-01-06 00:56:58.419215 | orchestrator | Tuesday 06 January 2026 00:56:06 +0000 (0:00:04.264) 0:10:48.992 ******* 2026-01-06 00:56:58.419219 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:56:58.419223 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2026-01-06 00:56:58.419226 | orchestrator | 
ok: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:56:58.419230 | orchestrator | ok: [testbed-node-4 -> {{ groups.get(mon_group_name)[0] }}] 2026-01-06 00:56:58.419234 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:56:58.419238 | orchestrator | ok: [testbed-node-5 -> {{ groups.get(mon_group_name)[0] }}] 2026-01-06 00:56:58.419242 | orchestrator | 2026-01-06 00:56:58.419245 | orchestrator | TASK [ceph-rgw : Copy ceph key(s) if needed] *********************************** 2026-01-06 00:56:58.419249 | orchestrator | Tuesday 06 January 2026 00:56:08 +0000 (0:00:02.347) 0:10:51.339 ******* 2026-01-06 00:56:58.419253 | orchestrator | changed: [testbed-node-3] => (item=None) 2026-01-06 00:56:58.419257 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.419260 | orchestrator | changed: [testbed-node-4] => (item=None) 2026-01-06 00:56:58.419269 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.419273 | orchestrator | changed: [testbed-node-5] => (item=None) 2026-01-06 00:56:58.419277 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.419281 | orchestrator | 2026-01-06 00:56:58.419285 | orchestrator | TASK [ceph-rgw : Rgw pool creation tasks] ************************************** 2026-01-06 00:56:58.419291 | orchestrator | Tuesday 06 January 2026 00:56:09 +0000 (0:00:01.298) 0:10:52.638 ******* 2026-01-06 00:56:58.419295 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/rgw_create_pools.yml for testbed-node-3 2026-01-06 00:56:58.419299 | orchestrator | 2026-01-06 00:56:58.419303 | orchestrator | TASK [ceph-rgw : Create ec profile] ******************************************** 2026-01-06 00:56:58.419306 | orchestrator | Tuesday 06 January 2026 00:56:10 +0000 (0:00:00.229) 0:10:52.867 ******* 2026-01-06 00:56:58.419310 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 
'replicated'}})  2026-01-06 00:56:58.419315 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-01-06 00:56:58.419319 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-01-06 00:56:58.419323 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-01-06 00:56:58.419327 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-01-06 00:56:58.419330 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.419334 | orchestrator | 2026-01-06 00:56:58.419338 | orchestrator | TASK [ceph-rgw : Set crush rule] *********************************************** 2026-01-06 00:56:58.419342 | orchestrator | Tuesday 06 January 2026 00:56:11 +0000 (0:00:01.264) 0:10:54.131 ******* 2026-01-06 00:56:58.419346 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-01-06 00:56:58.419350 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-01-06 00:56:58.419353 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-01-06 00:56:58.419357 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-01-06 00:56:58.419361 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-01-06 00:56:58.419365 | orchestrator | skipping: [testbed-node-3] 2026-01-06 
00:56:58.419369 | orchestrator | 2026-01-06 00:56:58.419373 | orchestrator | TASK [ceph-rgw : Create rgw pools] ********************************************* 2026-01-06 00:56:58.419376 | orchestrator | Tuesday 06 January 2026 00:56:11 +0000 (0:00:00.654) 0:10:54.786 ******* 2026-01-06 00:56:58.419380 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2026-01-06 00:56:58.419384 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2026-01-06 00:56:58.419388 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2026-01-06 00:56:58.419395 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2026-01-06 00:56:58.419399 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2026-01-06 00:56:58.419406 | orchestrator | 2026-01-06 00:56:58.419410 | orchestrator | TASK [ceph-rgw : Include_tasks openstack-keystone.yml] ************************* 2026-01-06 00:56:58.419414 | orchestrator | Tuesday 06 January 2026 00:56:43 +0000 (0:00:31.540) 0:11:26.327 ******* 2026-01-06 00:56:58.419418 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.419422 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.419426 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.419429 | orchestrator | 2026-01-06 00:56:58.419433 | orchestrator | TASK [ceph-rgw : Include_tasks start_radosgw.yml] ****************************** 2026-01-06 00:56:58.419437 | orchestrator | 
Tuesday 06 January 2026 00:56:43 +0000 (0:00:00.366) 0:11:26.693 ******* 2026-01-06 00:56:58.419441 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.419445 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.419448 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.419452 | orchestrator | 2026-01-06 00:56:58.419456 | orchestrator | TASK [ceph-rgw : Include start_docker_rgw.yml] ********************************* 2026-01-06 00:56:58.419460 | orchestrator | Tuesday 06 January 2026 00:56:44 +0000 (0:00:00.346) 0:11:27.040 ******* 2026-01-06 00:56:58.419464 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/start_docker_rgw.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.419467 | orchestrator | 2026-01-06 00:56:58.419471 | orchestrator | TASK [ceph-rgw : Include_task systemd.yml] ************************************* 2026-01-06 00:56:58.419475 | orchestrator | Tuesday 06 January 2026 00:56:45 +0000 (0:00:00.829) 0:11:27.870 ******* 2026-01-06 00:56:58.419479 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.419482 | orchestrator | 2026-01-06 00:56:58.419489 | orchestrator | TASK [ceph-rgw : Generate systemd unit file] *********************************** 2026-01-06 00:56:58.419493 | orchestrator | Tuesday 06 January 2026 00:56:45 +0000 (0:00:00.589) 0:11:28.459 ******* 2026-01-06 00:56:58.419496 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.419500 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.419504 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.419508 | orchestrator | 2026-01-06 00:56:58.419512 | orchestrator | TASK [ceph-rgw : Generate systemd ceph-radosgw target file] ******************** 2026-01-06 00:56:58.419516 | orchestrator | Tuesday 06 January 2026 00:56:46 +0000 (0:00:01.307) 0:11:29.766 ******* 2026-01-06 00:56:58.419520 | orchestrator | changed: 
[testbed-node-3] 2026-01-06 00:56:58.419562 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.419567 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.419571 | orchestrator | 2026-01-06 00:56:58.419575 | orchestrator | TASK [ceph-rgw : Enable ceph-radosgw.target] *********************************** 2026-01-06 00:56:58.419579 | orchestrator | Tuesday 06 January 2026 00:56:48 +0000 (0:00:01.652) 0:11:31.419 ******* 2026-01-06 00:56:58.419583 | orchestrator | changed: [testbed-node-3] 2026-01-06 00:56:58.419586 | orchestrator | changed: [testbed-node-5] 2026-01-06 00:56:58.419590 | orchestrator | changed: [testbed-node-4] 2026-01-06 00:56:58.419594 | orchestrator | 2026-01-06 00:56:58.419598 | orchestrator | TASK [ceph-rgw : Systemd start rgw container] ********************************** 2026-01-06 00:56:58.419602 | orchestrator | Tuesday 06 January 2026 00:56:50 +0000 (0:00:01.934) 0:11:33.353 ******* 2026-01-06 00:56:58.419606 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2026-01-06 00:56:58.419609 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2026-01-06 00:56:58.419613 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2026-01-06 00:56:58.419617 | orchestrator | 2026-01-06 00:56:58.419621 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2026-01-06 00:56:58.419625 | orchestrator | Tuesday 06 January 2026 00:56:53 +0000 (0:00:02.915) 0:11:36.268 ******* 2026-01-06 00:56:58.419634 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.419638 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.419641 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.419645 | orchestrator 
| 2026-01-06 00:56:58.419649 | orchestrator | RUNNING HANDLER [ceph-handler : Rgws handler] ********************************** 2026-01-06 00:56:58.419653 | orchestrator | Tuesday 06 January 2026 00:56:53 +0000 (0:00:00.388) 0:11:36.657 ******* 2026-01-06 00:56:58.419657 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_rgws.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:56:58.419660 | orchestrator | 2026-01-06 00:56:58.419664 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called before restart] ******** 2026-01-06 00:56:58.419668 | orchestrator | Tuesday 06 January 2026 00:56:54 +0000 (0:00:00.543) 0:11:37.201 ******* 2026-01-06 00:56:58.419672 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.419675 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.419679 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.419683 | orchestrator | 2026-01-06 00:56:58.419687 | orchestrator | RUNNING HANDLER [ceph-handler : Copy rgw restart script] *********************** 2026-01-06 00:56:58.419691 | orchestrator | Tuesday 06 January 2026 00:56:55 +0000 (0:00:00.688) 0:11:37.889 ******* 2026-01-06 00:56:58.419694 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:56:58.419698 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:56:58.419702 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:56:58.419706 | orchestrator | 2026-01-06 00:56:58.419709 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph rgw daemon(s)] ******************** 2026-01-06 00:56:58.419713 | orchestrator | Tuesday 06 January 2026 00:56:55 +0000 (0:00:00.375) 0:11:38.265 ******* 2026-01-06 00:56:58.419717 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-01-06 00:56:58.419721 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-01-06 00:56:58.419725 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-01-06 00:56:58.419729 | orchestrator 
| skipping: [testbed-node-3] 2026-01-06 00:56:58.419733 | orchestrator | 2026-01-06 00:56:58.419737 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called after restart] ********* 2026-01-06 00:56:58.419741 | orchestrator | Tuesday 06 January 2026 00:56:56 +0000 (0:00:00.631) 0:11:38.897 ******* 2026-01-06 00:56:58.419744 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:56:58.419748 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:56:58.419752 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:56:58.419756 | orchestrator | 2026-01-06 00:56:58.419760 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:56:58.419764 | orchestrator | testbed-node-0 : ok=134  changed=34  unreachable=0 failed=0 skipped=125  rescued=0 ignored=0 2026-01-06 00:56:58.419769 | orchestrator | testbed-node-1 : ok=127  changed=32  unreachable=0 failed=0 skipped=120  rescued=0 ignored=0 2026-01-06 00:56:58.419773 | orchestrator | testbed-node-2 : ok=134  changed=33  unreachable=0 failed=0 skipped=119  rescued=0 ignored=0 2026-01-06 00:56:58.419777 | orchestrator | testbed-node-3 : ok=193  changed=45  unreachable=0 failed=0 skipped=162  rescued=0 ignored=0 2026-01-06 00:56:58.419780 | orchestrator | testbed-node-4 : ok=175  changed=40  unreachable=0 failed=0 skipped=123  rescued=0 ignored=0 2026-01-06 00:56:58.419787 | orchestrator | testbed-node-5 : ok=177  changed=41  unreachable=0 failed=0 skipped=121  rescued=0 ignored=0 2026-01-06 00:56:58.419791 | orchestrator | 2026-01-06 00:56:58.419795 | orchestrator | 2026-01-06 00:56:58.419799 | orchestrator | 2026-01-06 00:56:58.419803 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:56:58.419807 | orchestrator | Tuesday 06 January 2026 00:56:56 +0000 (0:00:00.265) 0:11:39.162 ******* 2026-01-06 00:56:58.419815 | orchestrator | =============================================================================== 
2026-01-06 00:56:58.419819 | orchestrator | ceph-container-common : Pulling Ceph container image ------------------- 57.92s 2026-01-06 00:56:58.419822 | orchestrator | ceph-osd : Use ceph-volume to create osds ------------------------------ 41.01s 2026-01-06 00:56:58.419826 | orchestrator | ceph-rgw : Create rgw pools -------------------------------------------- 31.54s 2026-01-06 00:56:58.419830 | orchestrator | ceph-mgr : Wait for all mgr to be up ----------------------------------- 30.34s 2026-01-06 00:56:58.419834 | orchestrator | ceph-mon : Waiting for the monitor(s) to form the quorum... ------------ 22.06s 2026-01-06 00:56:58.419837 | orchestrator | ceph-mon : Set cluster configs ----------------------------------------- 14.77s 2026-01-06 00:56:58.419841 | orchestrator | ceph-osd : Wait for all osd to be up ----------------------------------- 12.60s 2026-01-06 00:56:58.419845 | orchestrator | ceph-mgr : Create ceph mgr keyring(s) on a mon node -------------------- 10.68s 2026-01-06 00:56:58.419849 | orchestrator | ceph-mon : Fetch ceph initial keys -------------------------------------- 9.69s 2026-01-06 00:56:58.419852 | orchestrator | ceph-mds : Create filesystem pools -------------------------------------- 9.28s 2026-01-06 00:56:58.419856 | orchestrator | ceph-config : Create ceph initial directories --------------------------- 6.74s 2026-01-06 00:56:58.419860 | orchestrator | ceph-mgr : Disable ceph mgr enabled modules ----------------------------- 6.74s 2026-01-06 00:56:58.419863 | orchestrator | ceph-mgr : Add modules to ceph-mgr -------------------------------------- 5.46s 2026-01-06 00:56:58.419867 | orchestrator | ceph-facts : Set_fact _monitor_addresses - ipv4 ------------------------- 4.51s 2026-01-06 00:56:58.419871 | orchestrator | ceph-crash : Create client.crash keyring -------------------------------- 4.39s 2026-01-06 00:56:58.419875 | orchestrator | ceph-rgw : Create rgw keyrings ------------------------------------------ 4.26s 2026-01-06 
00:56:58.419878 | orchestrator | ceph-osd : Systemd start osd -------------------------------------------- 4.17s
2026-01-06 00:56:58.419912 | orchestrator | ceph-mon : Copy admin keyring over to mons ------------------------------ 4.04s
2026-01-06 00:56:58.419916 | orchestrator | ceph-config : Generate Ceph file ---------------------------------------- 3.76s
2026-01-06 00:56:58.419920 | orchestrator | ceph-mds : Create ceph filesystem --------------------------------------- 3.54s
2026-01-06 00:56:58.419924 | orchestrator | 2026-01-06 00:56:58 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:56:58.419928 | orchestrator | 2026-01-06 00:56:58 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:56:58.419932 | orchestrator | 2026-01-06 00:56:58 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:56:58.419935 | orchestrator | 2026-01-06 00:56:58 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:01.459659 | orchestrator | 2026-01-06 00:57:01 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:01.460770 | orchestrator | 2026-01-06 00:57:01 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:01.462005 | orchestrator | 2026-01-06 00:57:01 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:01.462111 | orchestrator | 2026-01-06 00:57:01 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:04.510469 | orchestrator | 2026-01-06 00:57:04 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:04.511057 | orchestrator | 2026-01-06 00:57:04 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:04.511765 | orchestrator | 2026-01-06 00:57:04 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:04.511794 | orchestrator | 2026-01-06 00:57:04 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:07.549964 | orchestrator | 2026-01-06 00:57:07 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:07.551287 | orchestrator | 2026-01-06 00:57:07 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:07.552905 | orchestrator | 2026-01-06 00:57:07 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:07.552973 | orchestrator | 2026-01-06 00:57:07 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:10.601437 | orchestrator | 2026-01-06 00:57:10 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:10.602737 | orchestrator | 2026-01-06 00:57:10 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:10.606172 | orchestrator | 2026-01-06 00:57:10 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:10.606239 | orchestrator | 2026-01-06 00:57:10 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:13.655487 | orchestrator | 2026-01-06 00:57:13 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:13.655702 | orchestrator | 2026-01-06 00:57:13 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:13.656918 | orchestrator | 2026-01-06 00:57:13 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:13.656963 | orchestrator | 2026-01-06 00:57:13 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:16.704923 | orchestrator | 2026-01-06 00:57:16 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:16.706169 | orchestrator | 2026-01-06 00:57:16 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:16.707597 | orchestrator | 2026-01-06 00:57:16 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:16.707636 | orchestrator | 2026-01-06 00:57:16 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:19.754397 | orchestrator | 2026-01-06 00:57:19 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:19.756154 | orchestrator | 2026-01-06 00:57:19 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:19.759224 | orchestrator | 2026-01-06 00:57:19 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:19.759581 | orchestrator | 2026-01-06 00:57:19 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:22.800039 | orchestrator | 2026-01-06 00:57:22 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:22.800578 | orchestrator | 2026-01-06 00:57:22 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:22.804121 | orchestrator | 2026-01-06 00:57:22 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:22.804165 | orchestrator | 2026-01-06 00:57:22 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:25.851773 | orchestrator | 2026-01-06 00:57:25 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:25.853514 | orchestrator | 2026-01-06 00:57:25 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:25.862836 | orchestrator | 2026-01-06 00:57:25 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:25.862927 | orchestrator | 2026-01-06 00:57:25 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:28.903921 | orchestrator | 2026-01-06 00:57:28 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:28.905031 | orchestrator | 2026-01-06 00:57:28 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:28.906372 | orchestrator | 2026-01-06 00:57:28 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:28.906595 | orchestrator | 2026-01-06 00:57:28 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:31.955337 | orchestrator | 2026-01-06 00:57:31 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:31.956157 | orchestrator | 2026-01-06 00:57:31 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:31.957894 | orchestrator | 2026-01-06 00:57:31 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:31.958080 | orchestrator | 2026-01-06 00:57:31 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:35.011227 | orchestrator | 2026-01-06 00:57:35 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:35.013744 | orchestrator | 2026-01-06 00:57:35 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:35.016065 | orchestrator | 2026-01-06 00:57:35 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:35.016122 | orchestrator | 2026-01-06 00:57:35 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:38.067891 | orchestrator | 2026-01-06 00:57:38 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:38.069782 | orchestrator | 2026-01-06 00:57:38 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:38.072121 | orchestrator | 2026-01-06 00:57:38 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:38.072176 | orchestrator | 2026-01-06 00:57:38 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:41.126389 | orchestrator | 2026-01-06 00:57:41 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:41.128425 | orchestrator | 2026-01-06 00:57:41 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:41.130804 | orchestrator | 2026-01-06 00:57:41 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:41.130868 | orchestrator | 2026-01-06 00:57:41 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:44.187022 | orchestrator | 2026-01-06 00:57:44 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:44.189753 | orchestrator | 2026-01-06 00:57:44 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:44.191655 | orchestrator | 2026-01-06 00:57:44 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:44.191708 | orchestrator | 2026-01-06 00:57:44 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:47.247538 | orchestrator | 2026-01-06 00:57:47 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:47.248910 | orchestrator | 2026-01-06 00:57:47 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:47.251269 | orchestrator | 2026-01-06 00:57:47 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:47.251300 | orchestrator | 2026-01-06 00:57:47 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:50.291071 | orchestrator | 2026-01-06 00:57:50 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:50.292122 | orchestrator | 2026-01-06 00:57:50 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:50.293744 | orchestrator | 2026-01-06 00:57:50 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:50.293797 | orchestrator | 2026-01-06 00:57:50 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:53.338620 | orchestrator | 2026-01-06 00:57:53 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:53.341246 | orchestrator | 2026-01-06 00:57:53 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:53.343197 | orchestrator | 2026-01-06 00:57:53 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:53.343293 | orchestrator | 2026-01-06 00:57:53 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:56.393708 | orchestrator | 2026-01-06 00:57:56 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:56.397984 | orchestrator | 2026-01-06 00:57:56 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:56.400295 | orchestrator | 2026-01-06 00:57:56 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:56.400433 | orchestrator | 2026-01-06 00:57:56 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:57:59.451655 | orchestrator | 2026-01-06 00:57:59 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:57:59.452653 | orchestrator | 2026-01-06 00:57:59 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:57:59.454682 | orchestrator | 2026-01-06 00:57:59 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:57:59.454718 | orchestrator | 2026-01-06 00:57:59 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:58:02.511266 | orchestrator | 2026-01-06 00:58:02 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:58:02.512761 | orchestrator | 2026-01-06 00:58:02 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:58:02.514822 | orchestrator | 2026-01-06 00:58:02 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:58:02.514874 | orchestrator | 2026-01-06 00:58:02 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:58:05.575355 | orchestrator |
2026-01-06 00:58:05 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:58:05.577012 | orchestrator | 2026-01-06 00:58:05 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state STARTED
2026-01-06 00:58:05.583846 | orchestrator | 2026-01-06 00:58:05 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED
2026-01-06 00:58:05.583911 | orchestrator | 2026-01-06 00:58:05 | INFO  | Wait 1 second(s) until the next check
2026-01-06 00:58:08.637369 | orchestrator | 2026-01-06 00:58:08 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED
2026-01-06 00:58:08.639384 | orchestrator | 2026-01-06 00:58:08 | INFO  | Task 45492854-093f-4352-86c4-0f5264a79bbf is in state SUCCESS
2026-01-06 00:58:08.640999 | orchestrator |
2026-01-06 00:58:08.641050 | orchestrator |
2026-01-06 00:58:08.641068 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-01-06 00:58:08.641082 | orchestrator |
2026-01-06 00:58:08.641092 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-01-06 00:58:08.641102 | orchestrator | Tuesday 06 January 2026 00:55:08 +0000 (0:00:00.265) 0:00:00.265 *******
2026-01-06 00:58:08.641113 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:58:08.641127 | orchestrator | ok: [testbed-node-1]
2026-01-06 00:58:08.641138 | orchestrator | ok: [testbed-node-2]
2026-01-06 00:58:08.641150 | orchestrator |
2026-01-06 00:58:08.641189 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-01-06 00:58:08.641198 | orchestrator | Tuesday 06 January 2026 00:55:08 +0000 (0:00:00.301) 0:00:00.566 *******
2026-01-06 00:58:08.641205 | orchestrator | ok: [testbed-node-0] => (item=enable_opensearch_True)
2026-01-06 00:58:08.641213 | orchestrator | ok: [testbed-node-1] => (item=enable_opensearch_True)
2026-01-06 00:58:08.641219 | orchestrator | ok: [testbed-node-2] => (item=enable_opensearch_True)
2026-01-06 00:58:08.641226 | orchestrator |
2026-01-06 00:58:08.641233 | orchestrator | PLAY [Apply role opensearch] ***************************************************
2026-01-06 00:58:08.641240 | orchestrator |
2026-01-06 00:58:08.641246 | orchestrator | TASK [opensearch : include_tasks] **********************************************
2026-01-06 00:58:08.641253 | orchestrator | Tuesday 06 January 2026 00:55:08 +0000 (0:00:00.476) 0:00:01.043 *******
2026-01-06 00:58:08.641260 | orchestrator | included: /ansible/roles/opensearch/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:58:08.641267 | orchestrator |
2026-01-06 00:58:08.641273 | orchestrator | TASK [opensearch : Setting sysctl values] **************************************
2026-01-06 00:58:08.641280 | orchestrator | Tuesday 06 January 2026 00:55:09 +0000 (0:00:00.710) 0:00:01.544 *******
2026-01-06 00:58:08.641287 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-01-06 00:58:08.641294 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-01-06 00:58:08.641301 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-01-06 00:58:08.641307 | orchestrator |
2026-01-06 00:58:08.641314 | orchestrator | TASK [opensearch : Ensuring config directories exist] **************************
2026-01-06 00:58:08.641320 | orchestrator | Tuesday 06 January 2026 00:55:09 +0000 (0:00:00.710) 0:00:02.255 *******
2026-01-06 00:58:08.641346 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:58:08.641356 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:58:08.641376 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:58:08.641394 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:58:08.641407 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:58:08.641416 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:58:08.641432 | orchestrator |
2026-01-06
00:58:08.641440 | orchestrator | TASK [opensearch : include_tasks] **********************************************
2026-01-06 00:58:08.641469 | orchestrator | Tuesday 06 January 2026 00:55:11 +0000 (0:00:01.731) 0:00:03.986 *******
2026-01-06 00:58:08.641481 | orchestrator | included: /ansible/roles/opensearch/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:58:08.641488 | orchestrator |
2026-01-06 00:58:08.641495 | orchestrator | TASK [service-cert-copy : opensearch | Copying over extra CA certificates] *****
2026-01-06 00:58:08.641507 | orchestrator | Tuesday 06 January 2026 00:55:12 +0000 (0:00:00.503) 0:00:04.490 *******
2026-01-06 00:58:08.641515 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:58:08.641522 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:58:08.641533 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:58:08.641541 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:58:08.641561 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:58:08.641569 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:58:08.641577 | orchestrator |
2026-01-06 00:58:08.641584 | orchestrator | TASK [service-cert-copy : opensearch | Copying over backend internal TLS certificate] ***
2026-01-06 00:58:08.641591 | orchestrator | Tuesday 06 January 2026 00:55:14 +0000 (0:00:02.479) 0:00:06.970 *******
2026-01-06 00:58:08.641603 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:58:08.641802 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:58:08.641819 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:58:08.641827 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:58:08.641835 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:58:08.641905 | orchestrator | skipping: [testbed-node-1]
2026-01-06 00:58:08.641920 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:58:08.641940 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:58:08.641947 | orchestrator | skipping: [testbed-node-2]
2026-01-06 00:58:08.641954 | orchestrator |
2026-01-06 00:58:08.641961 | orchestrator | TASK [service-cert-copy : opensearch | Copying over backend internal TLS key] ***
2026-01-06 00:58:08.641968 | orchestrator | Tuesday 06 January 2026 00:55:15 +0000 (0:00:01.192) 0:00:08.162 *******
2026-01-06 00:58:08.641975 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:58:08.641987 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-01-06 00:58:08.641994 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:58:08.642001 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 00:58:08.642066 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl
http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:58:08.642083 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-01-06 00:58:08.642103 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-01-06 00:58:08.642112 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:08.642130 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:08.642137 | orchestrator | 2026-01-06 00:58:08.642144 | orchestrator | TASK [opensearch : Copying over config.json files for services] **************** 2026-01-06 00:58:08.642151 | orchestrator | Tuesday 06 January 2026 00:55:17 +0000 (0:00:01.105) 0:00:09.268 ******* 2026-01-06 00:58:08.642158 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 
00:58:08.642172 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:58:08.642180 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:58:08.642190 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 
'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-01-06 00:58:08.642203 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 
'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-01-06 00:58:08.642216 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-01-06 00:58:08.642223 | orchestrator | 2026-01-06 00:58:08.642230 | orchestrator | TASK [opensearch : Copying over opensearch service config file] **************** 2026-01-06 00:58:08.642237 | orchestrator | Tuesday 06 January 2026 00:55:19 +0000 (0:00:02.701) 0:00:11.970 ******* 2026-01-06 00:58:08.642244 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:08.642251 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:58:08.642260 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:58:08.642271 | orchestrator | 2026-01-06 00:58:08.642282 | 
orchestrator | TASK [opensearch : Copying over opensearch-dashboards config file] ************* 2026-01-06 00:58:08.642293 | orchestrator | Tuesday 06 January 2026 00:55:22 +0000 (0:00:03.202) 0:00:15.172 ******* 2026-01-06 00:58:08.642303 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:08.642314 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:58:08.642326 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:58:08.642336 | orchestrator | 2026-01-06 00:58:08.642346 | orchestrator | TASK [service-check-containers : opensearch | Check containers] **************** 2026-01-06 00:58:08.642356 | orchestrator | Tuesday 06 January 2026 00:55:25 +0000 (0:00:02.120) 0:00:17.293 ******* 2026-01-06 00:58:08.642367 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:58:08.642391 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': 
['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:58:08.642402 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 00:58:08.642421 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-01-06 00:58:08.642434 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-01-06 00:58:08.642493 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 
'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-01-06 00:58:08.642502 | orchestrator | 2026-01-06 00:58:08.642509 | orchestrator | TASK [service-check-containers : opensearch | Notify handlers to restart containers] *** 2026-01-06 00:58:08.642516 | orchestrator | Tuesday 06 January 2026 00:55:27 +0000 (0:00:02.688) 0:00:19.982 ******* 2026-01-06 00:58:08.642523 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 00:58:08.642530 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:58:08.642537 | orchestrator | } 2026-01-06 00:58:08.642544 | orchestrator | changed: [testbed-node-1] => { 2026-01-06 00:58:08.642550 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:58:08.642557 | orchestrator | } 2026-01-06 00:58:08.642564 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 00:58:08.642571 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:58:08.642577 | orchestrator | } 2026-01-06 00:58:08.642584 | 
orchestrator | 2026-01-06 00:58:08.642591 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-01-06 00:58:08.642602 | orchestrator | Tuesday 06 January 2026 00:55:28 +0000 (0:00:00.395) 0:00:20.377 ******* 2026-01-06 00:58:08.642609 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:58:08.642616 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 
'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-01-06 00:58:08.642630 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:08.642640 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:58:08.642652 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-01-06 00:58:08.642660 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:08.642667 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2025.1', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 00:58:08.642674 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2025.1', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-01-06 00:58:08.642687 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:08.642694 | orchestrator | 2026-01-06 00:58:08.642700 | orchestrator | TASK [opensearch : include_tasks] ********************************************** 2026-01-06 00:58:08.642707 | orchestrator | Tuesday 06 January 2026 00:55:29 +0000 (0:00:01.456) 0:00:21.834 ******* 2026-01-06 00:58:08.642717 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:08.642724 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:08.642731 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:08.642738 | orchestrator | 2026-01-06 00:58:08.642745 | orchestrator | TASK [opensearch : Flush handlers] ********************************************* 2026-01-06 00:58:08.642752 | orchestrator | Tuesday 06 January 2026 00:55:29 +0000 (0:00:00.338) 0:00:22.172 ******* 2026-01-06 00:58:08.642758 | orchestrator | 2026-01-06 00:58:08.642766 | orchestrator | TASK [opensearch : Flush handlers] ********************************************* 2026-01-06 00:58:08.642778 | orchestrator | Tuesday 06 January 2026 00:55:30 +0000 (0:00:00.123) 0:00:22.296 ******* 2026-01-06 00:58:08.642794 | orchestrator | 2026-01-06 00:58:08.642808 | 
orchestrator | TASK [opensearch : Flush handlers] *********************************************
2026-01-06 00:58:08.642819 | orchestrator | Tuesday 06 January 2026 00:55:30 +0000 (0:00:00.121) 0:00:22.418 *******
2026-01-06 00:58:08.642830 | orchestrator |
2026-01-06 00:58:08.642842 | orchestrator | RUNNING HANDLER [opensearch : Disable shard allocation] ************************
2026-01-06 00:58:08.642853 | orchestrator | Tuesday 06 January 2026 00:55:30 +0000 (0:00:00.080) 0:00:22.499 *******
2026-01-06 00:58:08.642864 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:58:08.642876 | orchestrator |
2026-01-06 00:58:08.642888 | orchestrator | RUNNING HANDLER [opensearch : Perform a flush] *********************************
2026-01-06 00:58:08.642899 | orchestrator | Tuesday 06 January 2026 00:55:30 +0000 (0:00:00.279) 0:00:22.778 *******
2026-01-06 00:58:08.642912 | orchestrator | skipping: [testbed-node-0]
2026-01-06 00:58:08.642924 | orchestrator |
2026-01-06 00:58:08.642936 | orchestrator | RUNNING HANDLER [opensearch : Restart opensearch container] ********************
2026-01-06 00:58:08.642948 | orchestrator | Tuesday 06 January 2026 00:55:30 +0000 (0:00:00.221) 0:00:23.000 *******
2026-01-06 00:58:08.642960 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:58:08.642972 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:58:08.642979 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:58:08.642986 | orchestrator |
2026-01-06 00:58:08.642992 | orchestrator | RUNNING HANDLER [opensearch : Restart opensearch-dashboards container] *********
2026-01-06 00:58:08.642999 | orchestrator | Tuesday 06 January 2026 00:56:33 +0000 (0:01:02.666) 0:01:25.667 *******
2026-01-06 00:58:08.643006 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:58:08.643012 | orchestrator | changed: [testbed-node-2]
2026-01-06 00:58:08.643019 | orchestrator | changed: [testbed-node-1]
2026-01-06 00:58:08.643025 | orchestrator |
2026-01-06 00:58:08.643032 | orchestrator | TASK [opensearch : include_tasks] **********************************************
2026-01-06 00:58:08.643039 | orchestrator | Tuesday 06 January 2026 00:57:56 +0000 (0:01:23.082) 0:02:48.749 *******
2026-01-06 00:58:08.643058 | orchestrator | included: /ansible/roles/opensearch/tasks/post-config.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 00:58:08.643065 | orchestrator |
2026-01-06 00:58:08.643072 | orchestrator | TASK [opensearch : Wait for OpenSearch to become ready] ************************
2026-01-06 00:58:08.643078 | orchestrator | Tuesday 06 January 2026 00:57:57 +0000 (0:00:00.530) 0:02:49.279 *******
2026-01-06 00:58:08.643085 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:58:08.643092 | orchestrator |
2026-01-06 00:58:08.643099 | orchestrator | TASK [opensearch : Check if a log retention policy exists] *********************
2026-01-06 00:58:08.643105 | orchestrator | Tuesday 06 January 2026 00:57:59 +0000 (0:00:02.493) 0:02:51.772 *******
2026-01-06 00:58:08.643112 | orchestrator | ok: [testbed-node-0]
2026-01-06 00:58:08.643122 | orchestrator |
2026-01-06 00:58:08.643138 | orchestrator | TASK [opensearch : Create new log retention policy] ****************************
2026-01-06 00:58:08.643152 | orchestrator | Tuesday 06 January 2026 00:58:01 +0000 (0:00:02.302) 0:02:54.075 *******
2026-01-06 00:58:08.643162 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:58:08.643177 | orchestrator |
2026-01-06 00:58:08.643191 | orchestrator | TASK [opensearch : Apply retention policy to existing indices] *****************
2026-01-06 00:58:08.643201 | orchestrator | Tuesday 06 January 2026 00:58:05 +0000 (0:00:03.302) 0:02:57.378 *******
2026-01-06 00:58:08.643212 | orchestrator | changed: [testbed-node-0]
2026-01-06 00:58:08.643223 | orchestrator |
2026-01-06 00:58:08.643235 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 00:58:08.643247 | orchestrator | testbed-node-0 : ok=19  changed=12  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0
2026-01-06 00:58:08.643260 | orchestrator | testbed-node-1 : ok=15  changed=10  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0
2026-01-06 00:58:08.643270 | orchestrator | testbed-node-2 : ok=15  changed=10  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0
2026-01-06 00:58:08.643276 | orchestrator |
2026-01-06 00:58:08.643283 | orchestrator |
2026-01-06 00:58:08.643291 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 00:58:08.643303 | orchestrator | Tuesday 06 January 2026 00:58:07 +0000 (0:00:02.472) 0:02:59.850 *******
2026-01-06 00:58:08.643315 | orchestrator | ===============================================================================
2026-01-06 00:58:08.643326 | orchestrator | opensearch : Restart opensearch-dashboards container ------------------- 83.08s
2026-01-06 00:58:08.643336 | orchestrator | opensearch : Restart opensearch container ------------------------------ 62.67s
2026-01-06 00:58:08.643347 | orchestrator | opensearch : Create new log retention policy ---------------------------- 3.30s
2026-01-06 00:58:08.643358 | orchestrator | opensearch : Copying over opensearch service config file ---------------- 3.20s
2026-01-06 00:58:08.643370 | orchestrator | opensearch : Copying over config.json files for services ---------------- 2.70s
2026-01-06 00:58:08.643382 | orchestrator | service-check-containers : opensearch | Check containers ---------------- 2.69s
2026-01-06 00:58:08.643392 | orchestrator | opensearch : Wait for OpenSearch to become ready ------------------------ 2.49s
2026-01-06 00:58:08.643402 | orchestrator | service-cert-copy : opensearch | Copying over extra CA certificates ----- 2.48s
2026-01-06 00:58:08.643415 | orchestrator | opensearch : Apply retention policy to existing indices ----------------- 2.47s
2026-01-06 00:58:08.643422 | orchestrator | opensearch : Check if a log
retention policy exists --------------------- 2.30s 2026-01-06 00:58:08.643428 | orchestrator | opensearch : Copying over opensearch-dashboards config file ------------- 2.12s 2026-01-06 00:58:08.643435 | orchestrator | opensearch : Ensuring config directories exist -------------------------- 1.73s 2026-01-06 00:58:08.643461 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.46s 2026-01-06 00:58:08.643474 | orchestrator | service-cert-copy : opensearch | Copying over backend internal TLS certificate --- 1.19s 2026-01-06 00:58:08.643496 | orchestrator | service-cert-copy : opensearch | Copying over backend internal TLS key --- 1.11s 2026-01-06 00:58:08.643510 | orchestrator | opensearch : Setting sysctl values -------------------------------------- 0.71s 2026-01-06 00:58:08.643520 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.53s 2026-01-06 00:58:08.643531 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.50s 2026-01-06 00:58:08.643541 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.50s 2026-01-06 00:58:08.643552 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.48s 2026-01-06 00:58:08.643563 | orchestrator | 2026-01-06 00:58:08 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:08.643576 | orchestrator | 2026-01-06 00:58:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:11.686393 | orchestrator | 2026-01-06 00:58:11 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:58:11.688971 | orchestrator | 2026-01-06 00:58:11 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:11.689076 | orchestrator | 2026-01-06 00:58:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:14.735095 | orchestrator | 2026-01-06 00:58:14 | 
INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:58:14.736016 | orchestrator | 2026-01-06 00:58:14 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:14.736055 | orchestrator | 2026-01-06 00:58:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:17.782799 | orchestrator | 2026-01-06 00:58:17 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:58:17.783089 | orchestrator | 2026-01-06 00:58:17 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:17.783136 | orchestrator | 2026-01-06 00:58:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:20.831166 | orchestrator | 2026-01-06 00:58:20 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:58:20.832691 | orchestrator | 2026-01-06 00:58:20 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:20.832736 | orchestrator | 2026-01-06 00:58:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:23.873080 | orchestrator | 2026-01-06 00:58:23 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:58:23.874920 | orchestrator | 2026-01-06 00:58:23 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:23.875266 | orchestrator | 2026-01-06 00:58:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:26.919982 | orchestrator | 2026-01-06 00:58:26 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:58:26.921609 | orchestrator | 2026-01-06 00:58:26 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:26.921754 | orchestrator | 2026-01-06 00:58:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:29.969344 | orchestrator | 2026-01-06 00:58:29 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 
2026-01-06 00:58:29.969970 | orchestrator | 2026-01-06 00:58:29 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:29.970008 | orchestrator | 2026-01-06 00:58:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:33.011936 | orchestrator | 2026-01-06 00:58:33 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:58:33.012498 | orchestrator | 2026-01-06 00:58:33 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:33.015045 | orchestrator | 2026-01-06 00:58:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:36.056098 | orchestrator | 2026-01-06 00:58:36 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state STARTED 2026-01-06 00:58:36.059233 | orchestrator | 2026-01-06 00:58:36 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:36.059324 | orchestrator | 2026-01-06 00:58:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:39.103797 | orchestrator | 2026-01-06 00:58:39 | INFO  | Task 62f5edf8-3c57-4053-a390-66815683cd5b is in state SUCCESS 2026-01-06 00:58:39.105899 | orchestrator | 2026-01-06 00:58:39.106110 | orchestrator | 2026-01-06 00:58:39.106136 | orchestrator | PLAY [Set kolla_action_mariadb] ************************************************ 2026-01-06 00:58:39.106174 | orchestrator | 2026-01-06 00:58:39.106187 | orchestrator | TASK [Inform the user about the following task] ******************************** 2026-01-06 00:58:39.106199 | orchestrator | Tuesday 06 January 2026 00:55:07 +0000 (0:00:00.092) 0:00:00.092 ******* 2026-01-06 00:58:39.106210 | orchestrator | ok: [localhost] => { 2026-01-06 00:58:39.106223 | orchestrator |  "msg": "The task 'Check MariaDB service' fails if the MariaDB service has not yet been deployed. This is fine." 
2026-01-06 00:58:39.106234 | orchestrator | } 2026-01-06 00:58:39.106246 | orchestrator | 2026-01-06 00:58:39.106258 | orchestrator | TASK [Check MariaDB service] *************************************************** 2026-01-06 00:58:39.106270 | orchestrator | Tuesday 06 January 2026 00:55:07 +0000 (0:00:00.068) 0:00:00.160 ******* 2026-01-06 00:58:39.106281 | orchestrator | fatal: [localhost]: FAILED! => {"changed": false, "elapsed": 2, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.9:3306"} 2026-01-06 00:58:39.106295 | orchestrator | ...ignoring 2026-01-06 00:58:39.106307 | orchestrator | 2026-01-06 00:58:39.106318 | orchestrator | TASK [Set kolla_action_mariadb = upgrade if MariaDB is already running] ******** 2026-01-06 00:58:39.106329 | orchestrator | Tuesday 06 January 2026 00:55:10 +0000 (0:00:02.868) 0:00:03.029 ******* 2026-01-06 00:58:39.106340 | orchestrator | skipping: [localhost] 2026-01-06 00:58:39.106351 | orchestrator | 2026-01-06 00:58:39.106363 | orchestrator | TASK [Set kolla_action_mariadb = kolla_action_ng] ****************************** 2026-01-06 00:58:39.106373 | orchestrator | Tuesday 06 January 2026 00:55:10 +0000 (0:00:00.069) 0:00:03.099 ******* 2026-01-06 00:58:39.106385 | orchestrator | ok: [localhost] 2026-01-06 00:58:39.106396 | orchestrator | 2026-01-06 00:58:39.106407 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-01-06 00:58:39.106529 | orchestrator | 2026-01-06 00:58:39.106544 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-01-06 00:58:39.106557 | orchestrator | Tuesday 06 January 2026 00:55:11 +0000 (0:00:00.177) 0:00:03.277 ******* 2026-01-06 00:58:39.106570 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:58:39.106583 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:58:39.106596 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:58:39.106610 | orchestrator | 2026-01-06 00:58:39.106623 | 
orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-01-06 00:58:39.106715 | orchestrator | Tuesday 06 January 2026 00:55:11 +0000 (0:00:00.315) 0:00:03.592 ******* 2026-01-06 00:58:39.106728 | orchestrator | ok: [testbed-node-0] => (item=enable_mariadb_True) 2026-01-06 00:58:39.106739 | orchestrator | ok: [testbed-node-1] => (item=enable_mariadb_True) 2026-01-06 00:58:39.106750 | orchestrator | ok: [testbed-node-2] => (item=enable_mariadb_True) 2026-01-06 00:58:39.106761 | orchestrator | 2026-01-06 00:58:39.106772 | orchestrator | PLAY [Apply role mariadb] ****************************************************** 2026-01-06 00:58:39.106783 | orchestrator | 2026-01-06 00:58:39.106794 | orchestrator | TASK [mariadb : Group MariaDB hosts based on shards] *************************** 2026-01-06 00:58:39.106805 | orchestrator | Tuesday 06 January 2026 00:55:11 +0000 (0:00:00.494) 0:00:04.087 ******* 2026-01-06 00:58:39.106816 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2026-01-06 00:58:39.106862 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-1) 2026-01-06 00:58:39.106875 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-2) 2026-01-06 00:58:39.106886 | orchestrator | 2026-01-06 00:58:39.106897 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2026-01-06 00:58:39.106908 | orchestrator | Tuesday 06 January 2026 00:55:12 +0000 (0:00:00.350) 0:00:04.437 ******* 2026-01-06 00:58:39.106919 | orchestrator | included: /ansible/roles/mariadb/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:58:39.106932 | orchestrator | 2026-01-06 00:58:39.106943 | orchestrator | TASK [mariadb : Ensuring config directories exist] ***************************** 2026-01-06 00:58:39.106955 | orchestrator | Tuesday 06 January 2026 00:55:12 +0000 (0:00:00.681) 0:00:05.118 ******* 2026-01-06 00:58:39.107041 | orchestrator | changed: 
[testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-01-06 00:58:39.107062 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 
'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-01-06 00:58:39.107091 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-01-06 00:58:39.107104 | orchestrator | 2026-01-06 00:58:39.107151 | orchestrator | TASK [mariadb : Ensuring database backup config directory exists] ************** 2026-01-06 00:58:39.107165 | orchestrator | Tuesday 06 January 2026 00:55:15 +0000 (0:00:02.571) 0:00:07.689 ******* 2026-01-06 00:58:39.107176 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.107189 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.107200 | 
orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.107211 | orchestrator | 2026-01-06 00:58:39.107222 | orchestrator | TASK [mariadb : Copying over my.cnf for mariabackup] *************************** 2026-01-06 00:58:39.107233 | orchestrator | Tuesday 06 January 2026 00:55:16 +0000 (0:00:00.672) 0:00:08.362 ******* 2026-01-06 00:58:39.107244 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.107255 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.107266 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.107277 | orchestrator | 2026-01-06 00:58:39.107288 | orchestrator | TASK [mariadb : Copying over config.json files for services] ******************* 2026-01-06 00:58:39.107299 | orchestrator | Tuesday 06 January 2026 00:55:17 +0000 (0:00:01.736) 0:00:10.098 ******* 2026-01-06 00:58:39.107328 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 
3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-01-06 00:58:39.107363 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 
'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-01-06 00:58:39.107393 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout 
client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-01-06 00:58:39.107579 | orchestrator | 2026-01-06 00:58:39.107621 | orchestrator | TASK [mariadb : Copying over config.json files for mariabackup] **************** 2026-01-06 00:58:39.107643 | orchestrator | Tuesday 06 January 2026 00:55:22 +0000 (0:00:05.017) 0:00:15.116 ******* 2026-01-06 00:58:39.107659 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.107670 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.107681 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.107700 | orchestrator | 2026-01-06 00:58:39.107714 | orchestrator | TASK [mariadb : Copying over galera.cnf] *************************************** 2026-01-06 00:58:39.107725 | orchestrator | Tuesday 06 January 2026 00:55:24 +0000 (0:00:01.158) 0:00:16.275 ******* 2026-01-06 00:58:39.107737 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.107756 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:58:39.107770 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:58:39.107781 | orchestrator | 2026-01-06 00:58:39.107792 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2026-01-06 00:58:39.107803 | orchestrator | Tuesday 06 January 2026 00:55:28 +0000 (0:00:04.930) 0:00:21.205 ******* 2026-01-06 00:58:39.107814 | orchestrator | included: /ansible/roles/mariadb/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:58:39.107825 | orchestrator | 2026-01-06 00:58:39.107836 | orchestrator | TASK [service-cert-copy : mariadb | Copying over extra CA certificates] ******** 2026-01-06 
00:58:39.107847 | orchestrator | Tuesday 06 January 2026 00:55:29 +0000 (0:00:00.631) 0:00:21.836 ******* 2026-01-06 00:58:39.107959 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.107986 | orchestrator | 
skipping: [testbed-node-0] 2026-01-06 00:58:39.107999 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.108012 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.108034 | orchestrator | 
skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.108046 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.108055 | orchestrator | 2026-01-06 00:58:39.108065 | orchestrator | TASK [service-cert-copy : mariadb 
| Copying over backend internal TLS certificate] *** 2026-01-06 00:58:39.108081 | orchestrator | Tuesday 06 January 2026 00:55:32 +0000 (0:00:03.207) 0:00:25.044 ******* 2026-01-06 00:58:39.108091 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 
5 backup', '']}}}})  2026-01-06 00:58:39.108112 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.108124 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.108133 | orchestrator | skipping: 
[testbed-node-2] 2026-01-06 00:58:39.108148 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.108171 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.108181 | orchestrator | 2026-01-06 
00:58:39.108191 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS key] ***** 2026-01-06 00:58:39.108200 | orchestrator | Tuesday 06 January 2026 00:55:35 +0000 (0:00:02.417) 0:00:27.462 ******* 2026-01-06 00:58:39.108216 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server 
testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.108227 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.108245 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 
backup', '']}}}})  2026-01-06 00:58:39.108271 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.108282 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.108292 | orchestrator | skipping: 
[testbed-node-1] 2026-01-06 00:58:39.108302 | orchestrator | 2026-01-06 00:58:39.108311 | orchestrator | TASK [service-check-containers : mariadb | Check containers] ******************* 2026-01-06 00:58:39.108322 | orchestrator | Tuesday 06 January 2026 00:55:38 +0000 (0:00:03.057) 0:00:30.519 ******* 2026-01-06 00:58:39.108345 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 
check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-01-06 00:58:39.108363 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', 
'']}}}}) 2026-01-06 00:58:39.108387 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-01-06 00:58:39.108405 | orchestrator | 2026-01-06 00:58:39.108448 | orchestrator | TASK [service-check-containers : mariadb | Notify 
handlers to restart containers] *** 2026-01-06 00:58:39.108460 | orchestrator | Tuesday 06 January 2026 00:55:42 +0000 (0:00:04.258) 0:00:34.777 ******* 2026-01-06 00:58:39.108470 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 00:58:39.108479 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:58:39.108489 | orchestrator | } 2026-01-06 00:58:39.108500 | orchestrator | changed: [testbed-node-1] => { 2026-01-06 00:58:39.108509 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:58:39.108520 | orchestrator | } 2026-01-06 00:58:39.108529 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 00:58:39.108539 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:58:39.108549 | orchestrator | } 2026-01-06 00:58:39.108559 | orchestrator | 2026-01-06 00:58:39.108569 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-01-06 00:58:39.108579 | orchestrator | Tuesday 06 January 2026 00:55:43 +0000 (0:00:00.692) 0:00:35.470 ******* 2026-01-06 00:58:39.108589 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option 
srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.108608 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.108632 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server 
testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.108644 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.108654 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' 
server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.108665 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.108677 | orchestrator | 2026-01-06 00:58:39.108694 | orchestrator | TASK [mariadb : Checking for mariadb cluster] ********************************** 2026-01-06 00:58:39.108711 | orchestrator | Tuesday 06 January 2026 00:55:46 +0000 (0:00:03.210) 0:00:38.680 ******* 2026-01-06 00:58:39.108721 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.108731 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.108741 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.108750 | orchestrator | 2026-01-06 00:58:39.108760 | orchestrator | TASK [mariadb : Cleaning up temp file on localhost] **************************** 2026-01-06 00:58:39.108770 | orchestrator | Tuesday 06 January 2026 00:55:46 +0000 (0:00:00.335) 0:00:39.015 ******* 2026-01-06 00:58:39.108780 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.108790 | orchestrator | 2026-01-06 00:58:39.108799 | orchestrator | TASK [mariadb : Stop MariaDB containers] *************************************** 2026-01-06 00:58:39.108809 | orchestrator | Tuesday 06 January 2026 00:55:46 +0000 (0:00:00.136) 0:00:39.152 ******* 2026-01-06 00:58:39.108819 | orchestrator | skipping: 
[testbed-node-0] 2026-01-06 00:58:39.108829 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.108842 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.108852 | orchestrator | 2026-01-06 00:58:39.108862 | orchestrator | TASK [mariadb : Run MariaDB wsrep recovery] ************************************ 2026-01-06 00:58:39.108872 | orchestrator | Tuesday 06 January 2026 00:55:47 +0000 (0:00:00.586) 0:00:39.739 ******* 2026-01-06 00:58:39.108888 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.108898 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.108907 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.108917 | orchestrator | 2026-01-06 00:58:39.108927 | orchestrator | TASK [mariadb : Copying MariaDB log file to /tmp] ****************************** 2026-01-06 00:58:39.108936 | orchestrator | Tuesday 06 January 2026 00:55:47 +0000 (0:00:00.375) 0:00:40.114 ******* 2026-01-06 00:58:39.108946 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.108956 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.108966 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.108976 | orchestrator | 2026-01-06 00:58:39.108985 | orchestrator | TASK [mariadb : Get MariaDB wsrep recovery seqno] ****************************** 2026-01-06 00:58:39.108995 | orchestrator | Tuesday 06 January 2026 00:55:48 +0000 (0:00:00.410) 0:00:40.524 ******* 2026-01-06 00:58:39.109005 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.109015 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.109025 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.109035 | orchestrator | 2026-01-06 00:58:39.109045 | orchestrator | TASK [mariadb : Removing MariaDB log file from /tmp] *************************** 2026-01-06 00:58:39.109055 | orchestrator | Tuesday 06 January 2026 00:55:48 +0000 (0:00:00.377) 0:00:40.902 ******* 2026-01-06 00:58:39.109064 | orchestrator | skipping: 
[testbed-node-0] 2026-01-06 00:58:39.109074 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.109084 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.109094 | orchestrator | 2026-01-06 00:58:39.109104 | orchestrator | TASK [mariadb : Registering MariaDB seqno variable] **************************** 2026-01-06 00:58:39.109114 | orchestrator | Tuesday 06 January 2026 00:55:49 +0000 (0:00:00.690) 0:00:41.593 ******* 2026-01-06 00:58:39.109124 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.109134 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.109144 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.109153 | orchestrator | 2026-01-06 00:58:39.109163 | orchestrator | TASK [mariadb : Comparing seqno value on all mariadb hosts] ******************** 2026-01-06 00:58:39.109173 | orchestrator | Tuesday 06 January 2026 00:55:49 +0000 (0:00:00.368) 0:00:41.961 ******* 2026-01-06 00:58:39.109184 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-01-06 00:58:39.109194 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-01-06 00:58:39.109204 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-01-06 00:58:39.109213 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.109223 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)  2026-01-06 00:58:39.109233 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)  2026-01-06 00:58:39.109248 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)  2026-01-06 00:58:39.109258 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.109271 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)  2026-01-06 00:58:39.109287 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)  2026-01-06 00:58:39.109308 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)  2026-01-06 00:58:39.109331 | 
orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.109347 | orchestrator | 2026-01-06 00:58:39.109363 | orchestrator | TASK [mariadb : Writing hostname of host with the largest seqno to temp file] *** 2026-01-06 00:58:39.109378 | orchestrator | Tuesday 06 January 2026 00:55:50 +0000 (0:00:00.393) 0:00:42.355 ******* 2026-01-06 00:58:39.109438 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.109456 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.109473 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.109488 | orchestrator | 2026-01-06 00:58:39.109506 | orchestrator | TASK [mariadb : Registering mariadb_recover_inventory_name from temp file] ***** 2026-01-06 00:58:39.109522 | orchestrator | Tuesday 06 January 2026 00:55:50 +0000 (0:00:00.360) 0:00:42.716 ******* 2026-01-06 00:58:39.109537 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.109547 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.109557 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.109567 | orchestrator | 2026-01-06 00:58:39.109576 | orchestrator | TASK [mariadb : Store bootstrap and master hostnames into facts] *************** 2026-01-06 00:58:39.109586 | orchestrator | Tuesday 06 January 2026 00:55:51 +0000 (0:00:00.558) 0:00:43.274 ******* 2026-01-06 00:58:39.109596 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.109606 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.109615 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.109625 | orchestrator | 2026-01-06 00:58:39.109635 | orchestrator | TASK [mariadb : Set grastate.dat file from MariaDB container in bootstrap host] *** 2026-01-06 00:58:39.109645 | orchestrator | Tuesday 06 January 2026 00:55:51 +0000 (0:00:00.334) 0:00:43.609 ******* 2026-01-06 00:58:39.109656 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.109665 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.109675 | 
orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.109685 | orchestrator | 2026-01-06 00:58:39.109695 | orchestrator | TASK [mariadb : Starting first MariaDB container] ****************************** 2026-01-06 00:58:39.109705 | orchestrator | Tuesday 06 January 2026 00:55:51 +0000 (0:00:00.362) 0:00:43.972 ******* 2026-01-06 00:58:39.109715 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.109725 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.109735 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.109744 | orchestrator | 2026-01-06 00:58:39.109754 | orchestrator | TASK [mariadb : Wait for first MariaDB container] ****************************** 2026-01-06 00:58:39.109764 | orchestrator | Tuesday 06 January 2026 00:55:52 +0000 (0:00:00.313) 0:00:44.285 ******* 2026-01-06 00:58:39.109774 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.109783 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.109793 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.109803 | orchestrator | 2026-01-06 00:58:39.109813 | orchestrator | TASK [mariadb : Set first MariaDB container as primary] ************************ 2026-01-06 00:58:39.109823 | orchestrator | Tuesday 06 January 2026 00:55:52 +0000 (0:00:00.364) 0:00:44.650 ******* 2026-01-06 00:58:39.109839 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.109849 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.109859 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.109869 | orchestrator | 2026-01-06 00:58:39.109878 | orchestrator | TASK [mariadb : Wait for MariaDB to become operational] ************************ 2026-01-06 00:58:39.109896 | orchestrator | Tuesday 06 January 2026 00:55:53 +0000 (0:00:00.658) 0:00:45.308 ******* 2026-01-06 00:58:39.109907 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.109917 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.109927 | 
orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.109947 | orchestrator | 2026-01-06 00:58:39.109957 | orchestrator | TASK [mariadb : Restart slave MariaDB container(s)] **************************** 2026-01-06 00:58:39.109967 | orchestrator | Tuesday 06 January 2026 00:55:53 +0000 (0:00:00.364) 0:00:45.672 ******* 2026-01-06 00:58:39.109978 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server 
testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.109990 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.110005 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 
backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.110065 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.110095 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 
2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.110108 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.110118 | orchestrator | 2026-01-06 00:58:39.110128 | orchestrator | TASK [mariadb : Wait for slave MariaDB] **************************************** 2026-01-06 00:58:39.110139 | orchestrator | Tuesday 06 January 2026 00:55:55 +0000 (0:00:02.495) 0:00:48.168 ******* 2026-01-06 00:58:39.110149 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.110159 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.110169 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.110236 | orchestrator | 2026-01-06 00:58:39.110252 | orchestrator | TASK [mariadb : Restart master MariaDB container(s)] *************************** 2026-01-06 00:58:39.110268 | orchestrator | Tuesday 06 January 2026 00:55:56 +0000 (0:00:00.351) 0:00:48.520 ******* 2026-01-06 00:58:39.110293 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 
rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.110335 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.110355 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 
inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.110372 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.110384 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2025.1', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 
check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-01-06 00:58:39.110401 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.110432 | orchestrator | 2026-01-06 00:58:39.110444 | orchestrator | TASK [mariadb : Wait for master mariadb] *************************************** 2026-01-06 00:58:39.110454 | orchestrator | Tuesday 06 January 2026 00:55:58 +0000 (0:00:02.426) 0:00:50.946 ******* 2026-01-06 00:58:39.110480 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.110490 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.110501 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.110512 | orchestrator | 2026-01-06 00:58:39.110523 | orchestrator | TASK [service-check : mariadb | Get container facts] *************************** 2026-01-06 00:58:39.110542 | orchestrator | Tuesday 06 January 2026 00:55:59 +0000 (0:00:00.379) 0:00:51.325 ******* 2026-01-06 00:58:39.110557 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.110576 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.110595 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.110613 | orchestrator | 2026-01-06 00:58:39.110624 | orchestrator | TASK [service-check : mariadb | Fail if containers are missing or not running] *** 2026-01-06 00:58:39.110636 | orchestrator | Tuesday 06 January 2026 00:55:59 +0000 (0:00:00.392) 0:00:51.718 ******* 2026-01-06 00:58:39.110647 | orchestrator | 
skipping: [testbed-node-0] 2026-01-06 00:58:39.110659 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.110670 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.110681 | orchestrator | 2026-01-06 00:58:39.110692 | orchestrator | TASK [service-check : mariadb | Fail if containers are unhealthy] ************** 2026-01-06 00:58:39.110703 | orchestrator | Tuesday 06 January 2026 00:55:59 +0000 (0:00:00.411) 0:00:52.129 ******* 2026-01-06 00:58:39.110714 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.110725 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.110735 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.110746 | orchestrator | 2026-01-06 00:58:39.110757 | orchestrator | TASK [mariadb : Wait for MariaDB service to be ready through VIP] ************** 2026-01-06 00:58:39.110815 | orchestrator | Tuesday 06 January 2026 00:56:00 +0000 (0:00:00.780) 0:00:52.910 ******* 2026-01-06 00:58:39.110826 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.110837 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.110848 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.110859 | orchestrator | 2026-01-06 00:58:39.110870 | orchestrator | TASK [mariadb : Create MariaDB volume] ***************************************** 2026-01-06 00:58:39.110881 | orchestrator | Tuesday 06 January 2026 00:56:01 +0000 (0:00:00.347) 0:00:53.258 ******* 2026-01-06 00:58:39.110892 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.110903 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:58:39.110914 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:58:39.110925 | orchestrator | 2026-01-06 00:58:39.110936 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB volume availability] ************* 2026-01-06 00:58:39.110948 | orchestrator | Tuesday 06 January 2026 00:56:01 +0000 (0:00:00.810) 0:00:54.068 ******* 2026-01-06 00:58:39.110959 | orchestrator | ok: 
[testbed-node-0] 2026-01-06 00:58:39.110971 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:58:39.110982 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:58:39.110993 | orchestrator | 2026-01-06 00:58:39.111004 | orchestrator | TASK [mariadb : Establish whether the cluster has already existed] ************* 2026-01-06 00:58:39.111015 | orchestrator | Tuesday 06 January 2026 00:56:02 +0000 (0:00:00.640) 0:00:54.709 ******* 2026-01-06 00:58:39.111026 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:58:39.111037 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:58:39.111048 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:58:39.111058 | orchestrator | 2026-01-06 00:58:39.111069 | orchestrator | TASK [mariadb : Check MariaDB service port liveness] *************************** 2026-01-06 00:58:39.111080 | orchestrator | Tuesday 06 January 2026 00:56:02 +0000 (0:00:00.342) 0:00:55.051 ******* 2026-01-06 00:58:39.111093 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.10:3306"} 2026-01-06 00:58:39.111114 | orchestrator | ...ignoring 2026-01-06 00:58:39.111125 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.11:3306"} 2026-01-06 00:58:39.111137 | orchestrator | ...ignoring 2026-01-06 00:58:39.111148 | orchestrator | fatal: [testbed-node-2]: FAILED! 
=> {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.12:3306"} 2026-01-06 00:58:39.111159 | orchestrator | ...ignoring 2026-01-06 00:58:39.111171 | orchestrator | 2026-01-06 00:58:39.111182 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB service port liveness] *********** 2026-01-06 00:58:39.111193 | orchestrator | Tuesday 06 January 2026 00:56:13 +0000 (0:00:10.735) 0:01:05.787 ******* 2026-01-06 00:58:39.111204 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:58:39.111215 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:58:39.111226 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:58:39.111237 | orchestrator | 2026-01-06 00:58:39.111248 | orchestrator | TASK [mariadb : Fail on existing but stopped cluster] ************************** 2026-01-06 00:58:39.111259 | orchestrator | Tuesday 06 January 2026 00:56:13 +0000 (0:00:00.366) 0:01:06.154 ******* 2026-01-06 00:58:39.111270 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.111286 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.111305 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.111324 | orchestrator | 2026-01-06 00:58:39.111343 | orchestrator | TASK [mariadb : Check MariaDB service WSREP sync status] *********************** 2026-01-06 00:58:39.111362 | orchestrator | Tuesday 06 January 2026 00:56:14 +0000 (0:00:00.586) 0:01:06.740 ******* 2026-01-06 00:58:39.111380 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.111399 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.111441 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.111459 | orchestrator | 2026-01-06 00:58:39.111479 | orchestrator | TASK [mariadb : Extract MariaDB service WSREP sync status] ********************* 2026-01-06 00:58:39.111542 | orchestrator | Tuesday 06 January 2026 00:56:14 +0000 (0:00:00.381) 0:01:07.121 ******* 2026-01-06 00:58:39.111554 | orchestrator | skipping: 
[testbed-node-0] 2026-01-06 00:58:39.111566 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.111577 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.111668 | orchestrator | 2026-01-06 00:58:39.111680 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB service WSREP sync status] ******* 2026-01-06 00:58:39.111691 | orchestrator | Tuesday 06 January 2026 00:56:15 +0000 (0:00:00.336) 0:01:07.457 ******* 2026-01-06 00:58:39.111702 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:58:39.111718 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:58:39.111736 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:58:39.111756 | orchestrator | 2026-01-06 00:58:39.111785 | orchestrator | TASK [mariadb : Fail when MariaDB services are not synced across the whole cluster] *** 2026-01-06 00:58:39.111797 | orchestrator | Tuesday 06 January 2026 00:56:15 +0000 (0:00:00.364) 0:01:07.822 ******* 2026-01-06 00:58:39.111808 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.111828 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.111840 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.111850 | orchestrator | 2026-01-06 00:58:39.111861 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2026-01-06 00:58:39.111872 | orchestrator | Tuesday 06 January 2026 00:56:16 +0000 (0:00:00.567) 0:01:08.389 ******* 2026-01-06 00:58:39.111883 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.111894 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.111905 | orchestrator | included: /ansible/roles/mariadb/tasks/bootstrap_cluster.yml for testbed-node-0 2026-01-06 00:58:39.111916 | orchestrator | 2026-01-06 00:58:39.111927 | orchestrator | TASK [mariadb : Running MariaDB bootstrap container] *************************** 2026-01-06 00:58:39.111938 | orchestrator | Tuesday 06 January 2026 00:56:16 +0000 (0:00:00.426) 0:01:08.816 ******* 2026-01-06 
00:58:39.111949 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.111969 | orchestrator | 2026-01-06 00:58:39.111980 | orchestrator | TASK [mariadb : Store bootstrap host name into facts] ************************** 2026-01-06 00:58:39.111991 | orchestrator | Tuesday 06 January 2026 00:56:27 +0000 (0:00:10.519) 0:01:19.336 ******* 2026-01-06 00:58:39.112002 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:58:39.112013 | orchestrator | 2026-01-06 00:58:39.112024 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2026-01-06 00:58:39.112035 | orchestrator | Tuesday 06 January 2026 00:56:27 +0000 (0:00:00.137) 0:01:19.474 ******* 2026-01-06 00:58:39.112070 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.112082 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.112093 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.112104 | orchestrator | 2026-01-06 00:58:39.112115 | orchestrator | RUNNING HANDLER [mariadb : Starting first MariaDB container] ******************* 2026-01-06 00:58:39.112126 | orchestrator | Tuesday 06 January 2026 00:56:28 +0000 (0:00:00.968) 0:01:20.442 ******* 2026-01-06 00:58:39.112137 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.112148 | orchestrator | 2026-01-06 00:58:39.112159 | orchestrator | RUNNING HANDLER [mariadb : Wait for first MariaDB service port liveness] ******* 2026-01-06 00:58:39.112170 | orchestrator | Tuesday 06 January 2026 00:56:36 +0000 (0:00:07.970) 0:01:28.413 ******* 2026-01-06 00:58:39.112181 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:58:39.112192 | orchestrator | 2026-01-06 00:58:39.112203 | orchestrator | RUNNING HANDLER [mariadb : Wait for first MariaDB service to sync WSREP] ******* 2026-01-06 00:58:39.112213 | orchestrator | Tuesday 06 January 2026 00:56:37 +0000 (0:00:01.674) 0:01:30.088 ******* 2026-01-06 00:58:39.112225 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:58:39.112235 | 
orchestrator | 2026-01-06 00:58:39.112246 | orchestrator | RUNNING HANDLER [mariadb : Ensure MariaDB is running normally on bootstrap host] *** 2026-01-06 00:58:39.112258 | orchestrator | Tuesday 06 January 2026 00:56:40 +0000 (0:00:02.437) 0:01:32.525 ******* 2026-01-06 00:58:39.112269 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.112280 | orchestrator | 2026-01-06 00:58:39.112291 | orchestrator | RUNNING HANDLER [mariadb : Restart MariaDB on existing cluster members] ******** 2026-01-06 00:58:39.112302 | orchestrator | Tuesday 06 January 2026 00:56:40 +0000 (0:00:00.142) 0:01:32.668 ******* 2026-01-06 00:58:39.112312 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.112323 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.112334 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.112345 | orchestrator | 2026-01-06 00:58:39.112356 | orchestrator | RUNNING HANDLER [mariadb : Start MariaDB on new nodes] ************************* 2026-01-06 00:58:39.112367 | orchestrator | Tuesday 06 January 2026 00:56:40 +0000 (0:00:00.368) 0:01:33.037 ******* 2026-01-06 00:58:39.112378 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.112389 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: mariadb_restart 2026-01-06 00:58:39.112400 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:58:39.112443 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:58:39.112465 | orchestrator | 2026-01-06 00:58:39.112484 | orchestrator | PLAY [Restart mariadb services] ************************************************ 2026-01-06 00:58:39.112504 | orchestrator | skipping: no hosts matched 2026-01-06 00:58:39.112521 | orchestrator | 2026-01-06 00:58:39.112540 | orchestrator | PLAY [Start mariadb services] ************************************************** 2026-01-06 00:58:39.112552 | orchestrator | 2026-01-06 00:58:39.112562 | orchestrator | TASK [mariadb : Restart MariaDB container] 
************************************* 2026-01-06 00:58:39.112574 | orchestrator | Tuesday 06 January 2026 00:56:41 +0000 (0:00:00.623) 0:01:33.660 ******* 2026-01-06 00:58:39.112584 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:58:39.112595 | orchestrator | 2026-01-06 00:58:39.112606 | orchestrator | TASK [mariadb : Wait for MariaDB service port liveness] ************************ 2026-01-06 00:58:39.112617 | orchestrator | Tuesday 06 January 2026 00:56:59 +0000 (0:00:18.407) 0:01:52.067 ******* 2026-01-06 00:58:39.112628 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:58:39.112640 | orchestrator | 2026-01-06 00:58:39.112651 | orchestrator | TASK [mariadb : Wait for MariaDB service to sync WSREP] ************************ 2026-01-06 00:58:39.112668 | orchestrator | Tuesday 06 January 2026 00:57:16 +0000 (0:00:16.678) 0:02:08.746 ******* 2026-01-06 00:58:39.112679 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:58:39.112690 | orchestrator | 2026-01-06 00:58:39.112701 | orchestrator | PLAY [Start mariadb services] ************************************************** 2026-01-06 00:58:39.112712 | orchestrator | 2026-01-06 00:58:39.112723 | orchestrator | TASK [mariadb : Restart MariaDB container] ************************************* 2026-01-06 00:58:39.112734 | orchestrator | Tuesday 06 January 2026 00:57:18 +0000 (0:00:02.288) 0:02:11.035 ******* 2026-01-06 00:58:39.112744 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:58:39.112755 | orchestrator | 2026-01-06 00:58:39.112766 | orchestrator | TASK [mariadb : Wait for MariaDB service port liveness] ************************ 2026-01-06 00:58:39.112777 | orchestrator | Tuesday 06 January 2026 00:57:38 +0000 (0:00:19.699) 0:02:30.735 ******* 2026-01-06 00:58:39.112788 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:58:39.112799 | orchestrator | 2026-01-06 00:58:39.112811 | orchestrator | TASK [mariadb : Wait for MariaDB service to sync WSREP] ************************ 2026-01-06 00:58:39.112828 
| orchestrator | Tuesday 06 January 2026 00:57:55 +0000 (0:00:16.583) 0:02:47.318 ******* 2026-01-06 00:58:39.112839 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:58:39.112850 | orchestrator | 2026-01-06 00:58:39.112861 | orchestrator | PLAY [Restart bootstrap mariadb service] *************************************** 2026-01-06 00:58:39.112872 | orchestrator | 2026-01-06 00:58:39.112890 | orchestrator | TASK [mariadb : Restart MariaDB container] ************************************* 2026-01-06 00:58:39.112901 | orchestrator | Tuesday 06 January 2026 00:57:57 +0000 (0:00:02.161) 0:02:49.480 ******* 2026-01-06 00:58:39.112913 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.112924 | orchestrator | 2026-01-06 00:58:39.112935 | orchestrator | TASK [mariadb : Wait for MariaDB service port liveness] ************************ 2026-01-06 00:58:39.112946 | orchestrator | Tuesday 06 January 2026 00:58:14 +0000 (0:00:17.352) 0:03:06.832 ******* 2026-01-06 00:58:39.112957 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:58:39.112968 | orchestrator | 2026-01-06 00:58:39.112979 | orchestrator | TASK [mariadb : Wait for MariaDB service to sync WSREP] ************************ 2026-01-06 00:58:39.112990 | orchestrator | Tuesday 06 January 2026 00:58:15 +0000 (0:00:00.604) 0:03:07.437 ******* 2026-01-06 00:58:39.113001 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:58:39.113012 | orchestrator | 2026-01-06 00:58:39.113023 | orchestrator | PLAY [Apply mariadb post-configuration] **************************************** 2026-01-06 00:58:39.113035 | orchestrator | 2026-01-06 00:58:39.113046 | orchestrator | TASK [Include mariadb post-deploy.yml] ***************************************** 2026-01-06 00:58:39.113057 | orchestrator | Tuesday 06 January 2026 00:58:17 +0000 (0:00:02.425) 0:03:09.862 ******* 2026-01-06 00:58:39.113068 | orchestrator | included: mariadb for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:58:39.113079 | orchestrator | 
2026-01-06 00:58:39.113090 | orchestrator | TASK [mariadb : Creating shard root mysql user] ******************************** 2026-01-06 00:58:39.113101 | orchestrator | Tuesday 06 January 2026 00:58:18 +0000 (0:00:00.573) 0:03:10.436 ******* 2026-01-06 00:58:39.113112 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.113123 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.113134 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.113145 | orchestrator | 2026-01-06 00:58:39.113156 | orchestrator | TASK [mariadb : Creating mysql monitor user] *********************************** 2026-01-06 00:58:39.113167 | orchestrator | Tuesday 06 January 2026 00:58:20 +0000 (0:00:02.148) 0:03:12.584 ******* 2026-01-06 00:58:39.113178 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.113189 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.113200 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.113210 | orchestrator | 2026-01-06 00:58:39.113221 | orchestrator | TASK [mariadb : Creating database backup user and setting permissions] ********* 2026-01-06 00:58:39.113237 | orchestrator | Tuesday 06 January 2026 00:58:22 +0000 (0:00:02.361) 0:03:14.946 ******* 2026-01-06 00:58:39.113256 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.113283 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.113302 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.113320 | orchestrator | 2026-01-06 00:58:39.113338 | orchestrator | TASK [mariadb : Granting permissions on Mariabackup database to backup user] *** 2026-01-06 00:58:39.113356 | orchestrator | Tuesday 06 January 2026 00:58:24 +0000 (0:00:02.268) 0:03:17.215 ******* 2026-01-06 00:58:39.113373 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.113391 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.113448 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:58:39.113470 | orchestrator | 
2026-01-06 00:58:39.113483 | orchestrator | TASK [service-check : mariadb | Get container facts] *************************** 2026-01-06 00:58:39.113494 | orchestrator | Tuesday 06 January 2026 00:58:27 +0000 (0:00:02.172) 0:03:19.388 ******* 2026-01-06 00:58:39.113505 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:58:39.113516 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:58:39.113526 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:58:39.113537 | orchestrator | 2026-01-06 00:58:39.113548 | orchestrator | TASK [service-check : mariadb | Fail if containers are missing or not running] *** 2026-01-06 00:58:39.113559 | orchestrator | Tuesday 06 January 2026 00:58:31 +0000 (0:00:04.634) 0:03:24.023 ******* 2026-01-06 00:58:39.113570 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.113581 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.113593 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.113604 | orchestrator | 2026-01-06 00:58:39.113615 | orchestrator | TASK [service-check : mariadb | Fail if containers are unhealthy] ************** 2026-01-06 00:58:39.113626 | orchestrator | Tuesday 06 January 2026 00:58:34 +0000 (0:00:02.280) 0:03:26.304 ******* 2026-01-06 00:58:39.113637 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.113648 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.113659 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.113670 | orchestrator | 2026-01-06 00:58:39.113681 | orchestrator | TASK [mariadb : Wait for MariaDB service to be ready through VIP] ************** 2026-01-06 00:58:39.113692 | orchestrator | Tuesday 06 January 2026 00:58:34 +0000 (0:00:00.505) 0:03:26.810 ******* 2026-01-06 00:58:39.113703 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:58:39.113714 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:58:39.113726 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:58:39.113737 | orchestrator | 2026-01-06 00:58:39.113748 | 
orchestrator | TASK [Include mariadb post-upgrade.yml] **************************************** 2026-01-06 00:58:39.113759 | orchestrator | Tuesday 06 January 2026 00:58:37 +0000 (0:00:02.458) 0:03:29.268 ******* 2026-01-06 00:58:39.113770 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:58:39.113781 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:58:39.113792 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:58:39.113803 | orchestrator | 2026-01-06 00:58:39.113814 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:58:39.113826 | orchestrator | localhost : ok=3  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1  2026-01-06 00:58:39.113839 | orchestrator | testbed-node-0 : ok=36  changed=17  unreachable=0 failed=0 skipped=39  rescued=0 ignored=1  2026-01-06 00:58:39.113851 | orchestrator | testbed-node-1 : ok=22  changed=8  unreachable=0 failed=0 skipped=45  rescued=0 ignored=1  2026-01-06 00:58:39.113871 | orchestrator | testbed-node-2 : ok=22  changed=8  unreachable=0 failed=0 skipped=45  rescued=0 ignored=1  2026-01-06 00:58:39.113882 | orchestrator | 2026-01-06 00:58:39.113894 | orchestrator | 2026-01-06 00:58:39.113914 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:58:39.113926 | orchestrator | Tuesday 06 January 2026 00:58:37 +0000 (0:00:00.518) 0:03:29.787 ******* 2026-01-06 00:58:39.113937 | orchestrator | =============================================================================== 2026-01-06 00:58:39.113958 | orchestrator | mariadb : Restart MariaDB container ------------------------------------ 38.11s 2026-01-06 00:58:39.113969 | orchestrator | mariadb : Wait for MariaDB service port liveness ----------------------- 33.26s 2026-01-06 00:58:39.113980 | orchestrator | mariadb : Restart MariaDB container ------------------------------------ 17.35s 2026-01-06 00:58:39.113992 | orchestrator | 
mariadb : Check MariaDB service port liveness -------------------------- 10.74s 2026-01-06 00:58:39.114003 | orchestrator | mariadb : Running MariaDB bootstrap container -------------------------- 10.52s 2026-01-06 00:58:39.114014 | orchestrator | mariadb : Starting first MariaDB container ------------------------------ 7.97s 2026-01-06 00:58:39.114059 | orchestrator | mariadb : Copying over config.json files for services ------------------- 5.02s 2026-01-06 00:58:39.114071 | orchestrator | mariadb : Copying over galera.cnf --------------------------------------- 4.93s 2026-01-06 00:58:39.114082 | orchestrator | service-check : mariadb | Get container facts --------------------------- 4.64s 2026-01-06 00:58:39.114094 | orchestrator | mariadb : Wait for MariaDB service to sync WSREP ------------------------ 4.45s 2026-01-06 00:58:39.114105 | orchestrator | service-check-containers : mariadb | Check containers ------------------- 4.26s 2026-01-06 00:58:39.114117 | orchestrator | service-check-containers : Include tasks -------------------------------- 3.21s 2026-01-06 00:58:39.114128 | orchestrator | service-cert-copy : mariadb | Copying over extra CA certificates -------- 3.21s 2026-01-06 00:58:39.114139 | orchestrator | service-cert-copy : mariadb | Copying over backend internal TLS key ----- 3.06s 2026-01-06 00:58:39.114150 | orchestrator | Check MariaDB service --------------------------------------------------- 2.87s 2026-01-06 00:58:39.114161 | orchestrator | mariadb : Ensuring config directories exist ----------------------------- 2.57s 2026-01-06 00:58:39.114172 | orchestrator | mariadb : Restart slave MariaDB container(s) ---------------------------- 2.50s 2026-01-06 00:58:39.114183 | orchestrator | mariadb : Wait for MariaDB service to be ready through VIP -------------- 2.46s 2026-01-06 00:58:39.114194 | orchestrator | mariadb : Wait for first MariaDB service to sync WSREP ------------------ 2.44s 2026-01-06 00:58:39.114205 | orchestrator | mariadb : 
Restart master MariaDB container(s) --------------------------- 2.43s 2026-01-06 00:58:39.114216 | orchestrator | 2026-01-06 00:58:39 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:39.114227 | orchestrator | 2026-01-06 00:58:39 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:58:39.114238 | orchestrator | 2026-01-06 00:58:39 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:58:39.114250 | orchestrator | 2026-01-06 00:58:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:42.165198 | orchestrator | 2026-01-06 00:58:42 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:42.166714 | orchestrator | 2026-01-06 00:58:42 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:58:42.170183 | orchestrator | 2026-01-06 00:58:42 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:58:42.170234 | orchestrator | 2026-01-06 00:58:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:45.225959 | orchestrator | 2026-01-06 00:58:45 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:45.227378 | orchestrator | 2026-01-06 00:58:45 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:58:45.229134 | orchestrator | 2026-01-06 00:58:45 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:58:45.229197 | orchestrator | 2026-01-06 00:58:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:48.279327 | orchestrator | 2026-01-06 00:58:48 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:48.281332 | orchestrator | 2026-01-06 00:58:48 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:58:48.282383 | orchestrator | 2026-01-06 00:58:48 | INFO  | Task 
0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:58:48.282500 | orchestrator | 2026-01-06 00:58:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:51.316183 | orchestrator | 2026-01-06 00:58:51 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:51.318492 | orchestrator | 2026-01-06 00:58:51 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:58:51.321020 | orchestrator | 2026-01-06 00:58:51 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:58:51.322319 | orchestrator | 2026-01-06 00:58:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:54.359991 | orchestrator | 2026-01-06 00:58:54 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:54.361276 | orchestrator | 2026-01-06 00:58:54 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:58:54.361572 | orchestrator | 2026-01-06 00:58:54 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:58:54.361605 | orchestrator | 2026-01-06 00:58:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:58:57.405092 | orchestrator | 2026-01-06 00:58:57 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:58:57.405192 | orchestrator | 2026-01-06 00:58:57 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:58:57.407199 | orchestrator | 2026-01-06 00:58:57 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:58:57.407272 | orchestrator | 2026-01-06 00:58:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:00.455880 | orchestrator | 2026-01-06 00:59:00 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:59:00.455988 | orchestrator | 2026-01-06 00:59:00 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state 
STARTED 2026-01-06 00:59:00.456599 | orchestrator | 2026-01-06 00:59:00 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:00.456619 | orchestrator | 2026-01-06 00:59:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:03.489864 | orchestrator | 2026-01-06 00:59:03 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:59:03.489959 | orchestrator | 2026-01-06 00:59:03 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:03.491204 | orchestrator | 2026-01-06 00:59:03 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:03.491281 | orchestrator | 2026-01-06 00:59:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:06.524327 | orchestrator | 2026-01-06 00:59:06 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:59:06.524767 | orchestrator | 2026-01-06 00:59:06 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:06.525964 | orchestrator | 2026-01-06 00:59:06 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:06.526003 | orchestrator | 2026-01-06 00:59:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:09.569954 | orchestrator | 2026-01-06 00:59:09 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:59:09.570107 | orchestrator | 2026-01-06 00:59:09 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:09.570151 | orchestrator | 2026-01-06 00:59:09 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:09.570161 | orchestrator | 2026-01-06 00:59:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:12.605165 | orchestrator | 2026-01-06 00:59:12 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state STARTED 2026-01-06 00:59:12.607932 | orchestrator | 
2026-01-06 00:59:12 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:12.609754 | orchestrator | 2026-01-06 00:59:12 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:12.609853 | orchestrator | 2026-01-06 00:59:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:15.654508 | orchestrator | 2026-01-06 00:59:15 | INFO  | Task 206b03b5-32f3-4289-a2d6-088b55aafa15 is in state SUCCESS 2026-01-06 00:59:15.655599 | orchestrator | 2026-01-06 00:59:15.655653 | orchestrator | [WARNING]: Collection community.general does not support Ansible version 2026-01-06 00:59:15.655675 | orchestrator | 2.16.14 2026-01-06 00:59:15.655697 | orchestrator | 2026-01-06 00:59:15.655711 | orchestrator | PLAY [Create ceph pools] ******************************************************* 2026-01-06 00:59:15.655723 | orchestrator | 2026-01-06 00:59:15.655734 | orchestrator | TASK [ceph-facts : Include facts.yml] ****************************************** 2026-01-06 00:59:15.656413 | orchestrator | Tuesday 06 January 2026 00:57:01 +0000 (0:00:00.619) 0:00:00.619 ******* 2026-01-06 00:59:15.656441 | orchestrator | included: /ansible/roles/ceph-facts/tasks/facts.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:59:15.656453 | orchestrator | 2026-01-06 00:59:15.656464 | orchestrator | TASK [ceph-facts : Check if it is atomic host] ********************************* 2026-01-06 00:59:15.656496 | orchestrator | Tuesday 06 January 2026 00:57:02 +0000 (0:00:00.690) 0:00:01.310 ******* 2026-01-06 00:59:15.656508 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:59:15.656520 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:59:15.656531 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:59:15.656541 | orchestrator | 2026-01-06 00:59:15.656553 | orchestrator | TASK [ceph-facts : Set_fact is_atomic] ***************************************** 2026-01-06 00:59:15.656563 | orchestrator | Tuesday 06 
January 2026 00:57:03 +0000 (0:00:00.658) 0:00:01.969 ******* 2026-01-06 00:59:15.656574 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:59:15.656585 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:59:15.656596 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:59:15.656607 | orchestrator | 2026-01-06 00:59:15.656618 | orchestrator | TASK [ceph-facts : Check if podman binary is present] ************************** 2026-01-06 00:59:15.656629 | orchestrator | Tuesday 06 January 2026 00:57:03 +0000 (0:00:00.328) 0:00:02.297 ******* 2026-01-06 00:59:15.656640 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:59:15.656650 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:59:15.656661 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:59:15.656672 | orchestrator | 2026-01-06 00:59:15.656683 | orchestrator | TASK [ceph-facts : Set_fact container_binary] ********************************** 2026-01-06 00:59:15.656694 | orchestrator | Tuesday 06 January 2026 00:57:04 +0000 (0:00:00.902) 0:00:03.199 ******* 2026-01-06 00:59:15.656705 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:59:15.656715 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:59:15.656726 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:59:15.656737 | orchestrator | 2026-01-06 00:59:15.656748 | orchestrator | TASK [ceph-facts : Set_fact ceph_cmd] ****************************************** 2026-01-06 00:59:15.656759 | orchestrator | Tuesday 06 January 2026 00:57:04 +0000 (0:00:00.328) 0:00:03.528 ******* 2026-01-06 00:59:15.656770 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:59:15.656780 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:59:15.656791 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:59:15.656802 | orchestrator | 2026-01-06 00:59:15.656813 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python] ********************* 2026-01-06 00:59:15.656850 | orchestrator | Tuesday 06 January 2026 00:57:05 +0000 (0:00:00.321) 0:00:03.849 ******* 2026-01-06 
00:59:15.656862 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:59:15.656872 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:59:15.656883 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:59:15.656894 | orchestrator | 2026-01-06 00:59:15.656905 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python if not previously set] *** 2026-01-06 00:59:15.656916 | orchestrator | Tuesday 06 January 2026 00:57:05 +0000 (0:00:00.332) 0:00:04.182 ******* 2026-01-06 00:59:15.656927 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.656939 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.657034 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.657048 | orchestrator | 2026-01-06 00:59:15.657061 | orchestrator | TASK [ceph-facts : Set_fact ceph_release ceph_stable_release] ****************** 2026-01-06 00:59:15.657073 | orchestrator | Tuesday 06 January 2026 00:57:05 +0000 (0:00:00.527) 0:00:04.709 ******* 2026-01-06 00:59:15.657087 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:59:15.657099 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:59:15.657111 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:59:15.657123 | orchestrator | 2026-01-06 00:59:15.657136 | orchestrator | TASK [ceph-facts : Set_fact monitor_name ansible_facts['hostname']] ************ 2026-01-06 00:59:15.657148 | orchestrator | Tuesday 06 January 2026 00:57:06 +0000 (0:00:00.314) 0:00:05.024 ******* 2026-01-06 00:59:15.657160 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2026-01-06 00:59:15.657172 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-01-06 00:59:15.657185 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-01-06 00:59:15.657198 | orchestrator | 2026-01-06 00:59:15.657211 | orchestrator | TASK [ceph-facts : Set_fact container_exec_cmd] ******************************** 
2026-01-06 00:59:15.657224 | orchestrator | Tuesday 06 January 2026 00:57:06 +0000 (0:00:00.661) 0:00:05.685 ******* 2026-01-06 00:59:15.657236 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:59:15.657247 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:59:15.657258 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:59:15.657269 | orchestrator | 2026-01-06 00:59:15.657280 | orchestrator | TASK [ceph-facts : Find a running mon container] ******************************* 2026-01-06 00:59:15.657291 | orchestrator | Tuesday 06 January 2026 00:57:07 +0000 (0:00:00.543) 0:00:06.229 ******* 2026-01-06 00:59:15.657302 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2026-01-06 00:59:15.657313 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-01-06 00:59:15.657324 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-01-06 00:59:15.657334 | orchestrator | 2026-01-06 00:59:15.657346 | orchestrator | TASK [ceph-facts : Check for a ceph mon socket] ******************************** 2026-01-06 00:59:15.657356 | orchestrator | Tuesday 06 January 2026 00:57:09 +0000 (0:00:02.263) 0:00:08.493 ******* 2026-01-06 00:59:15.657367 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)  2026-01-06 00:59:15.657409 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)  2026-01-06 00:59:15.657422 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)  2026-01-06 00:59:15.657433 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.657444 | orchestrator | 2026-01-06 00:59:15.657500 | orchestrator | TASK [ceph-facts : Check if the ceph mon socket is in-use] ********************* 2026-01-06 00:59:15.657513 | orchestrator | Tuesday 06 January 2026 00:57:10 +0000 (0:00:00.670) 0:00:09.163 ******* 2026-01-06 00:59:15.657527 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 
'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.657560 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.657595 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.657616 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.657635 | orchestrator | 2026-01-06 00:59:15.657655 | orchestrator | TASK [ceph-facts : Set_fact running_mon - non_container] *********************** 2026-01-06 00:59:15.657675 | orchestrator | Tuesday 06 January 2026 00:57:11 +0000 (0:00:00.816) 0:00:09.979 ******* 2026-01-06 00:59:15.657697 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.657717 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})  
2026-01-06 00:59:15.657730 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.657741 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.657752 | orchestrator | 2026-01-06 00:59:15.657763 | orchestrator | TASK [ceph-facts : Set_fact running_mon - container] *************************** 2026-01-06 00:59:15.657774 | orchestrator | Tuesday 06 January 2026 00:57:11 +0000 (0:00:00.387) 0:00:10.366 ******* 2026-01-06 00:59:15.657788 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': '5eea8b59f386', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-0'], 'start': '2026-01-06 00:57:08.197499', 'end': '2026-01-06 00:57:08.232092', 'delta': '0:00:00.034593', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-0', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['5eea8b59f386'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-0', 'ansible_loop_var': 'item'}) 2026-01-06 00:59:15.657802 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': 'b98ba5363aba', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-1'], 'start': '2026-01-06 00:57:08.999464', 'end': '2026-01-06 00:57:09.044785', 'delta': '0:00:00.045321', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q 
--filter name=ceph-mon-testbed-node-1', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['b98ba5363aba'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-1', 'ansible_loop_var': 'item'}) 2026-01-06 00:59:15.657864 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': '4a53db93ddfe', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-2'], 'start': '2026-01-06 00:57:09.559565', 'end': '2026-01-06 00:57:09.602207', 'delta': '0:00:00.042642', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-2', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['4a53db93ddfe'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-2', 'ansible_loop_var': 'item'}) 2026-01-06 00:59:15.657886 | orchestrator | 2026-01-06 00:59:15.657897 | orchestrator | TASK [ceph-facts : Set_fact _container_exec_cmd] ******************************* 2026-01-06 00:59:15.657908 | orchestrator | Tuesday 06 January 2026 00:57:11 +0000 (0:00:00.190) 0:00:10.557 ******* 2026-01-06 00:59:15.657919 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:59:15.657930 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:59:15.657941 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:59:15.657952 | orchestrator | 2026-01-06 00:59:15.657963 | orchestrator | TASK [ceph-facts : Get current fsid if cluster is already running] ************* 2026-01-06 00:59:15.657973 | orchestrator | Tuesday 06 January 2026 00:57:12 +0000 (0:00:00.495) 0:00:11.052 ******* 2026-01-06 00:59:15.657984 | orchestrator 
| ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] 2026-01-06 00:59:15.657995 | orchestrator | 2026-01-06 00:59:15.658006 | orchestrator | TASK [ceph-facts : Set_fact current_fsid rc 1] ********************************* 2026-01-06 00:59:15.658071 | orchestrator | Tuesday 06 January 2026 00:57:15 +0000 (0:00:02.734) 0:00:13.787 ******* 2026-01-06 00:59:15.658086 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.658097 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.658108 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.658151 | orchestrator | 2026-01-06 00:59:15.658163 | orchestrator | TASK [ceph-facts : Get current fsid] ******************************************* 2026-01-06 00:59:15.658174 | orchestrator | Tuesday 06 January 2026 00:57:15 +0000 (0:00:00.307) 0:00:14.095 ******* 2026-01-06 00:59:15.658185 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.658196 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.658207 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.658227 | orchestrator | 2026-01-06 00:59:15.658246 | orchestrator | TASK [ceph-facts : Set_fact fsid] ********************************************** 2026-01-06 00:59:15.658266 | orchestrator | Tuesday 06 January 2026 00:57:15 +0000 (0:00:00.421) 0:00:14.516 ******* 2026-01-06 00:59:15.658285 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.658304 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.658321 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.658339 | orchestrator | 2026-01-06 00:59:15.658359 | orchestrator | TASK [ceph-facts : Set_fact fsid from current_fsid] **************************** 2026-01-06 00:59:15.658446 | orchestrator | Tuesday 06 January 2026 00:57:16 +0000 (0:00:00.526) 0:00:15.043 ******* 2026-01-06 00:59:15.658468 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:59:15.658480 | orchestrator | 2026-01-06 00:59:15.658491 | orchestrator | TASK 
[ceph-facts : Generate cluster fsid] ************************************** 2026-01-06 00:59:15.658502 | orchestrator | Tuesday 06 January 2026 00:57:16 +0000 (0:00:00.155) 0:00:15.198 ******* 2026-01-06 00:59:15.658513 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.658523 | orchestrator | 2026-01-06 00:59:15.658534 | orchestrator | TASK [ceph-facts : Set_fact fsid] ********************************************** 2026-01-06 00:59:15.658545 | orchestrator | Tuesday 06 January 2026 00:57:16 +0000 (0:00:00.257) 0:00:15.456 ******* 2026-01-06 00:59:15.658556 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.658567 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.658578 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.658589 | orchestrator | 2026-01-06 00:59:15.658599 | orchestrator | TASK [ceph-facts : Resolve device link(s)] ************************************* 2026-01-06 00:59:15.658610 | orchestrator | Tuesday 06 January 2026 00:57:17 +0000 (0:00:00.296) 0:00:15.753 ******* 2026-01-06 00:59:15.658632 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.658643 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.658653 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.658664 | orchestrator | 2026-01-06 00:59:15.658675 | orchestrator | TASK [ceph-facts : Set_fact build devices from resolved symlinks] ************** 2026-01-06 00:59:15.658686 | orchestrator | Tuesday 06 January 2026 00:57:17 +0000 (0:00:00.297) 0:00:16.050 ******* 2026-01-06 00:59:15.658697 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.658708 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.658718 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.658729 | orchestrator | 2026-01-06 00:59:15.658740 | orchestrator | TASK [ceph-facts : Resolve dedicated_device link(s)] *************************** 2026-01-06 00:59:15.658751 | orchestrator | Tuesday 06 January 2026 
00:57:17 +0000 (0:00:00.453) 0:00:16.503 ******* 2026-01-06 00:59:15.658762 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.658774 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.658792 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.658810 | orchestrator | 2026-01-06 00:59:15.658829 | orchestrator | TASK [ceph-facts : Set_fact build dedicated_devices from resolved symlinks] **** 2026-01-06 00:59:15.658846 | orchestrator | Tuesday 06 January 2026 00:57:18 +0000 (0:00:00.369) 0:00:16.872 ******* 2026-01-06 00:59:15.658863 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.658878 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.658895 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.659057 | orchestrator | 2026-01-06 00:59:15.659083 | orchestrator | TASK [ceph-facts : Resolve bluestore_wal_device link(s)] *********************** 2026-01-06 00:59:15.659099 | orchestrator | Tuesday 06 January 2026 00:57:18 +0000 (0:00:00.341) 0:00:17.214 ******* 2026-01-06 00:59:15.659113 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.659127 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.659141 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.659208 | orchestrator | 2026-01-06 00:59:15.659228 | orchestrator | TASK [ceph-facts : Set_fact build bluestore_wal_devices from resolved symlinks] *** 2026-01-06 00:59:15.659245 | orchestrator | Tuesday 06 January 2026 00:57:18 +0000 (0:00:00.343) 0:00:17.557 ******* 2026-01-06 00:59:15.659261 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.659277 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.659294 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.659309 | orchestrator | 2026-01-06 00:59:15.659320 | orchestrator | TASK [ceph-facts : Collect existed devices] ************************************ 2026-01-06 00:59:15.659329 | orchestrator | Tuesday 06 January 2026 
00:57:19 +0000 (0:00:00.501) 0:00:18.059 ******* 2026-01-06 00:59:15.659351 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382-osd--block--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382', 'dm-uuid-LVM-leNR8e0LegQCMdL6ucMKdN07fh5N5SuCAUHpjmmFqkkv8cgfcG4OQCk1bATKEOxo'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659364 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--1f440738--8941--5354--ae19--38cd939f8e8b-osd--block--1f440738--8941--5354--ae19--38cd939f8e8b', 'dm-uuid-LVM-hKAA9ELaJ4PXB3FsxE7aWN0Ca65H3DNcDeRaQ8myegtafvn7obDSCCodWGTEd481'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659438 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659463 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 
'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659474 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659484 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659494 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659545 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': 
'512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659557 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659573 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--64d6825f--3ec1--5927--8c89--e441ee427e8a-osd--block--64d6825f--3ec1--5927--8c89--e441ee427e8a', 'dm-uuid-LVM-LDweexZgnixRPyaZEXyjMea8qEKMICtA7IzHB9qtV3AIAvWVWiM14y0g6id7UZYZ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659584 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659610 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e675238b--4f6c--5157--bfd7--95a1b3a689b7-osd--block--e675238b--4f6c--5157--bfd7--95a1b3a689b7', 'dm-uuid-LVM-Ke0ebcxjjDzRywv3R5obBtBuMmzv68aYQAzg56kueDNDYW1ZSJhWGfYNPDa8J2Ge'], 'labels': 
[], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659620 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659670 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part1', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part14', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': 
['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part15', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part16', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.659686 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659697 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'holders': ['ceph--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382-osd--block--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-oXsVdg-yMut-PRiK-dfGm-Pr3Q-1gzN-GvYndT', 'scsi-0QEMU_QEMU_HARDDISK_dc9d4d24-a01d-4baf-85b5-da8c88609604', 'scsi-SQEMU_QEMU_HARDDISK_dc9d4d24-a01d-4baf-85b5-da8c88609604'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.659717 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659729 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'holders': ['ceph--1f440738--8941--5354--ae19--38cd939f8e8b-osd--block--1f440738--8941--5354--ae19--38cd939f8e8b'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-3zJeUG-r1bU-MUbW-4daS-IHQE-DfNT-ElqHh1', 'scsi-0QEMU_QEMU_HARDDISK_3d039a44-dced-4ba6-a79b-af7290a238ac', 'scsi-SQEMU_QEMU_HARDDISK_3d039a44-dced-4ba6-a79b-af7290a238ac'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.659741 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d326b17f-2106-48eb-aaa2-fe8346fab088', 'scsi-SQEMU_QEMU_HARDDISK_d326b17f-2106-48eb-aaa2-fe8346fab088'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.659753 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659795 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-03-02-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.659813 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}) 
 2026-01-06 00:59:15.659825 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.659837 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659855 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659867 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659887 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part1', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part14', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part15', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part16', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.659906 | 
orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--0ba15c51--2e8d--5c95--884b--d45401cb60d9-osd--block--0ba15c51--2e8d--5c95--884b--d45401cb60d9', 'dm-uuid-LVM-lFNjrI9z6jGvFHezfUtduDKx9CNSXgEPFaHR8oR5ZfJBhMXXuDDrOG9EnSv6tdIs'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659919 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'holders': ['ceph--64d6825f--3ec1--5927--8c89--e441ee427e8a-osd--block--64d6825f--3ec1--5927--8c89--e441ee427e8a'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-5K7mse-fAuc-dbI5-SiaB-plhi-xXDs-vEQzBN', 'scsi-0QEMU_QEMU_HARDDISK_724a4878-ca4e-4a20-84cd-e8427809d585', 'scsi-SQEMU_QEMU_HARDDISK_724a4878-ca4e-4a20-84cd-e8427809d585'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.659942 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--588df21e--a0c0--57e7--8c43--2f77be274309-osd--block--588df21e--a0c0--57e7--8c43--2f77be274309', 'dm-uuid-LVM-WBEZ6WMsGhewarWIW3qNudyEuUl9274MP5F99LYKaEU18gOKabMHCbX9lpi9DDDw'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': 
'512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659955 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'holders': ['ceph--e675238b--4f6c--5157--bfd7--95a1b3a689b7-osd--block--e675238b--4f6c--5157--bfd7--95a1b3a689b7'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-9U19bv-EwEy-Ks4f-MiiE-9ta0-FWks-EZUgCO', 'scsi-0QEMU_QEMU_HARDDISK_8cc5ffc1-09fb-4fde-a97f-bcebb46dacb6', 'scsi-SQEMU_QEMU_HARDDISK_8cc5ffc1-09fb-4fde-a97f-bcebb46dacb6'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.659968 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.659989 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ea69e1b5-a504-41c3-bb3a-5961a07ea8a6', 'scsi-SQEMU_QEMU_HARDDISK_ea69e1b5-a504-41c3-bb3a-5961a07ea8a6'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.660000 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.660014 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-02-56-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.660028 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.660036 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 
'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.660045 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.660053 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.660061 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.660069 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.660078 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop7', 
'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-01-06 00:59:15.660097 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part1', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part14', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part15', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part16', 
'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.660111 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'holders': ['ceph--0ba15c51--2e8d--5c95--884b--d45401cb60d9-osd--block--0ba15c51--2e8d--5c95--884b--d45401cb60d9'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zBUffM-PitN-uGRi-WCUM-hCv5-dceE-VL6GDm', 'scsi-0QEMU_QEMU_HARDDISK_a9899c49-22e0-485a-be63-69bc9e218eb5', 'scsi-SQEMU_QEMU_HARDDISK_a9899c49-22e0-485a-be63-69bc9e218eb5'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.660121 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'holders': ['ceph--588df21e--a0c0--57e7--8c43--2f77be274309-osd--block--588df21e--a0c0--57e7--8c43--2f77be274309'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-Kf4ZcO-WYqz-GeR6-hWC5-gIYh-YPXw-Qrj6Vh', 'scsi-0QEMU_QEMU_HARDDISK_2e071fd2-3317-4a54-af1f-e9b7971267a3', 'scsi-SQEMU_QEMU_HARDDISK_2e071fd2-3317-4a54-af1f-e9b7971267a3'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.660130 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59', 'scsi-SQEMU_QEMU_HARDDISK_f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.660144 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-03-04-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-01-06 00:59:15.660152 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.660160 | orchestrator | 2026-01-06 00:59:15.660168 | orchestrator | TASK [ceph-facts : Set_fact devices 
generate device list when osd_auto_discovery] *** 2026-01-06 00:59:15.660181 | orchestrator | Tuesday 06 January 2026 00:57:19 +0000 (0:00:00.469) 0:00:18.528 ******* 2026-01-06 00:59:15.660226 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382-osd--block--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382', 'dm-uuid-LVM-leNR8e0LegQCMdL6ucMKdN07fh5N5SuCAUHpjmmFqkkv8cgfcG4OQCk1bATKEOxo'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660237 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--1f440738--8941--5354--ae19--38cd939f8e8b-osd--block--1f440738--8941--5354--ae19--38cd939f8e8b', 'dm-uuid-LVM-hKAA9ELaJ4PXB3FsxE7aWN0Ca65H3DNcDeRaQ8myegtafvn7obDSCCodWGTEd481'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660245 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 
'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660254 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660262 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660277 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': 
{'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660296 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660304 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660313 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 
'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660321 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660329 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--64d6825f--3ec1--5927--8c89--e441ee427e8a-osd--block--64d6825f--3ec1--5927--8c89--e441ee427e8a', 'dm-uuid-LVM-LDweexZgnixRPyaZEXyjMea8qEKMICtA7IzHB9qtV3AIAvWVWiM14y0g6id7UZYZ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660352 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 
'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part1', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part14', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part15', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part16', 'scsi-SQEMU_QEMU_HARDDISK_47504f77-6654-4579-a6ab-2ab6ea64e907-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 
167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660366 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382-osd--block--d44b25a4--5c87--5b50--a8b5--4ed8c19ba382'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-oXsVdg-yMut-PRiK-dfGm-Pr3Q-1gzN-GvYndT', 'scsi-0QEMU_QEMU_HARDDISK_dc9d4d24-a01d-4baf-85b5-da8c88609604', 'scsi-SQEMU_QEMU_HARDDISK_dc9d4d24-a01d-4baf-85b5-da8c88609604'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660393 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e675238b--4f6c--5157--bfd7--95a1b3a689b7-osd--block--e675238b--4f6c--5157--bfd7--95a1b3a689b7', 'dm-uuid-LVM-Ke0ebcxjjDzRywv3R5obBtBuMmzv68aYQAzg56kueDNDYW1ZSJhWGfYNPDa8J2Ge'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 
00:59:15.660407 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--1f440738--8941--5354--ae19--38cd939f8e8b-osd--block--1f440738--8941--5354--ae19--38cd939f8e8b'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-3zJeUG-r1bU-MUbW-4daS-IHQE-DfNT-ElqHh1', 'scsi-0QEMU_QEMU_HARDDISK_3d039a44-dced-4ba6-a79b-af7290a238ac', 'scsi-SQEMU_QEMU_HARDDISK_3d039a44-dced-4ba6-a79b-af7290a238ac'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660425 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660434 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 
'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660442 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_d326b17f-2106-48eb-aaa2-fe8346fab088', 'scsi-SQEMU_QEMU_HARDDISK_d326b17f-2106-48eb-aaa2-fe8346fab088'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660451 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660459 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 
'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-03-02-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660473 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.660488 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660500 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660509 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 
'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660518 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660526 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660535 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 
'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--0ba15c51--2e8d--5c95--884b--d45401cb60d9-osd--block--0ba15c51--2e8d--5c95--884b--d45401cb60d9', 'dm-uuid-LVM-lFNjrI9z6jGvFHezfUtduDKx9CNSXgEPFaHR8oR5ZfJBhMXXuDDrOG9EnSv6tdIs'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660554 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part1', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part14', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 
'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part15', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part16', 'scsi-SQEMU_QEMU_HARDDISK_a80b48fc-f175-43ec-b2c4-9074b67ccf1a-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660568 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--64d6825f--3ec1--5927--8c89--e441ee427e8a-osd--block--64d6825f--3ec1--5927--8c89--e441ee427e8a'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-5K7mse-fAuc-dbI5-SiaB-plhi-xXDs-vEQzBN', 'scsi-0QEMU_QEMU_HARDDISK_724a4878-ca4e-4a20-84cd-e8427809d585', 'scsi-SQEMU_QEMU_HARDDISK_724a4878-ca4e-4a20-84cd-e8427809d585'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660577 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--588df21e--a0c0--57e7--8c43--2f77be274309-osd--block--588df21e--a0c0--57e7--8c43--2f77be274309', 'dm-uuid-LVM-WBEZ6WMsGhewarWIW3qNudyEuUl9274MP5F99LYKaEU18gOKabMHCbX9lpi9DDDw'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660594 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--e675238b--4f6c--5157--bfd7--95a1b3a689b7-osd--block--e675238b--4f6c--5157--bfd7--95a1b3a689b7'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-9U19bv-EwEy-Ks4f-MiiE-9ta0-FWks-EZUgCO', 'scsi-0QEMU_QEMU_HARDDISK_8cc5ffc1-09fb-4fde-a97f-bcebb46dacb6', 'scsi-SQEMU_QEMU_HARDDISK_8cc5ffc1-09fb-4fde-a97f-bcebb46dacb6'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660607 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660615 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ea69e1b5-a504-41c3-bb3a-5961a07ea8a6', 'scsi-SQEMU_QEMU_HARDDISK_ea69e1b5-a504-41c3-bb3a-5961a07ea8a6'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660624 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660632 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-02-56-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660640 | orchestrator | skipping: 
[testbed-node-4] 2026-01-06 00:59:15.660648 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660666 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660678 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660687 | orchestrator | skipping: 
[testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660695 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660703 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660721 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 
'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part1', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part14', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part15', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part16', 'scsi-SQEMU_QEMU_HARDDISK_f5c4e88c-4c87-4f6b-a240-eabfb6d80c22-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 
'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660736 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--0ba15c51--2e8d--5c95--884b--d45401cb60d9-osd--block--0ba15c51--2e8d--5c95--884b--d45401cb60d9'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zBUffM-PitN-uGRi-WCUM-hCv5-dceE-VL6GDm', 'scsi-0QEMU_QEMU_HARDDISK_a9899c49-22e0-485a-be63-69bc9e218eb5', 'scsi-SQEMU_QEMU_HARDDISK_a9899c49-22e0-485a-be63-69bc9e218eb5'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660745 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--588df21e--a0c0--57e7--8c43--2f77be274309-osd--block--588df21e--a0c0--57e7--8c43--2f77be274309'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-Kf4ZcO-WYqz-GeR6-hWC5-gIYh-YPXw-Qrj6Vh', 'scsi-0QEMU_QEMU_HARDDISK_2e071fd2-3317-4a54-af1f-e9b7971267a3', 'scsi-SQEMU_QEMU_HARDDISK_2e071fd2-3317-4a54-af1f-e9b7971267a3'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660753 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59', 'scsi-SQEMU_QEMU_HARDDISK_f5dfa6eb-99ab-4fee-90a0-8b2142cd9c59'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-01-06 00:59:15.660772 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-01-06-00-03-04-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 
'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-01-06 00:59:15.660781 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:59:15.660789 | orchestrator |
2026-01-06 00:59:15.660797 | orchestrator | TASK [ceph-facts : Check if the ceph conf exists] ******************************
2026-01-06 00:59:15.660805 | orchestrator | Tuesday 06 January 2026 00:57:20 +0000 (0:00:00.621) 0:00:19.150 *******
2026-01-06 00:59:15.660820 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:59:15.660828 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:59:15.660836 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:59:15.660844 | orchestrator |
2026-01-06 00:59:15.660852 | orchestrator | TASK [ceph-facts : Set default osd_pool_default_crush_rule fact] ***************
2026-01-06 00:59:15.660860 | orchestrator | Tuesday 06 January 2026 00:57:21 +0000 (0:00:00.661) 0:00:19.811 *******
2026-01-06 00:59:15.660868 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:59:15.660876 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:59:15.660884 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:59:15.660891 | orchestrator |
2026-01-06 00:59:15.660899 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2026-01-06 00:59:15.660907 | orchestrator | Tuesday 06 January 2026 00:57:21 +0000 (0:00:00.552) 0:00:20.364 *******
2026-01-06 00:59:15.660915 | orchestrator | ok: [testbed-node-3]
2026-01-06 00:59:15.660923 | orchestrator | ok: [testbed-node-4]
2026-01-06 00:59:15.660931 | orchestrator | ok: [testbed-node-5]
2026-01-06 00:59:15.660939 | orchestrator |
2026-01-06 00:59:15.660947 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2026-01-06 00:59:15.660955 | orchestrator | Tuesday 06 January 2026 00:57:22 +0000 (0:00:00.706) 0:00:21.070 *******
2026-01-06 00:59:15.660963 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:59:15.660971 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:59:15.660979 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:59:15.660987 | orchestrator |
2026-01-06 00:59:15.660995 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2026-01-06 00:59:15.661003 | orchestrator | Tuesday 06 January 2026 00:57:22 +0000 (0:00:00.428) 0:00:21.377 *******
2026-01-06 00:59:15.661011 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:59:15.661019 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:59:15.661026 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:59:15.661034 | orchestrator |
2026-01-06 00:59:15.661042 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2026-01-06 00:59:15.661050 | orchestrator | Tuesday 06 January 2026 00:57:23 +0000 (0:00:00.561) 0:00:21.806 *******
2026-01-06 00:59:15.661058 | orchestrator | skipping: [testbed-node-3]
2026-01-06 00:59:15.661066 | orchestrator | skipping: [testbed-node-4]
2026-01-06 00:59:15.661074 | orchestrator | skipping: [testbed-node-5]
2026-01-06 00:59:15.661087 | orchestrator |
2026-01-06 00:59:15.661095 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv4] *************************
2026-01-06 00:59:15.661103 | orchestrator | Tuesday 06 January 2026 00:57:23 +0000 (0:00:00.561) 0:00:22.368 *******
2026-01-06 00:59:15.661110 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-0)
2026-01-06 00:59:15.661118 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-0)
2026-01-06 00:59:15.661126 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-1)
2026-01-06 00:59:15.661134 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-0)
2026-01-06 00:59:15.661142 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-1)
2026-01-06 00:59:15.661150 | orchestrator |
ok: [testbed-node-3] => (item=testbed-node-2) 2026-01-06 00:59:15.661158 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-1) 2026-01-06 00:59:15.661165 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-2) 2026-01-06 00:59:15.661173 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-2) 2026-01-06 00:59:15.661181 | orchestrator | 2026-01-06 00:59:15.661189 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv6] ************************* 2026-01-06 00:59:15.661197 | orchestrator | Tuesday 06 January 2026 00:57:24 +0000 (0:00:00.919) 0:00:23.287 ******* 2026-01-06 00:59:15.661205 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)  2026-01-06 00:59:15.661213 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)  2026-01-06 00:59:15.661221 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)  2026-01-06 00:59:15.661228 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.661236 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)  2026-01-06 00:59:15.661244 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)  2026-01-06 00:59:15.661252 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)  2026-01-06 00:59:15.661259 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.661267 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)  2026-01-06 00:59:15.661275 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)  2026-01-06 00:59:15.661283 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)  2026-01-06 00:59:15.661291 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.661298 | orchestrator | 2026-01-06 00:59:15.661306 | orchestrator | TASK [ceph-facts : Import_tasks set_radosgw_address.yml] *********************** 2026-01-06 00:59:15.661314 | orchestrator | Tuesday 06 January 2026 00:57:24 +0000 (0:00:00.367) 0:00:23.655 ******* 2026-01-06 
00:59:15.661322 | orchestrator | included: /ansible/roles/ceph-facts/tasks/set_radosgw_address.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 00:59:15.661330 | orchestrator | 2026-01-06 00:59:15.661338 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] *** 2026-01-06 00:59:15.661347 | orchestrator | Tuesday 06 January 2026 00:57:25 +0000 (0:00:00.749) 0:00:24.405 ******* 2026-01-06 00:59:15.661359 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.661367 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.661391 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.661399 | orchestrator | 2026-01-06 00:59:15.661407 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] **** 2026-01-06 00:59:15.661415 | orchestrator | Tuesday 06 January 2026 00:57:25 +0000 (0:00:00.333) 0:00:24.738 ******* 2026-01-06 00:59:15.661423 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.661431 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.661439 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.661447 | orchestrator | 2026-01-06 00:59:15.661455 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] **** 2026-01-06 00:59:15.661463 | orchestrator | Tuesday 06 January 2026 00:57:26 +0000 (0:00:00.326) 0:00:25.065 ******* 2026-01-06 00:59:15.661471 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.661484 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.661497 | orchestrator | skipping: [testbed-node-5] 2026-01-06 00:59:15.661505 | orchestrator | 2026-01-06 00:59:15.661514 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address] *************** 2026-01-06 00:59:15.661522 | orchestrator | Tuesday 06 January 2026 00:57:26 +0000 (0:00:00.337) 0:00:25.403 ******* 2026-01-06 
00:59:15.661529 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:59:15.661537 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:59:15.661545 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:59:15.661553 | orchestrator | 2026-01-06 00:59:15.661561 | orchestrator | TASK [ceph-facts : Set_fact _interface] **************************************** 2026-01-06 00:59:15.661569 | orchestrator | Tuesday 06 January 2026 00:57:27 +0000 (0:00:00.988) 0:00:26.391 ******* 2026-01-06 00:59:15.661577 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-01-06 00:59:15.661585 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-01-06 00:59:15.661592 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-01-06 00:59:15.661600 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.661608 | orchestrator | 2026-01-06 00:59:15.661616 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ****** 2026-01-06 00:59:15.661624 | orchestrator | Tuesday 06 January 2026 00:57:28 +0000 (0:00:00.395) 0:00:26.787 ******* 2026-01-06 00:59:15.661632 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-01-06 00:59:15.661640 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-01-06 00:59:15.661648 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-01-06 00:59:15.661656 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.661664 | orchestrator | 2026-01-06 00:59:15.661671 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ****** 2026-01-06 00:59:15.661679 | orchestrator | Tuesday 06 January 2026 00:57:28 +0000 (0:00:00.376) 0:00:27.164 ******* 2026-01-06 00:59:15.661687 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-01-06 00:59:15.661695 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-01-06 00:59:15.661703 | 
orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-01-06 00:59:15.661711 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.661718 | orchestrator | 2026-01-06 00:59:15.661726 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] *************************** 2026-01-06 00:59:15.661734 | orchestrator | Tuesday 06 January 2026 00:57:28 +0000 (0:00:00.401) 0:00:27.565 ******* 2026-01-06 00:59:15.661742 | orchestrator | ok: [testbed-node-3] 2026-01-06 00:59:15.661750 | orchestrator | ok: [testbed-node-4] 2026-01-06 00:59:15.661758 | orchestrator | ok: [testbed-node-5] 2026-01-06 00:59:15.661766 | orchestrator | 2026-01-06 00:59:15.661774 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] ************************************* 2026-01-06 00:59:15.661782 | orchestrator | Tuesday 06 January 2026 00:57:29 +0000 (0:00:00.330) 0:00:27.896 ******* 2026-01-06 00:59:15.661789 | orchestrator | ok: [testbed-node-3] => (item=0) 2026-01-06 00:59:15.661797 | orchestrator | ok: [testbed-node-4] => (item=0) 2026-01-06 00:59:15.661805 | orchestrator | ok: [testbed-node-5] => (item=0) 2026-01-06 00:59:15.661813 | orchestrator | 2026-01-06 00:59:15.661821 | orchestrator | TASK [ceph-facts : Set_fact ceph_run_cmd] ************************************** 2026-01-06 00:59:15.661829 | orchestrator | Tuesday 06 January 2026 00:57:29 +0000 (0:00:00.529) 0:00:28.425 ******* 2026-01-06 00:59:15.661837 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2026-01-06 00:59:15.661844 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-01-06 00:59:15.661852 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-01-06 00:59:15.661860 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3) 2026-01-06 00:59:15.661868 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => 
(item=testbed-node-4) 2026-01-06 00:59:15.661876 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5) 2026-01-06 00:59:15.661890 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager) 2026-01-06 00:59:15.661898 | orchestrator | 2026-01-06 00:59:15.661906 | orchestrator | TASK [ceph-facts : Set_fact ceph_admin_command] ******************************** 2026-01-06 00:59:15.661914 | orchestrator | Tuesday 06 January 2026 00:57:30 +0000 (0:00:01.130) 0:00:29.556 ******* 2026-01-06 00:59:15.661922 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2026-01-06 00:59:15.661930 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-01-06 00:59:15.661938 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-01-06 00:59:15.661946 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3) 2026-01-06 00:59:15.661954 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4) 2026-01-06 00:59:15.661962 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5) 2026-01-06 00:59:15.661973 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager) 2026-01-06 00:59:15.661981 | orchestrator | 2026-01-06 00:59:15.661990 | orchestrator | TASK [Include tasks from the ceph-osd role] ************************************ 2026-01-06 00:59:15.661997 | orchestrator | Tuesday 06 January 2026 00:57:32 +0000 (0:00:02.072) 0:00:31.629 ******* 2026-01-06 00:59:15.662005 | orchestrator | skipping: [testbed-node-3] 2026-01-06 00:59:15.662088 | orchestrator | skipping: [testbed-node-4] 2026-01-06 00:59:15.662100 | orchestrator | included: /ansible/tasks/openstack_config.yml for testbed-node-5 2026-01-06 00:59:15.662108 | orchestrator | 2026-01-06 00:59:15.662116 | 
orchestrator | TASK [create openstack pool(s)] ************************************************ 2026-01-06 00:59:15.662124 | orchestrator | Tuesday 06 January 2026 00:57:33 +0000 (0:00:00.401) 0:00:32.030 ******* 2026-01-06 00:59:15.662139 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'backups', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-01-06 00:59:15.662148 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'volumes', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-01-06 00:59:15.662157 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'images', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-01-06 00:59:15.662165 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'metrics', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-01-06 00:59:15.662173 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'vms', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-01-06 00:59:15.662181 | orchestrator | 2026-01-06 00:59:15.662189 | orchestrator | TASK [generate keys] 
*********************************************************** 2026-01-06 00:59:15.662197 | orchestrator | Tuesday 06 January 2026 00:58:21 +0000 (0:00:47.893) 0:01:19.924 ******* 2026-01-06 00:59:15.662205 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662213 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662221 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662239 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662246 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662254 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662262 | orchestrator | changed: [testbed-node-5 -> {{ groups[mon_group_name][0] }}] 2026-01-06 00:59:15.662270 | orchestrator | 2026-01-06 00:59:15.662278 | orchestrator | TASK [get keys from monitors] ************************************************** 2026-01-06 00:59:15.662285 | orchestrator | Tuesday 06 January 2026 00:58:45 +0000 (0:00:24.000) 0:01:43.925 ******* 2026-01-06 00:59:15.662293 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662301 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662309 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662316 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662324 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662332 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662340 | orchestrator | 
ok: [testbed-node-5 -> {{ groups.get(mon_group_name)[0] }}] 2026-01-06 00:59:15.662347 | orchestrator | 2026-01-06 00:59:15.662355 | orchestrator | TASK [copy ceph key(s) if needed] ********************************************** 2026-01-06 00:59:15.662363 | orchestrator | Tuesday 06 January 2026 00:58:57 +0000 (0:00:12.108) 0:01:56.034 ******* 2026-01-06 00:59:15.662371 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662422 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-01-06 00:59:15.662430 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-01-06 00:59:15.662438 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662446 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-01-06 00:59:15.662460 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-01-06 00:59:15.662468 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662476 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-01-06 00:59:15.662484 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-01-06 00:59:15.662492 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662500 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-01-06 00:59:15.662508 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-01-06 00:59:15.662524 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662538 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 
2026-01-06 00:59:15.662546 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-01-06 00:59:15.662554 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-01-06 00:59:15.662562 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2026-01-06 00:59:15.662569 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2026-01-06 00:59:15.662578 | orchestrator | changed: [testbed-node-5 -> {{ item.1 }}] 2026-01-06 00:59:15.662585 | orchestrator | 2026-01-06 00:59:15.662593 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:59:15.662601 | orchestrator | testbed-node-3 : ok=25  changed=0 unreachable=0 failed=0 skipped=28  rescued=0 ignored=0 2026-01-06 00:59:15.662619 | orchestrator | testbed-node-4 : ok=18  changed=0 unreachable=0 failed=0 skipped=21  rescued=0 ignored=0 2026-01-06 00:59:15.662627 | orchestrator | testbed-node-5 : ok=23  changed=3  unreachable=0 failed=0 skipped=20  rescued=0 ignored=0 2026-01-06 00:59:15.662635 | orchestrator | 2026-01-06 00:59:15.662643 | orchestrator | 2026-01-06 00:59:15.662650 | orchestrator | 2026-01-06 00:59:15.662658 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:59:15.662666 | orchestrator | Tuesday 06 January 2026 00:59:14 +0000 (0:00:17.557) 0:02:13.591 ******* 2026-01-06 00:59:15.662674 | orchestrator | =============================================================================== 2026-01-06 00:59:15.662681 | orchestrator | create openstack pool(s) ----------------------------------------------- 47.89s 2026-01-06 00:59:15.662689 | orchestrator | generate keys ---------------------------------------------------------- 24.00s 2026-01-06 00:59:15.662697 | orchestrator | copy ceph key(s) if needed --------------------------------------------- 17.56s 
2026-01-06 00:59:15.662705 | orchestrator | get keys from monitors ------------------------------------------------- 12.11s 2026-01-06 00:59:15.662713 | orchestrator | ceph-facts : Get current fsid if cluster is already running ------------- 2.73s 2026-01-06 00:59:15.662720 | orchestrator | ceph-facts : Find a running mon container ------------------------------- 2.26s 2026-01-06 00:59:15.662728 | orchestrator | ceph-facts : Set_fact ceph_admin_command -------------------------------- 2.07s 2026-01-06 00:59:15.662736 | orchestrator | ceph-facts : Set_fact ceph_run_cmd -------------------------------------- 1.13s 2026-01-06 00:59:15.662744 | orchestrator | ceph-facts : Set_fact _radosgw_address to radosgw_address --------------- 0.99s 2026-01-06 00:59:15.662751 | orchestrator | ceph-facts : Set_fact _monitor_addresses - ipv4 ------------------------- 0.92s 2026-01-06 00:59:15.662759 | orchestrator | ceph-facts : Check if podman binary is present -------------------------- 0.90s 2026-01-06 00:59:15.662767 | orchestrator | ceph-facts : Check if the ceph mon socket is in-use --------------------- 0.82s 2026-01-06 00:59:15.662775 | orchestrator | ceph-facts : Import_tasks set_radosgw_address.yml ----------------------- 0.75s 2026-01-06 00:59:15.662783 | orchestrator | ceph-facts : Read osd pool default crush rule --------------------------- 0.71s 2026-01-06 00:59:15.662790 | orchestrator | ceph-facts : Include facts.yml ------------------------------------------ 0.69s 2026-01-06 00:59:15.662798 | orchestrator | ceph-facts : Check for a ceph mon socket -------------------------------- 0.67s 2026-01-06 00:59:15.662806 | orchestrator | ceph-facts : Set_fact monitor_name ansible_facts['hostname'] ------------ 0.66s 2026-01-06 00:59:15.662814 | orchestrator | ceph-facts : Check if the ceph conf exists ------------------------------ 0.66s 2026-01-06 00:59:15.662822 | orchestrator | ceph-facts : Check if it is atomic host --------------------------------- 0.66s 2026-01-06 
00:59:15.662829 | orchestrator | ceph-facts : Set_fact devices generate device list when osd_auto_discovery --- 0.62s 2026-01-06 00:59:15.662837 | orchestrator | 2026-01-06 00:59:15 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:15.662845 | orchestrator | 2026-01-06 00:59:15 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:15.662853 | orchestrator | 2026-01-06 00:59:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:18.731282 | orchestrator | 2026-01-06 00:59:18 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state STARTED 2026-01-06 00:59:18.732900 | orchestrator | 2026-01-06 00:59:18 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:18.733729 | orchestrator | 2026-01-06 00:59:18 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:18.733791 | orchestrator | 2026-01-06 00:59:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:21.770012 | orchestrator | 2026-01-06 00:59:21 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state STARTED 2026-01-06 00:59:21.771523 | orchestrator | 2026-01-06 00:59:21 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:21.773890 | orchestrator | 2026-01-06 00:59:21 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:21.773930 | orchestrator | 2026-01-06 00:59:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:24.825340 | orchestrator | 2026-01-06 00:59:24 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state STARTED 2026-01-06 00:59:24.826642 | orchestrator | 2026-01-06 00:59:24 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:24.828555 | orchestrator | 2026-01-06 00:59:24 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:24.829534 | orchestrator | 
2026-01-06 00:59:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:27.874910 | orchestrator | 2026-01-06 00:59:27 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state STARTED 2026-01-06 00:59:27.876766 | orchestrator | 2026-01-06 00:59:27 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:27.878565 | orchestrator | 2026-01-06 00:59:27 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:27.878698 | orchestrator | 2026-01-06 00:59:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:30.925494 | orchestrator | 2026-01-06 00:59:30 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state STARTED 2026-01-06 00:59:30.927231 | orchestrator | 2026-01-06 00:59:30 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:30.929918 | orchestrator | 2026-01-06 00:59:30 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:30.929966 | orchestrator | 2026-01-06 00:59:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:33.979468 | orchestrator | 2026-01-06 00:59:33 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state STARTED 2026-01-06 00:59:33.980458 | orchestrator | 2026-01-06 00:59:33 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:33.981849 | orchestrator | 2026-01-06 00:59:33 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:33.981976 | orchestrator | 2026-01-06 00:59:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:37.012039 | orchestrator | 2026-01-06 00:59:37 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state STARTED 2026-01-06 00:59:37.012883 | orchestrator | 2026-01-06 00:59:37 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:37.013930 | orchestrator | 2026-01-06 00:59:37 | INFO  | Task 
0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:37.013987 | orchestrator | 2026-01-06 00:59:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:40.089462 | orchestrator | 2026-01-06 00:59:40 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state STARTED 2026-01-06 00:59:40.089626 | orchestrator | 2026-01-06 00:59:40 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:40.090267 | orchestrator | 2026-01-06 00:59:40 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:40.091469 | orchestrator | 2026-01-06 00:59:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:43.139608 | orchestrator | 2026-01-06 00:59:43 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state STARTED 2026-01-06 00:59:43.140781 | orchestrator | 2026-01-06 00:59:43 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:43.142709 | orchestrator | 2026-01-06 00:59:43 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state STARTED 2026-01-06 00:59:43.142931 | orchestrator | 2026-01-06 00:59:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:46.184535 | orchestrator | 2026-01-06 00:59:46 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state STARTED 2026-01-06 00:59:46.184621 | orchestrator | 2026-01-06 00:59:46 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:46.185496 | orchestrator | 2026-01-06 00:59:46 | INFO  | Task 0521d681-36c1-4027-ae34-e1cc4a7ef294 is in state SUCCESS 2026-01-06 00:59:46.189287 | orchestrator | 2026-01-06 00:59:46.189445 | orchestrator | 2026-01-06 00:59:46.189467 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-01-06 00:59:46.189481 | orchestrator | 2026-01-06 00:59:46.189492 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 
2026-01-06 00:59:46.189504 | orchestrator | Tuesday 06 January 2026 00:58:42 +0000 (0:00:00.270) 0:00:00.270 ******* 2026-01-06 00:59:46.189516 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:59:46.189528 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:59:46.189539 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:59:46.189550 | orchestrator | 2026-01-06 00:59:46.189562 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-01-06 00:59:46.190672 | orchestrator | Tuesday 06 January 2026 00:58:42 +0000 (0:00:00.310) 0:00:00.581 ******* 2026-01-06 00:59:46.190712 | orchestrator | ok: [testbed-node-0] => (item=enable_keystone_True) 2026-01-06 00:59:46.190724 | orchestrator | ok: [testbed-node-1] => (item=enable_keystone_True) 2026-01-06 00:59:46.190735 | orchestrator | ok: [testbed-node-2] => (item=enable_keystone_True) 2026-01-06 00:59:46.190746 | orchestrator | 2026-01-06 00:59:46.190757 | orchestrator | PLAY [Apply role keystone] ***************************************************** 2026-01-06 00:59:46.190768 | orchestrator | 2026-01-06 00:59:46.190778 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2026-01-06 00:59:46.190789 | orchestrator | Tuesday 06 January 2026 00:58:43 +0000 (0:00:00.501) 0:00:01.083 ******* 2026-01-06 00:59:46.190800 | orchestrator | included: /ansible/roles/keystone/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:59:46.190812 | orchestrator | 2026-01-06 00:59:46.190823 | orchestrator | TASK [keystone : Ensuring config directories exist] **************************** 2026-01-06 00:59:46.190834 | orchestrator | Tuesday 06 January 2026 00:58:43 +0000 (0:00:00.564) 0:00:01.647 ******* 2026-01-06 00:59:46.190851 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 
'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.190869 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance 
roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.190956 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.190982 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-01-06 00:59:46.190997 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 
'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-01-06 00:59:46.191011 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-01-06 00:59:46.191033 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.191066 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': 
['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.191086 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.191108 | orchestrator | 2026-01-06 00:59:46.191123 | orchestrator | TASK [keystone : Check if policies shall be overwritten] *********************** 2026-01-06 00:59:46.191173 | orchestrator | Tuesday 06 January 2026 00:58:45 +0000 (0:00:01.998) 0:00:03.646 ******* 2026-01-06 00:59:46.191187 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.191198 | orchestrator | 2026-01-06 00:59:46.191210 | orchestrator | TASK [keystone : Set keystone policy file] ************************************* 2026-01-06 00:59:46.191221 | orchestrator | Tuesday 06 January 2026 00:58:45 +0000 (0:00:00.143) 0:00:03.790 ******* 2026-01-06 00:59:46.191232 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.191243 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:59:46.191254 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:59:46.191265 | orchestrator | 2026-01-06 00:59:46.191279 | orchestrator | TASK [keystone : Check if 
Keystone domain-specific config is supplied] ********* 2026-01-06 00:59:46.191299 | orchestrator | Tuesday 06 January 2026 00:58:46 +0000 (0:00:00.482) 0:00:04.272 ******* 2026-01-06 00:59:46.191312 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-01-06 00:59:46.191325 | orchestrator | 2026-01-06 00:59:46.191337 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2026-01-06 00:59:46.191378 | orchestrator | Tuesday 06 January 2026 00:58:47 +0000 (0:00:00.884) 0:00:05.156 ******* 2026-01-06 00:59:46.191392 | orchestrator | included: /ansible/roles/keystone/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 00:59:46.191404 | orchestrator | 2026-01-06 00:59:46.191416 | orchestrator | TASK [service-cert-copy : keystone | Copying over extra CA certificates] ******* 2026-01-06 00:59:46.191429 | orchestrator | Tuesday 06 January 2026 00:58:47 +0000 (0:00:00.562) 0:00:05.719 ******* 2026-01-06 00:59:46.191444 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 
'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.191468 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.191516 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 
'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.191535 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-01-06 00:59:46.191548 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-01-06 00:59:46.191560 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-01-06 00:59:46.191578 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.191590 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.191601 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.191612 | orchestrator | 2026-01-06 00:59:46.191623 | orchestrator | TASK [service-cert-copy : keystone | Copying over backend internal TLS certificate] *** 2026-01-06 00:59:46.191634 | orchestrator | Tuesday 06 January 2026 00:58:51 +0000 (0:00:03.704) 0:00:09.424 ******* 2026-01-06 00:59:46.191683 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:59:46.191697 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.191716 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:59:46.191727 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.191739 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': 
['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:59:46.191752 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.191794 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:59:46.191807 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:59:46.191824 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:59:46.191844 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.191857 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:59:46.191878 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:59:46.191897 | orchestrator | 2026-01-06 00:59:46.191916 | orchestrator | TASK [service-cert-copy : keystone | 
Copying over backend internal TLS key] **** 2026-01-06 00:59:46.191935 | orchestrator | Tuesday 06 January 2026 00:58:52 +0000 (0:00:00.639) 0:00:10.063 ******* 2026-01-06 00:59:46.191955 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:59:46.192036 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.192062 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:59:46.192094 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.192116 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:59:46.192137 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': 
['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.192153 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:59:46.192164 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:59:46.192224 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': 
True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:59:46.192239 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.192259 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:59:46.192271 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:59:46.192282 | orchestrator | 2026-01-06 00:59:46.192293 | orchestrator | TASK [keystone : Copying over config.json files for services] ****************** 2026-01-06 00:59:46.192304 | orchestrator | Tuesday 06 January 2026 00:58:53 +0000 (0:00:00.827) 0:00:10.891 ******* 2026-01-06 00:59:46.192315 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 
'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.192328 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 
'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.192406 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.192430 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-01-06 00:59:46.192442 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 
'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-01-06 00:59:46.192453 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-01-06 00:59:46.192465 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.192476 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': 
['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.192522 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.192541 | orchestrator | 2026-01-06 00:59:46.192553 | orchestrator | TASK [keystone : Copying over keystone.conf] *********************************** 2026-01-06 00:59:46.192564 | orchestrator | Tuesday 06 January 2026 00:58:56 +0000 (0:00:03.369) 0:00:14.260 ******* 2026-01-06 00:59:46.192576 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': 
{'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.192588 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.192600 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 
'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.192612 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.192666 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.192681 | orchestrator | skipping: [testbed-node-2] 
=> (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.192693 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.192705 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.192716 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 
'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.192728 | orchestrator | 2026-01-06 00:59:46.192739 | orchestrator | TASK [keystone : Copying keystone-startup script for keystone] ***************** 2026-01-06 00:59:46.192750 | orchestrator | Tuesday 06 January 2026 00:59:02 +0000 (0:00:05.759) 0:00:20.020 ******* 2026-01-06 00:59:46.192761 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:59:46.192773 | orchestrator | changed: [testbed-node-1] 2026-01-06 00:59:46.192784 | orchestrator | changed: [testbed-node-2] 2026-01-06 00:59:46.192802 | orchestrator | 2026-01-06 00:59:46.192812 | orchestrator | TASK [keystone : Create Keystone domain-specific config directory] ************* 2026-01-06 00:59:46.192823 | orchestrator | Tuesday 06 January 2026 00:59:03 +0000 (0:00:01.515) 0:00:21.535 ******* 2026-01-06 00:59:46.192834 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.192845 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:59:46.192856 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:59:46.192867 | orchestrator | 2026-01-06 00:59:46.192878 | orchestrator | TASK [keystone : Get file list in custom domains folder] *********************** 2026-01-06 00:59:46.192918 | orchestrator | Tuesday 06 January 2026 00:59:04 +0000 (0:00:00.507) 0:00:22.043 ******* 2026-01-06 00:59:46.192931 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.192942 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:59:46.192953 | orchestrator | skipping: 
[testbed-node-2] 2026-01-06 00:59:46.192964 | orchestrator | 2026-01-06 00:59:46.192974 | orchestrator | TASK [keystone : Copying Keystone Domain specific settings] ******************** 2026-01-06 00:59:46.192985 | orchestrator | Tuesday 06 January 2026 00:59:04 +0000 (0:00:00.270) 0:00:22.314 ******* 2026-01-06 00:59:46.192996 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.193007 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:59:46.193032 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:59:46.193053 | orchestrator | 2026-01-06 00:59:46.193072 | orchestrator | TASK [keystone : Copying over existing policy file] **************************** 2026-01-06 00:59:46.193092 | orchestrator | Tuesday 06 January 2026 00:59:04 +0000 (0:00:00.412) 0:00:22.727 ******* 2026-01-06 00:59:46.193111 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:59:46.193135 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.193157 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:59:46.193178 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.193199 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': 
False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:59:46.193257 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.193272 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:59:46.193283 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:59:46.193295 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': 
['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:59:46.193306 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.193324 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:59:46.193335 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:59:46.193382 | orchestrator | 2026-01-06 00:59:46.193394 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2026-01-06 00:59:46.193405 | orchestrator | Tuesday 06 January 2026 00:59:05 +0000 (0:00:00.563) 0:00:23.290 ******* 2026-01-06 00:59:46.193416 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.193427 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:59:46.193438 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:59:46.193448 | orchestrator | 2026-01-06 00:59:46.193459 | orchestrator | TASK [keystone : Copying over wsgi-keystone.conf] ****************************** 2026-01-06 00:59:46.193470 | orchestrator | Tuesday 06 January 2026 00:59:05 +0000 (0:00:00.267) 0:00:23.558 ******* 2026-01-06 00:59:46.193481 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2) 2026-01-06 00:59:46.193525 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2) 2026-01-06 00:59:46.193538 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2) 2026-01-06 00:59:46.193549 | orchestrator | 2026-01-06 00:59:46.193560 | orchestrator | TASK [keystone : Checking whether keystone-paste.ini file exists] ************** 2026-01-06 00:59:46.193571 | orchestrator | Tuesday 06 January 2026 00:59:07 +0000 (0:00:01.430) 0:00:24.989 ******* 2026-01-06 00:59:46.193581 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-01-06 00:59:46.193592 | orchestrator | 2026-01-06 00:59:46.193608 | orchestrator | TASK [keystone : Copying over keystone-paste.ini] ****************************** 2026-01-06 00:59:46.193620 | orchestrator | Tuesday 
06 January 2026 00:59:08 +0000 (0:00:00.997) 0:00:25.986 ******* 2026-01-06 00:59:46.193631 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.193641 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:59:46.193652 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:59:46.193663 | orchestrator | 2026-01-06 00:59:46.193674 | orchestrator | TASK [keystone : Generate the required cron jobs for the node] ***************** 2026-01-06 00:59:46.193685 | orchestrator | Tuesday 06 January 2026 00:59:09 +0000 (0:00:00.949) 0:00:26.936 ******* 2026-01-06 00:59:46.193696 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-01-06 00:59:46.193706 | orchestrator | ok: [testbed-node-2 -> localhost] 2026-01-06 00:59:46.193717 | orchestrator | ok: [testbed-node-1 -> localhost] 2026-01-06 00:59:46.193728 | orchestrator | 2026-01-06 00:59:46.193739 | orchestrator | TASK [keystone : Set fact with the generated cron jobs for building the crontab later] *** 2026-01-06 00:59:46.193750 | orchestrator | Tuesday 06 January 2026 00:59:10 +0000 (0:00:01.291) 0:00:28.228 ******* 2026-01-06 00:59:46.193761 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:59:46.193772 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:59:46.193783 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:59:46.193794 | orchestrator | 2026-01-06 00:59:46.193804 | orchestrator | TASK [keystone : Copying files for keystone-fernet] **************************** 2026-01-06 00:59:46.193815 | orchestrator | Tuesday 06 January 2026 00:59:10 +0000 (0:00:00.316) 0:00:28.544 ******* 2026-01-06 00:59:46.193826 | orchestrator | changed: [testbed-node-1] => (item={'src': 'crontab.j2', 'dest': 'crontab'}) 2026-01-06 00:59:46.193837 | orchestrator | changed: [testbed-node-0] => (item={'src': 'crontab.j2', 'dest': 'crontab'}) 2026-01-06 00:59:46.193848 | orchestrator | changed: [testbed-node-2] => (item={'src': 'crontab.j2', 'dest': 'crontab'}) 2026-01-06 00:59:46.193868 | orchestrator | changed: 
[testbed-node-0] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'}) 2026-01-06 00:59:46.193880 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'}) 2026-01-06 00:59:46.193890 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'}) 2026-01-06 00:59:46.193902 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'}) 2026-01-06 00:59:46.193913 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'}) 2026-01-06 00:59:46.193924 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'}) 2026-01-06 00:59:46.193934 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'}) 2026-01-06 00:59:46.193945 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'}) 2026-01-06 00:59:46.193956 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'}) 2026-01-06 00:59:46.193966 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'}) 2026-01-06 00:59:46.193978 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'}) 2026-01-06 00:59:46.193988 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'}) 2026-01-06 00:59:46.194000 | orchestrator | changed: [testbed-node-0] => (item={'src': 'id_rsa', 'dest': 'id_rsa'}) 2026-01-06 00:59:46.194011 | orchestrator | changed: [testbed-node-1] => (item={'src': 'id_rsa', 'dest': 'id_rsa'}) 2026-01-06 00:59:46.194060 | orchestrator | changed: [testbed-node-2] => (item={'src': 'id_rsa', 'dest': 'id_rsa'}) 
2026-01-06 00:59:46.194071 | orchestrator | changed: [testbed-node-0] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'}) 2026-01-06 00:59:46.194082 | orchestrator | changed: [testbed-node-1] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'}) 2026-01-06 00:59:46.194093 | orchestrator | changed: [testbed-node-2] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'}) 2026-01-06 00:59:46.194104 | orchestrator | 2026-01-06 00:59:46.194115 | orchestrator | TASK [keystone : Copying files for keystone-ssh] ******************************* 2026-01-06 00:59:46.194126 | orchestrator | Tuesday 06 January 2026 00:59:20 +0000 (0:00:09.533) 0:00:38.078 ******* 2026-01-06 00:59:46.194137 | orchestrator | changed: [testbed-node-0] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'}) 2026-01-06 00:59:46.194148 | orchestrator | changed: [testbed-node-1] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'}) 2026-01-06 00:59:46.194159 | orchestrator | changed: [testbed-node-2] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'}) 2026-01-06 00:59:46.194170 | orchestrator | changed: [testbed-node-0] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'}) 2026-01-06 00:59:46.194220 | orchestrator | changed: [testbed-node-1] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'}) 2026-01-06 00:59:46.194239 | orchestrator | changed: [testbed-node-2] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'}) 2026-01-06 00:59:46.194266 | orchestrator | 2026-01-06 00:59:46.194287 | orchestrator | TASK [service-check-containers : keystone | Check containers] ****************** 2026-01-06 00:59:46.194305 | orchestrator | Tuesday 06 January 2026 00:59:23 +0000 (0:00:02.782) 0:00:40.861 ******* 2026-01-06 00:59:46.194384 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': 
['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.194423 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.194445 
| orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-01-06 00:59:46.194475 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-01-06 00:59:46.194503 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': 
['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-01-06 00:59:46.194531 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-01-06 00:59:46.194543 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.194555 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.194566 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-01-06 00:59:46.194578 | orchestrator | 2026-01-06 00:59:46.194588 | orchestrator | TASK [service-check-containers : keystone | Notify handlers to restart containers] *** 2026-01-06 00:59:46.194599 | orchestrator | Tuesday 06 January 2026 00:59:25 +0000 (0:00:02.468) 0:00:43.330 ******* 2026-01-06 00:59:46.194610 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 00:59:46.194621 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:59:46.194632 | orchestrator | } 2026-01-06 00:59:46.194643 | orchestrator | changed: [testbed-node-1] => { 2026-01-06 00:59:46.194654 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:59:46.194665 | orchestrator | } 2026-01-06 00:59:46.194676 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 00:59:46.194686 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 00:59:46.194697 | orchestrator | } 2026-01-06 00:59:46.194708 | orchestrator | 2026-01-06 00:59:46.194719 | orchestrator | TASK [service-check-containers : Include tasks] 
******************************** 2026-01-06 00:59:46.194730 | orchestrator | Tuesday 06 January 2026 00:59:25 +0000 (0:00:00.359) 0:00:43.689 ******* 2026-01-06 00:59:46.194758 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:59:46.194777 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.194789 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 
'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:59:46.194801 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.194813 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:59:46.194826 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': 
['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.194843 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:59:46.194862 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:59:46.194880 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2025.1', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': 
True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-01-06 00:59:46.194892 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2025.1', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-01-06 00:59:46.194904 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2025.1', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-01-06 00:59:46.194915 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:59:46.194926 | orchestrator | 2026-01-06 00:59:46.194937 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2026-01-06 00:59:46.194948 | orchestrator | Tuesday 06 January 2026 00:59:26 +0000 (0:00:01.040) 0:00:44.730 ******* 2026-01-06 00:59:46.194959 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.194970 | orchestrator | skipping: [testbed-node-1] 
2026-01-06 00:59:46.194981 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:59:46.194992 | orchestrator | 2026-01-06 00:59:46.195003 | orchestrator | TASK [keystone : Creating keystone database] *********************************** 2026-01-06 00:59:46.195014 | orchestrator | Tuesday 06 January 2026 00:59:27 +0000 (0:00:00.318) 0:00:45.049 ******* 2026-01-06 00:59:46.195025 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:59:46.195036 | orchestrator | 2026-01-06 00:59:46.195046 | orchestrator | TASK [keystone : Creating Keystone database user and setting permissions] ****** 2026-01-06 00:59:46.195057 | orchestrator | Tuesday 06 January 2026 00:59:29 +0000 (0:00:02.222) 0:00:47.271 ******* 2026-01-06 00:59:46.195068 | orchestrator | changed: [testbed-node-0] 2026-01-06 00:59:46.195079 | orchestrator | 2026-01-06 00:59:46.195089 | orchestrator | TASK [keystone : Checking for any running keystone_fernet containers] ********** 2026-01-06 00:59:46.195100 | orchestrator | Tuesday 06 January 2026 00:59:31 +0000 (0:00:02.142) 0:00:49.414 ******* 2026-01-06 00:59:46.195119 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:59:46.195131 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:59:46.195142 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:59:46.195153 | orchestrator | 2026-01-06 00:59:46.195164 | orchestrator | TASK [keystone : Group nodes where keystone_fernet is running] ***************** 2026-01-06 00:59:46.195175 | orchestrator | Tuesday 06 January 2026 00:59:32 +0000 (0:00:00.956) 0:00:50.371 ******* 2026-01-06 00:59:46.195187 | orchestrator | ok: [testbed-node-0] 2026-01-06 00:59:46.195198 | orchestrator | ok: [testbed-node-1] 2026-01-06 00:59:46.195209 | orchestrator | ok: [testbed-node-2] 2026-01-06 00:59:46.195220 | orchestrator | 2026-01-06 00:59:46.195231 | orchestrator | TASK [keystone : Fail if any hosts need bootstrapping and not all hosts targeted] *** 2026-01-06 00:59:46.195242 | orchestrator | Tuesday 06 January 2026 00:59:32 
+0000 (0:00:00.306) 0:00:50.677 ******* 2026-01-06 00:59:46.195252 | orchestrator | skipping: [testbed-node-0] 2026-01-06 00:59:46.195263 | orchestrator | skipping: [testbed-node-1] 2026-01-06 00:59:46.195275 | orchestrator | skipping: [testbed-node-2] 2026-01-06 00:59:46.195286 | orchestrator | 2026-01-06 00:59:46.195296 | orchestrator | TASK [keystone : Running Keystone bootstrap container] ************************* 2026-01-06 00:59:46.195313 | orchestrator | Tuesday 06 January 2026 00:59:33 +0000 (0:00:00.431) 0:00:51.108 ******* 2026-01-06 00:59:46.195549 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "Container exited with non-zero return code 1", "rc": 1, "stderr": "+ sudo -E kolla_set_configs\n2026-01-06 00:59:34.847 INFO Loading config file at /var/lib/kolla/config_files/config.json\n2026-01-06 00:59:34.847 INFO Validating config file\n2026-01-06 00:59:34.847 INFO Kolla config strategy set to: COPY_ALWAYS\n2026-01-06 00:59:34.853 INFO Copying service configuration files\n2026-01-06 00:59:34.853 INFO Copying /var/lib/kolla/config_files/keystone-startup.sh to /usr/bin/keystone-startup.sh\n2026-01-06 00:59:34.861 INFO Setting permission for /usr/bin/keystone-startup.sh\n2026-01-06 00:59:34.861 INFO Copying /var/lib/kolla/config_files/keystone.conf to /etc/keystone/keystone.conf\n2026-01-06 00:59:34.861 INFO Setting permission for /etc/keystone/keystone.conf\n2026-01-06 00:59:34.861 INFO Copying /var/lib/kolla/config_files/wsgi-keystone.conf to /etc/apache2/conf-enabled/wsgi-keystone.conf\n2026-01-06 00:59:34.871 INFO Setting permission for /etc/apache2/conf-enabled/wsgi-keystone.conf\n2026-01-06 00:59:34.872 INFO Creating directory /var/lib/kolla/share/ca-certificates\n2026-01-06 00:59:34.872 INFO Setting permission for /var/lib/kolla/share/ca-certificates\n2026-01-06 00:59:34.873 INFO Copying /var/lib/kolla/config_files/ca-certificates/testbed.crt to /var/lib/kolla/share/ca-certificates/testbed.crt\n2026-01-06 00:59:34.873 INFO 
Setting permission for /var/lib/kolla/share/ca-certificates/testbed.crt\n2026-01-06 00:59:34.874 INFO Writing out command to execute\n2026-01-06 00:59:34.874 INFO Setting permission for /var/log/kolla\n2026-01-06 00:59:34.875 INFO Setting permission for /etc/keystone/fernet-keys\n++ cat /run_command\n+ CMD=/usr/bin/keystone-startup.sh\n+ ARGS=\n+ sudo kolla_copy_cacerts\nrehash: warning: skipping ca-certificates.crt,it does not contain exactly one certificate or CRL\n+ sudo kolla_install_projects\n+ [[ ! -n '' ]]\n+ . kolla_extend_start\n++ KEYSTONE_LOG_DIR=/var/log/kolla/keystone\n++ [[ ! -d /var/log/kolla/keystone ]]\n++ mkdir -p /var/log/kolla/keystone\n+++ stat -c %U:%G /var/log/kolla/keystone\n++ [[ root:kolla != \\k\\e\\y\\s\\t\\o\\n\\e\\:\\k\\o\\l\\l\\a ]]\n++ chown keystone:kolla /var/log/kolla/keystone\n++ '[' '!' -f /var/log/kolla/keystone/keystone.log ']'\n++ touch /var/log/kolla/keystone/keystone.log\n+++ stat -c %U:%G /var/log/kolla/keystone/keystone.log\n++ [[ root:kolla != \\k\\e\\y\\s\\t\\o\\n\\e\\:\\k\\e\\y\\s\\t\\o\\n\\e ]]\n++ chown keystone:keystone /var/log/kolla/keystone/keystone.log\n+++ stat -c %a /var/log/kolla/keystone\n++ [[ 2755 != \\7\\5\\5 ]]\n++ chmod 755 /var/log/kolla/keystone\n++ EXTRA_KEYSTONE_MANAGE_ARGS=\n++ [[ -n '' ]]\n++ [[ -n '' ]]\n++ [[ -n 0 ]]\n++ sudo -H -u keystone keystone-manage db_sync\n2026-01-06 00:59:43.915 1081 DEBUG oslo_db.sqlalchemy.engines [-] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/engines.py:397\n2026-01-06 00:59:43.921 1081 CRITICAL keystone [-] Unhandled error: sqlalchemy.exc.OperationalError: (pymysql.err.OperationalError) (1193, \"Unknown system variable 'transaction_isolation'\")\n(Background on this error at: https://sqlalche.me/e/20/e3q8)\n2026-01-06 00:59:43.921 1081 ERROR 
keystone Traceback (most recent call last):\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 146, in __init__\n2026-01-06 00:59:43.921 1081 ERROR keystone self._dbapi_connection = engine.raw_connection()\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 3298, in raw_connection\n2026-01-06 00:59:43.921 1081 ERROR keystone return self.pool.connect()\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 449, in connect\n2026-01-06 00:59:43.921 1081 ERROR keystone return _ConnectionFairy._checkout(self)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 1263, in _checkout\n2026-01-06 00:59:43.921 1081 ERROR keystone fairy = _ConnectionRecord.checkout(pool)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 712, in checkout\n2026-01-06 00:59:43.921 1081 ERROR keystone rec = pool._do_get()\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/impl.py\", line 179, in _do_get\n2026-01-06 00:59:43.921 1081 ERROR keystone with util.safe_reraise():\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/util/langhelpers.py\", line 146, in __exit__\n2026-01-06 00:59:43.921 1081 ERROR keystone raise 
exc_value.with_traceback(exc_tb)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/impl.py\", line 177, in _do_get\n2026-01-06 00:59:43.921 1081 ERROR keystone return self._create_connection()\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 390, in _create_connection\n2026-01-06 00:59:43.921 1081 ERROR keystone return _ConnectionRecord(self)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 674, in __init__\n2026-01-06 00:59:43.921 1081 ERROR keystone self.__connect()\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 914, in __connect\n2026-01-06 00:59:43.921 1081 ERROR keystone )._exec_w_sync_on_first_run(self.dbapi_connection, self)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/event/attr.py\", line 483, in _exec_w_sync_on_first_run\n2026-01-06 00:59:43.921 1081 ERROR keystone self(*args, **kw)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/event/attr.py\", line 497, in __call__\n2026-01-06 00:59:43.921 1081 ERROR keystone fn(*args, **kw)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/util/langhelpers.py\", line 1916, in go\n2026-01-06 00:59:43.921 1081 ERROR keystone return once_fn(*arg, **kw)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File 
\"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/create.py\", line 752, in first_connect\n2026-01-06 00:59:43.921 1081 ERROR keystone dialect.initialize(c)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/dialects/mysql/base.py\", line 2898, in initialize\n2026-01-06 00:59:43.921 1081 ERROR keystone default.DefaultDialect.initialize(self, connection)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/default.py\", line 533, in initialize\n2026-01-06 00:59:43.921 1081 ERROR keystone self.default_isolation_level = self.get_default_isolation_level(\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/default.py\", line 584, in get_default_isolation_level\n2026-01-06 00:59:43.921 1081 ERROR keystone return self.get_isolation_level(dbapi_conn)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/dialects/mysql/base.py\", line 2603, in get_isolation_level\n2026-01-06 00:59:43.921 1081 ERROR keystone cursor.execute(\"SELECT @@transaction_isolation\")\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/cursors.py\", line 153, in execute\n2026-01-06 00:59:43.921 1081 ERROR keystone result = self._query(query)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/cursors.py\", line 322, in _query\n2026-01-06 00:59:43.921 1081 ERROR keystone conn.query(q)\n2026-01-06 00:59:43.921 1081 ERROR keystone File 
\"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 563, in query\n2026-01-06 00:59:43.921 1081 ERROR keystone self._affected_rows = self._read_query_result(unbuffered=unbuffered)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 825, in _read_query_result\n2026-01-06 00:59:43.921 1081 ERROR keystone result.read()\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 1199, in read\n2026-01-06 00:59:43.921 1081 ERROR keystone first_packet = self.connection._read_packet()\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 775, in _read_packet\n2026-01-06 00:59:43.921 1081 ERROR keystone packet.raise_for_error()\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/protocol.py\", line 219, in raise_for_error\n2026-01-06 00:59:43.921 1081 ERROR keystone err.raise_mysql_exception(self._data)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/err.py\", line 150, in raise_mysql_exception\n2026-01-06 00:59:43.921 1081 ERROR keystone raise errorclass(errno, errval)\n2026-01-06 00:59:43.921 1081 ERROR keystone pymysql.err.OperationalError: (1193, \"Unknown system variable 'transaction_isolation'\")\n2026-01-06 00:59:43.921 1081 ERROR keystone \n2026-01-06 00:59:43.921 1081 ERROR keystone The above exception was the direct cause of the following exception:\n2026-01-06 00:59:43.921 1081 ERROR keystone \n2026-01-06 00:59:43.921 1081 ERROR keystone Traceback (most recent call last):\n2026-01-06 00:59:43.921 1081 ERROR keystone File 
\"/var/lib/kolla/venv/bin/keystone-manage\", line 7, in \n2026-01-06 00:59:43.921 1081 ERROR keystone sys.exit(main())\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/keystone/cmd/manage.py\", line 36, in main\n2026-01-06 00:59:43.921 1081 ERROR keystone cli.main(argv=sys.argv, developer_config_file=developer_config)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/keystone/cmd/cli.py\", line 1727, in main\n2026-01-06 00:59:43.921 1081 ERROR keystone CONF.command.cmd_class.main()\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/keystone/cmd/cli.py\", line 492, in main\n2026-01-06 00:59:43.921 1081 ERROR keystone upgrades.offline_sync_database_to_version(CONF.command.version)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/keystone/common/sql/upgrades.py\", line 321, in offline_sync_database_to_version\n2026-01-06 00:59:43.921 1081 ERROR keystone _db_sync(engine=engine)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/keystone/common/sql/upgrades.py\", line 210, in _db_sync\n2026-01-06 00:59:43.921 1081 ERROR keystone with sql.session_for_write() as session:\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/usr/lib/python3.12/contextlib.py\", line 137, in __enter__\n2026-01-06 00:59:43.921 1081 ERROR keystone return next(self.gen)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/enginefacade.py\", line 1199, in _transaction_scope\n2026-01-06 00:59:43.921 1081 ERROR keystone with current._produce_block(\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/usr/lib/python3.12/contextlib.py\", line 137, in __enter__\n2026-01-06 
00:59:43.921 1081 ERROR keystone return next(self.gen)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/enginefacade.py\", line 841, in _session\n2026-01-06 00:59:43.921 1081 ERROR keystone self.session = self.factory._create_session(\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/enginefacade.py\", line 459, in _create_session\n2026-01-06 00:59:43.921 1081 ERROR keystone self._start()\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/enginefacade.py\", line 530, in _start\n2026-01-06 00:59:43.921 1081 ERROR keystone self._setup_for_connection(\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/enginefacade.py\", line 647, in _setup_for_connection\n2026-01-06 00:59:43.921 1081 ERROR keystone engine = engines.create_engine(\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/debtcollector/renames.py\", line 41, in decorator\n2026-01-06 00:59:43.921 1081 ERROR keystone return wrapped(*args, **kwargs)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/engines.py\", line 271, in create_engine\n2026-01-06 00:59:43.921 1081 ERROR keystone _test_connection(engine_event_target, max_retries, retry_interval)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/engines.py\", line 169, in _test_connection\n2026-01-06 00:59:43.921 1081 ERROR keystone 
conn = engine.connect()\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 3274, in connect\n2026-01-06 00:59:43.921 1081 ERROR keystone return self._connection_cls(self)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 148, in __init__\n2026-01-06 00:59:43.921 1081 ERROR keystone Connection._handle_dbapi_exception_noconnection(\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 2436, in _handle_dbapi_exception_noconnection\n2026-01-06 00:59:43.921 1081 ERROR keystone raise newraise.with_traceback(exc_info[2]) from e\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 146, in __init__\n2026-01-06 00:59:43.921 1081 ERROR keystone self._dbapi_connection = engine.raw_connection()\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 3298, in raw_connection\n2026-01-06 00:59:43.921 1081 ERROR keystone return self.pool.connect()\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 449, in connect\n2026-01-06 00:59:43.921 1081 ERROR keystone return _ConnectionFairy._checkout(self)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 1263, in _checkout\n2026-01-06 
00:59:43.921 1081 ERROR keystone fairy = _ConnectionRecord.checkout(pool)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 712, in checkout\n2026-01-06 00:59:43.921 1081 ERROR keystone rec = pool._do_get()\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/impl.py\", line 179, in _do_get\n2026-01-06 00:59:43.921 1081 ERROR keystone with util.safe_reraise():\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/util/langhelpers.py\", line 146, in __exit__\n2026-01-06 00:59:43.921 1081 ERROR keystone raise exc_value.with_traceback(exc_tb)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/impl.py\", line 177, in _do_get\n2026-01-06 00:59:43.921 1081 ERROR keystone return self._create_connection()\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 390, in _create_connection\n2026-01-06 00:59:43.921 1081 ERROR keystone return _ConnectionRecord(self)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 674, in __init__\n2026-01-06 00:59:43.921 1081 ERROR keystone self.__connect()\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 914, in __connect\n2026-01-06 00:59:43.921 1081 ERROR keystone )._exec_w_sync_on_first_run(self.dbapi_connection, self)\n2026-01-06 00:59:43.921 1081 ERROR 
keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/event/attr.py\", line 483, in _exec_w_sync_on_first_run\n2026-01-06 00:59:43.921 1081 ERROR keystone self(*args, **kw)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/event/attr.py\", line 497, in __call__\n2026-01-06 00:59:43.921 1081 ERROR keystone fn(*args, **kw)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/util/langhelpers.py\", line 1916, in go\n2026-01-06 00:59:43.921 1081 ERROR keystone return once_fn(*arg, **kw)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/create.py\", line 752, in first_connect\n2026-01-06 00:59:43.921 1081 ERROR keystone dialect.initialize(c)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/dialects/mysql/base.py\", line 2898, in initialize\n2026-01-06 00:59:43.921 1081 ERROR keystone default.DefaultDialect.initialize(self, connection)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/default.py\", line 533, in initialize\n2026-01-06 00:59:43.921 1081 ERROR keystone self.default_isolation_level = self.get_default_isolation_level(\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/default.py\", line 584, in get_default_isolation_level\n2026-01-06 00:59:43.921 1081 ERROR keystone return self.get_isolation_level(dbapi_conn)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 
1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/dialects/mysql/base.py\", line 2603, in get_isolation_level\n2026-01-06 00:59:43.921 1081 ERROR keystone cursor.execute(\"SELECT @@transaction_isolation\")\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/cursors.py\", line 153, in execute\n2026-01-06 00:59:43.921 1081 ERROR keystone result = self._query(query)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/cursors.py\", line 322, in _query\n2026-01-06 00:59:43.921 1081 ERROR keystone conn.query(q)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 563, in query\n2026-01-06 00:59:43.921 1081 ERROR keystone self._affected_rows = self._read_query_result(unbuffered=unbuffered)\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 825, in _read_query_result\n2026-01-06 00:59:43.921 1081 ERROR keystone result.read()\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 1199, in read\n2026-01-06 00:59:43.921 1081 ERROR keystone first_packet = self.connection._read_packet()\n2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 775, in _read_packet\n2026-01-06 00:59:43.921 1081 ERROR keystone packet.raise_for_error()\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/protocol.py\", line 219, in raise_for_error\n2026-01-06 
00:59:43.921 1081 ERROR keystone err.raise_mysql_exception(self._data)\n2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/err.py\", line 150, in raise_mysql_exception\n2026-01-06 00:59:43.921 1081 ERROR keystone raise errorclass(errno, errval)\n2026-01-06 00:59:43.921 1081 ERROR keystone sqlalchemy.exc.OperationalError: (pymysql.err.OperationalError) (1193, \"Unknown system variable 'transaction_isolation'\")\n2026-01-06 00:59:43.921 1081 ERROR keystone (Background on this error at: https://sqlalche.me/e/20/e3q8)\n2026-01-06 00:59:43.921 1081 ERROR keystone \n", "stderr_lines": ["+ sudo -E kolla_set_configs", "2026-01-06 00:59:34.847 INFO Loading config file at /var/lib/kolla/config_files/config.json", "2026-01-06 00:59:34.847 INFO Validating config file", "2026-01-06 00:59:34.847 INFO Kolla config strategy set to: COPY_ALWAYS", "2026-01-06 00:59:34.853 INFO Copying service configuration files", "2026-01-06 00:59:34.853 INFO Copying /var/lib/kolla/config_files/keystone-startup.sh to /usr/bin/keystone-startup.sh", "2026-01-06 00:59:34.861 INFO Setting permission for /usr/bin/keystone-startup.sh", "2026-01-06 00:59:34.861 INFO Copying /var/lib/kolla/config_files/keystone.conf to /etc/keystone/keystone.conf", "2026-01-06 00:59:34.861 INFO Setting permission for /etc/keystone/keystone.conf", "2026-01-06 00:59:34.861 INFO Copying /var/lib/kolla/config_files/wsgi-keystone.conf to /etc/apache2/conf-enabled/wsgi-keystone.conf", "2026-01-06 00:59:34.871 INFO Setting permission for /etc/apache2/conf-enabled/wsgi-keystone.conf", "2026-01-06 00:59:34.872 INFO Creating directory /var/lib/kolla/share/ca-certificates", "2026-01-06 00:59:34.872 INFO Setting permission for /var/lib/kolla/share/ca-certificates", "2026-01-06 00:59:34.873 INFO Copying /var/lib/kolla/config_files/ca-certificates/testbed.crt to /var/lib/kolla/share/ca-certificates/testbed.crt", "2026-01-06 00:59:34.873 INFO Setting permission for 
/var/lib/kolla/share/ca-certificates/testbed.crt", "2026-01-06 00:59:34.874 INFO Writing out command to execute", "2026-01-06 00:59:34.874 INFO Setting permission for /var/log/kolla", "2026-01-06 00:59:34.875 INFO Setting permission for /etc/keystone/fernet-keys", "++ cat /run_command", "+ CMD=/usr/bin/keystone-startup.sh", "+ ARGS=", "+ sudo kolla_copy_cacerts", "rehash: warning: skipping ca-certificates.crt,it does not contain exactly one certificate or CRL", "+ sudo kolla_install_projects", "+ [[ ! -n '' ]]", "+ . kolla_extend_start", "++ KEYSTONE_LOG_DIR=/var/log/kolla/keystone", "++ [[ ! -d /var/log/kolla/keystone ]]", "++ mkdir -p /var/log/kolla/keystone", "+++ stat -c %U:%G /var/log/kolla/keystone", "++ [[ root:kolla != \\k\\e\\y\\s\\t\\o\\n\\e\\:\\k\\o\\l\\l\\a ]]", "++ chown keystone:kolla /var/log/kolla/keystone", "++ '[' '!' -f /var/log/kolla/keystone/keystone.log ']'", "++ touch /var/log/kolla/keystone/keystone.log", "+++ stat -c %U:%G /var/log/kolla/keystone/keystone.log", "++ [[ root:kolla != \\k\\e\\y\\s\\t\\o\\n\\e\\:\\k\\e\\y\\s\\t\\o\\n\\e ]]", "++ chown keystone:keystone /var/log/kolla/keystone/keystone.log", "+++ stat -c %a /var/log/kolla/keystone", "++ [[ 2755 != \\7\\5\\5 ]]", "++ chmod 755 /var/log/kolla/keystone", "++ EXTRA_KEYSTONE_MANAGE_ARGS=", "++ [[ -n '' ]]", "++ [[ -n '' ]]", "++ [[ -n 0 ]]", "++ sudo -H -u keystone keystone-manage db_sync", "2026-01-06 00:59:43.915 1081 DEBUG oslo_db.sqlalchemy.engines [-] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/engines.py:397", "2026-01-06 00:59:43.921 1081 CRITICAL keystone [-] Unhandled error: sqlalchemy.exc.OperationalError: (pymysql.err.OperationalError) (1193, \"Unknown system variable 'transaction_isolation'\")", "(Background on this error at: 
https://sqlalche.me/e/20/e3q8)", "2026-01-06 00:59:43.921 1081 ERROR keystone Traceback (most recent call last):", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 146, in __init__", "2026-01-06 00:59:43.921 1081 ERROR keystone self._dbapi_connection = engine.raw_connection()", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 3298, in raw_connection", "2026-01-06 00:59:43.921 1081 ERROR keystone return self.pool.connect()", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 449, in connect", "2026-01-06 00:59:43.921 1081 ERROR keystone return _ConnectionFairy._checkout(self)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 1263, in _checkout", "2026-01-06 00:59:43.921 1081 ERROR keystone fairy = _ConnectionRecord.checkout(pool)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 712, in checkout", "2026-01-06 00:59:43.921 1081 ERROR keystone rec = pool._do_get()", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/impl.py\", line 179, in _do_get", "2026-01-06 00:59:43.921 1081 ERROR keystone with util.safe_reraise():", "2026-01-06 00:59:43.921 1081 ERROR keystone File 
\"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/util/langhelpers.py\", line 146, in __exit__", "2026-01-06 00:59:43.921 1081 ERROR keystone raise exc_value.with_traceback(exc_tb)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/impl.py\", line 177, in _do_get", "2026-01-06 00:59:43.921 1081 ERROR keystone return self._create_connection()", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 390, in _create_connection", "2026-01-06 00:59:43.921 1081 ERROR keystone return _ConnectionRecord(self)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 674, in __init__", "2026-01-06 00:59:43.921 1081 ERROR keystone self.__connect()", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 914, in __connect", "2026-01-06 00:59:43.921 1081 ERROR keystone )._exec_w_sync_on_first_run(self.dbapi_connection, self)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/event/attr.py\", line 483, in _exec_w_sync_on_first_run", "2026-01-06 00:59:43.921 1081 ERROR keystone self(*args, **kw)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/event/attr.py\", line 497, in __call__", "2026-01-06 00:59:43.921 1081 ERROR keystone fn(*args, **kw)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/util/langhelpers.py\", line 1916, in go", 
"2026-01-06 00:59:43.921 1081 ERROR keystone return once_fn(*arg, **kw)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/create.py\", line 752, in first_connect", "2026-01-06 00:59:43.921 1081 ERROR keystone dialect.initialize(c)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/dialects/mysql/base.py\", line 2898, in initialize", "2026-01-06 00:59:43.921 1081 ERROR keystone default.DefaultDialect.initialize(self, connection)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/default.py\", line 533, in initialize", "2026-01-06 00:59:43.921 1081 ERROR keystone self.default_isolation_level = self.get_default_isolation_level(", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/default.py\", line 584, in get_default_isolation_level", "2026-01-06 00:59:43.921 1081 ERROR keystone return self.get_isolation_level(dbapi_conn)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/dialects/mysql/base.py\", line 2603, in get_isolation_level", "2026-01-06 00:59:43.921 1081 ERROR keystone cursor.execute(\"SELECT @@transaction_isolation\")", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/cursors.py\", line 153, in execute", "2026-01-06 00:59:43.921 1081 ERROR keystone result = self._query(query)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File 
\"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/cursors.py\", line 322, in _query", "2026-01-06 00:59:43.921 1081 ERROR keystone conn.query(q)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 563, in query", "2026-01-06 00:59:43.921 1081 ERROR keystone self._affected_rows = self._read_query_result(unbuffered=unbuffered)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 825, in _read_query_result", "2026-01-06 00:59:43.921 1081 ERROR keystone result.read()", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 1199, in read", "2026-01-06 00:59:43.921 1081 ERROR keystone first_packet = self.connection._read_packet()", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 775, in _read_packet", "2026-01-06 00:59:43.921 1081 ERROR keystone packet.raise_for_error()", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/protocol.py\", line 219, in raise_for_error", "2026-01-06 00:59:43.921 1081 ERROR keystone err.raise_mysql_exception(self._data)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/err.py\", line 150, in raise_mysql_exception", "2026-01-06 00:59:43.921 1081 ERROR keystone raise errorclass(errno, errval)", "2026-01-06 00:59:43.921 1081 ERROR keystone pymysql.err.OperationalError: (1193, \"Unknown system variable 'transaction_isolation'\")", "2026-01-06 00:59:43.921 1081 ERROR keystone ", "2026-01-06 00:59:43.921 1081 ERROR keystone The above 
exception was the direct cause of the following exception:", "2026-01-06 00:59:43.921 1081 ERROR keystone ", "2026-01-06 00:59:43.921 1081 ERROR keystone Traceback (most recent call last):", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/bin/keystone-manage\", line 7, in ", "2026-01-06 00:59:43.921 1081 ERROR keystone sys.exit(main())", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/keystone/cmd/manage.py\", line 36, in main", "2026-01-06 00:59:43.921 1081 ERROR keystone cli.main(argv=sys.argv, developer_config_file=developer_config)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/keystone/cmd/cli.py\", line 1727, in main", "2026-01-06 00:59:43.921 1081 ERROR keystone CONF.command.cmd_class.main()", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/keystone/cmd/cli.py\", line 492, in main", "2026-01-06 00:59:43.921 1081 ERROR keystone upgrades.offline_sync_database_to_version(CONF.command.version)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/keystone/common/sql/upgrades.py\", line 321, in offline_sync_database_to_version", "2026-01-06 00:59:43.921 1081 ERROR keystone _db_sync(engine=engine)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/keystone/common/sql/upgrades.py\", line 210, in _db_sync", "2026-01-06 00:59:43.921 1081 ERROR keystone with sql.session_for_write() as session:", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/usr/lib/python3.12/contextlib.py\", line 137, in __enter__", "2026-01-06 00:59:43.921 1081 ERROR keystone return next(self.gen)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File 
\"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/enginefacade.py\", line 1199, in _transaction_scope", "2026-01-06 00:59:43.921 1081 ERROR keystone with current._produce_block(", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/usr/lib/python3.12/contextlib.py\", line 137, in __enter__", "2026-01-06 00:59:43.921 1081 ERROR keystone return next(self.gen)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/enginefacade.py\", line 841, in _session", "2026-01-06 00:59:43.921 1081 ERROR keystone self.session = self.factory._create_session(", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/enginefacade.py\", line 459, in _create_session", "2026-01-06 00:59:43.921 1081 ERROR keystone self._start()", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/enginefacade.py\", line 530, in _start", "2026-01-06 00:59:43.921 1081 ERROR keystone self._setup_for_connection(", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/enginefacade.py\", line 647, in _setup_for_connection", "2026-01-06 00:59:43.921 1081 ERROR keystone engine = engines.create_engine(", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/debtcollector/renames.py\", line 41, in decorator", "2026-01-06 00:59:43.921 1081 ERROR keystone return wrapped(*args, **kwargs)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File 
\"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/engines.py\", line 271, in create_engine", "2026-01-06 00:59:43.921 1081 ERROR keystone _test_connection(engine_event_target, max_retries, retry_interval)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/oslo_db/sqlalchemy/engines.py\", line 169, in _test_connection", "2026-01-06 00:59:43.921 1081 ERROR keystone conn = engine.connect()", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 3274, in connect", "2026-01-06 00:59:43.921 1081 ERROR keystone return self._connection_cls(self)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 148, in __init__", "2026-01-06 00:59:43.921 1081 ERROR keystone Connection._handle_dbapi_exception_noconnection(", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 2436, in _handle_dbapi_exception_noconnection", "2026-01-06 00:59:43.921 1081 ERROR keystone raise newraise.with_traceback(exc_info[2]) from e", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 146, in __init__", "2026-01-06 00:59:43.921 1081 ERROR keystone self._dbapi_connection = engine.raw_connection()", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/base.py\", line 3298, in raw_connection", "2026-01-06 00:59:43.921 1081 ERROR keystone return self.pool.connect()", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^", 
"2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 449, in connect", "2026-01-06 00:59:43.921 1081 ERROR keystone return _ConnectionFairy._checkout(self)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 1263, in _checkout", "2026-01-06 00:59:43.921 1081 ERROR keystone fairy = _ConnectionRecord.checkout(pool)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 712, in checkout", "2026-01-06 00:59:43.921 1081 ERROR keystone rec = pool._do_get()", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/impl.py\", line 179, in _do_get", "2026-01-06 00:59:43.921 1081 ERROR keystone with util.safe_reraise():", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/util/langhelpers.py\", line 146, in __exit__", "2026-01-06 00:59:43.921 1081 ERROR keystone raise exc_value.with_traceback(exc_tb)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/impl.py\", line 177, in _do_get", "2026-01-06 00:59:43.921 1081 ERROR keystone return self._create_connection()", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 390, in _create_connection", "2026-01-06 00:59:43.921 1081 ERROR keystone return _ConnectionRecord(self)", "2026-01-06 00:59:43.921 1081 ERROR keystone 
^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 674, in __init__", "2026-01-06 00:59:43.921 1081 ERROR keystone self.__connect()", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/pool/base.py\", line 914, in __connect", "2026-01-06 00:59:43.921 1081 ERROR keystone )._exec_w_sync_on_first_run(self.dbapi_connection, self)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/event/attr.py\", line 483, in _exec_w_sync_on_first_run", "2026-01-06 00:59:43.921 1081 ERROR keystone self(*args, **kw)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/event/attr.py\", line 497, in __call__", "2026-01-06 00:59:43.921 1081 ERROR keystone fn(*args, **kw)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/util/langhelpers.py\", line 1916, in go", "2026-01-06 00:59:43.921 1081 ERROR keystone return once_fn(*arg, **kw)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/create.py\", line 752, in first_connect", "2026-01-06 00:59:43.921 1081 ERROR keystone dialect.initialize(c)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/dialects/mysql/base.py\", line 2898, in initialize", "2026-01-06 00:59:43.921 1081 ERROR keystone default.DefaultDialect.initialize(self, connection)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/default.py\", line 533, in initialize", 
"2026-01-06 00:59:43.921 1081 ERROR keystone self.default_isolation_level = self.get_default_isolation_level(", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/engine/default.py\", line 584, in get_default_isolation_level", "2026-01-06 00:59:43.921 1081 ERROR keystone return self.get_isolation_level(dbapi_conn)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/sqlalchemy/dialects/mysql/base.py\", line 2603, in get_isolation_level", "2026-01-06 00:59:43.921 1081 ERROR keystone cursor.execute(\"SELECT @@transaction_isolation\")", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/cursors.py\", line 153, in execute", "2026-01-06 00:59:43.921 1081 ERROR keystone result = self._query(query)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/cursors.py\", line 322, in _query", "2026-01-06 00:59:43.921 1081 ERROR keystone conn.query(q)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 563, in query", "2026-01-06 00:59:43.921 1081 ERROR keystone self._affected_rows = self._read_query_result(unbuffered=unbuffered)", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 825, in _read_query_result", "2026-01-06 00:59:43.921 1081 ERROR keystone result.read()", "2026-01-06 00:59:43.921 1081 ERROR keystone File 
\"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 1199, in read", "2026-01-06 00:59:43.921 1081 ERROR keystone first_packet = self.connection._read_packet()", "2026-01-06 00:59:43.921 1081 ERROR keystone ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/connections.py\", line 775, in _read_packet", "2026-01-06 00:59:43.921 1081 ERROR keystone packet.raise_for_error()", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/protocol.py\", line 219, in raise_for_error", "2026-01-06 00:59:43.921 1081 ERROR keystone err.raise_mysql_exception(self._data)", "2026-01-06 00:59:43.921 1081 ERROR keystone File \"/var/lib/kolla/venv/lib/python3.12/site-packages/pymysql/err.py\", line 150, in raise_mysql_exception", "2026-01-06 00:59:43.921 1081 ERROR keystone raise errorclass(errno, errval)", "2026-01-06 00:59:43.921 1081 ERROR keystone sqlalchemy.exc.OperationalError: (pymysql.err.OperationalError) (1193, \"Unknown system variable 'transaction_isolation'\")", "2026-01-06 00:59:43.921 1081 ERROR keystone (Background on this error at: https://sqlalche.me/e/20/e3q8)", "2026-01-06 00:59:43.921 1081 ERROR keystone "], "stdout": "Updating certificates in /etc/ssl/certs...\n1 added, 0 removed; done.\nRunning hooks in /etc/ca-certificates/update.d...\ndone.\n", "stdout_lines": ["Updating certificates in /etc/ssl/certs...", "1 added, 0 removed; done.", "Running hooks in /etc/ca-certificates/update.d...", "done."]} 2026-01-06 00:59:46.195653 | orchestrator | 2026-01-06 00:59:46.195664 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 00:59:46.195676 | orchestrator | testbed-node-0 : ok=22  changed=12  unreachable=0 failed=1  skipped=13  rescued=0 ignored=0 2026-01-06 00:59:46.195689 | orchestrator | testbed-node-1 : ok=18  changed=10  unreachable=0 
failed=0 skipped=12  rescued=0 ignored=0 2026-01-06 00:59:46.195701 | orchestrator | testbed-node-2 : ok=18  changed=10  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0 2026-01-06 00:59:46.195712 | orchestrator | 2026-01-06 00:59:46.195723 | orchestrator | 2026-01-06 00:59:46.195734 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 00:59:46.195745 | orchestrator | Tuesday 06 January 2026 00:59:44 +0000 (0:00:11.644) 0:01:02.753 ******* 2026-01-06 00:59:46.195755 | orchestrator | =============================================================================== 2026-01-06 00:59:46.195766 | orchestrator | keystone : Running Keystone bootstrap container ------------------------ 11.65s 2026-01-06 00:59:46.195777 | orchestrator | keystone : Copying files for keystone-fernet ---------------------------- 9.53s 2026-01-06 00:59:46.195788 | orchestrator | keystone : Copying over keystone.conf ----------------------------------- 5.76s 2026-01-06 00:59:46.195806 | orchestrator | service-cert-copy : keystone | Copying over extra CA certificates ------- 3.70s 2026-01-06 00:59:46.195817 | orchestrator | keystone : Copying over config.json files for services ------------------ 3.37s 2026-01-06 00:59:46.195828 | orchestrator | keystone : Copying files for keystone-ssh ------------------------------- 2.78s 2026-01-06 00:59:46.195838 | orchestrator | service-check-containers : keystone | Check containers ------------------ 2.47s 2026-01-06 00:59:46.195849 | orchestrator | keystone : Creating keystone database ----------------------------------- 2.22s 2026-01-06 00:59:46.195860 | orchestrator | keystone : Creating Keystone database user and setting permissions ------ 2.14s 2026-01-06 00:59:46.195871 | orchestrator | keystone : Ensuring config directories exist ---------------------------- 2.00s 2026-01-06 00:59:46.195882 | orchestrator | keystone : Copying keystone-startup script for keystone ----------------- 1.52s 
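Editor's note on the failure above: the bootstrap container died in `keystone-manage db_sync` with MySQL error 1193, because SQLAlchemy issued `SELECT @@transaction_isolation` against a server that only understands the older `@@tx_isolation` name. MySQL introduced `transaction_isolation` in 5.7.20 and dropped `tx_isolation` in 8.0, while MariaDB kept only `tx_isolation` for much longer; a client therefore picks the variable name from the server version it sees, and a proxy or compatibility setting that advertises a mismatched version can make it pick the wrong one. The sketch below illustrates that version gate; the function name and the exact cutoffs for the MariaDB branch are illustrative assumptions, not SQLAlchemy's actual internals.

```python
# Illustrative sketch (not SQLAlchemy source): choose which isolation-level
# system variable a MySQL-protocol server is expected to support, based on
# the advertised server version. A mismatch between the advertised version
# and the real backend yields error 1193, as seen in the traceback above.

def isolation_variable(server_version: tuple[int, ...], is_mariadb: bool) -> str:
    """Return the system variable name expected to exist on this server.

    MySQL >= 5.7.20 supports @@transaction_isolation (and 8.0 removed
    @@tx_isolation); older MySQL and classic MariaDB only know
    @@tx_isolation. The MariaDB cutoff here is an assumption for
    illustration.
    """
    if not is_mariadb and server_version >= (5, 7, 20):
        return "transaction_isolation"
    return "tx_isolation"


# A server advertising MySQL 8.0 is queried with the new name; if the real
# backend is an older MariaDB, that query fails with error 1193.
print(isolation_variable((8, 0, 36), is_mariadb=False))   # transaction_isolation
print(isolation_variable((10, 6, 0), is_mariadb=True))    # tx_isolation
```

Under this reading, the fix is on the database side (make the advertised server version match what the backend actually supports), not in Keystone itself.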
2026-01-06 00:59:46.195892 | orchestrator | keystone : Copying over wsgi-keystone.conf ------------------------------ 1.43s 2026-01-06 00:59:46.195903 | orchestrator | keystone : Generate the required cron jobs for the node ----------------- 1.29s 2026-01-06 00:59:46.195914 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.04s 2026-01-06 00:59:46.195925 | orchestrator | keystone : Checking whether keystone-paste.ini file exists -------------- 1.00s 2026-01-06 00:59:46.195936 | orchestrator | keystone : Checking for any running keystone_fernet containers ---------- 0.96s 2026-01-06 00:59:46.195947 | orchestrator | keystone : Copying over keystone-paste.ini ------------------------------ 0.95s 2026-01-06 00:59:46.195957 | orchestrator | keystone : Check if Keystone domain-specific config is supplied --------- 0.88s 2026-01-06 00:59:46.195968 | orchestrator | service-cert-copy : keystone | Copying over backend internal TLS key ---- 0.83s 2026-01-06 00:59:46.195979 | orchestrator | service-cert-copy : keystone | Copying over backend internal TLS certificate --- 0.64s 2026-01-06 00:59:46.195990 | orchestrator | 2026-01-06 00:59:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:49.211653 | orchestrator | 2026-01-06 00:59:49 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 00:59:49.215203 | orchestrator | 2026-01-06 00:59:49 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 00:59:49.215267 | orchestrator | 2026-01-06 00:59:49 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state STARTED 2026-01-06 00:59:49.216011 | orchestrator | 2026-01-06 00:59:49 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 00:59:49.217019 | orchestrator | 2026-01-06 00:59:49 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:49.217075 | orchestrator | 2026-01-06 00:59:49 | INFO  | Wait 1 
second(s) until the next check 2026-01-06 00:59:52.252483 | orchestrator | 2026-01-06 00:59:52 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 00:59:52.252749 | orchestrator | 2026-01-06 00:59:52 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 00:59:52.254822 | orchestrator | 2026-01-06 00:59:52 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state STARTED 2026-01-06 00:59:52.255796 | orchestrator | 2026-01-06 00:59:52 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 00:59:52.257592 | orchestrator | 2026-01-06 00:59:52 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:52.257650 | orchestrator | 2026-01-06 00:59:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:55.312223 | orchestrator | 2026-01-06 00:59:55 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 00:59:55.313493 | orchestrator | 2026-01-06 00:59:55 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 00:59:55.316431 | orchestrator | 2026-01-06 00:59:55 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 00:59:55.318774 | orchestrator | 2026-01-06 00:59:55 | INFO  | Task 918c466e-673e-4cba-b643-60211f00e649 is in state SUCCESS 2026-01-06 00:59:55.320538 | orchestrator | 2026-01-06 00:59:55 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 00:59:55.322177 | orchestrator | 2026-01-06 00:59:55 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:55.322525 | orchestrator | 2026-01-06 00:59:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 00:59:58.382521 | orchestrator | 2026-01-06 00:59:58 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 00:59:58.383831 | orchestrator | 2026-01-06 00:59:58 | INFO  | Task 
ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 00:59:58.385500 | orchestrator | 2026-01-06 00:59:58 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 00:59:58.387142 | orchestrator | 2026-01-06 00:59:58 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 00:59:58.388392 | orchestrator | 2026-01-06 00:59:58 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 00:59:58.388433 | orchestrator | 2026-01-06 00:59:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:01.438835 | orchestrator | 2026-01-06 01:00:01 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:01.439066 | orchestrator | 2026-01-06 01:00:01 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:01.440706 | orchestrator | 2026-01-06 01:00:01 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:01.440753 | orchestrator | 2026-01-06 01:00:01 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:01.441615 | orchestrator | 2026-01-06 01:00:01 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:01.441629 | orchestrator | 2026-01-06 01:00:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:04.492700 | orchestrator | 2026-01-06 01:00:04 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:04.492902 | orchestrator | 2026-01-06 01:00:04 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:04.493997 | orchestrator | 2026-01-06 01:00:04 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:04.494985 | orchestrator | 2026-01-06 01:00:04 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:04.495804 | orchestrator | 2026-01-06 01:00:04 | INFO  | Task 
1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:04.495832 | orchestrator | 2026-01-06 01:00:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:07.536142 | orchestrator | 2026-01-06 01:00:07 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:07.536617 | orchestrator | 2026-01-06 01:00:07 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:07.537866 | orchestrator | 2026-01-06 01:00:07 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:07.538779 | orchestrator | 2026-01-06 01:00:07 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:07.539947 | orchestrator | 2026-01-06 01:00:07 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:07.540006 | orchestrator | 2026-01-06 01:00:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:10.587937 | orchestrator | 2026-01-06 01:00:10 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:10.589638 | orchestrator | 2026-01-06 01:00:10 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:10.591502 | orchestrator | 2026-01-06 01:00:10 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:10.592304 | orchestrator | 2026-01-06 01:00:10 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:10.593960 | orchestrator | 2026-01-06 01:00:10 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:10.594066 | orchestrator | 2026-01-06 01:00:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:13.641371 | orchestrator | 2026-01-06 01:00:13 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:13.642808 | orchestrator | 2026-01-06 01:00:13 | INFO  | Task 
ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:13.644518 | orchestrator | 2026-01-06 01:00:13 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:13.645887 | orchestrator | 2026-01-06 01:00:13 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:13.647004 | orchestrator | 2026-01-06 01:00:13 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:13.647042 | orchestrator | 2026-01-06 01:00:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:16.694131 | orchestrator | 2026-01-06 01:00:16 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:16.694567 | orchestrator | 2026-01-06 01:00:16 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:16.696645 | orchestrator | 2026-01-06 01:00:16 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:16.698072 | orchestrator | 2026-01-06 01:00:16 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:16.700187 | orchestrator | 2026-01-06 01:00:16 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:16.700221 | orchestrator | 2026-01-06 01:00:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:19.751267 | orchestrator | 2026-01-06 01:00:19 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:19.754241 | orchestrator | 2026-01-06 01:00:19 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:19.754308 | orchestrator | 2026-01-06 01:00:19 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:19.754340 | orchestrator | 2026-01-06 01:00:19 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:19.755571 | orchestrator | 2026-01-06 01:00:19 | INFO  | Task 
1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:19.755594 | orchestrator | 2026-01-06 01:00:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:22.797546 | orchestrator | 2026-01-06 01:00:22 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:22.798373 | orchestrator | 2026-01-06 01:00:22 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:22.799837 | orchestrator | 2026-01-06 01:00:22 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:22.801903 | orchestrator | 2026-01-06 01:00:22 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:22.803070 | orchestrator | 2026-01-06 01:00:22 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:22.803246 | orchestrator | 2026-01-06 01:00:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:25.843053 | orchestrator | 2026-01-06 01:00:25 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:25.844036 | orchestrator | 2026-01-06 01:00:25 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:25.844568 | orchestrator | 2026-01-06 01:00:25 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:25.847147 | orchestrator | 2026-01-06 01:00:25 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:25.849015 | orchestrator | 2026-01-06 01:00:25 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:25.849061 | orchestrator | 2026-01-06 01:00:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:28.897498 | orchestrator | 2026-01-06 01:00:28 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:28.898610 | orchestrator | 2026-01-06 01:00:28 | INFO  | Task 
ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:28.900615 | orchestrator | 2026-01-06 01:00:28 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:28.902065 | orchestrator | 2026-01-06 01:00:28 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:28.903781 | orchestrator | 2026-01-06 01:00:28 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:28.903817 | orchestrator | 2026-01-06 01:00:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:31.951686 | orchestrator | 2026-01-06 01:00:31 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:31.953294 | orchestrator | 2026-01-06 01:00:31 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:31.955427 | orchestrator | 2026-01-06 01:00:31 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:31.957797 | orchestrator | 2026-01-06 01:00:31 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:31.960054 | orchestrator | 2026-01-06 01:00:31 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:31.960096 | orchestrator | 2026-01-06 01:00:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:35.009952 | orchestrator | 2026-01-06 01:00:35 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:35.010842 | orchestrator | 2026-01-06 01:00:35 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:35.012773 | orchestrator | 2026-01-06 01:00:35 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:35.013843 | orchestrator | 2026-01-06 01:00:35 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:35.015124 | orchestrator | 2026-01-06 01:00:35 | INFO  | Task 
1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:35.015174 | orchestrator | 2026-01-06 01:00:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:38.062410 | orchestrator | 2026-01-06 01:00:38 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:38.064136 | orchestrator | 2026-01-06 01:00:38 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:38.066915 | orchestrator | 2026-01-06 01:00:38 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:38.069608 | orchestrator | 2026-01-06 01:00:38 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:38.071762 | orchestrator | 2026-01-06 01:00:38 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state STARTED 2026-01-06 01:00:38.071836 | orchestrator | 2026-01-06 01:00:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:41.118639 | orchestrator | 2026-01-06 01:00:41 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED 2026-01-06 01:00:41.119145 | orchestrator | 2026-01-06 01:00:41 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED 2026-01-06 01:00:41.120396 | orchestrator | 2026-01-06 01:00:41 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED 2026-01-06 01:00:41.121369 | orchestrator | 2026-01-06 01:00:41 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:41.123109 | orchestrator | 2026-01-06 01:00:41 | INFO  | Task 1ffaf769-3f6e-43b0-98b2-4a3f68d02dd0 is in state SUCCESS 2026-01-06 01:00:41.126663 | orchestrator | 2026-01-06 01:00:41.126728 | orchestrator | 2026-01-06 01:00:41.126750 | orchestrator | PLAY [Copy ceph keys to the configuration repository] ************************** 2026-01-06 01:00:41.126767 | orchestrator | 2026-01-06 01:00:41.126778 | orchestrator | TASK [Check if ceph keys exist] 
************************************************ 2026-01-06 01:00:41.126791 | orchestrator | Tuesday 06 January 2026 00:59:19 +0000 (0:00:00.178) 0:00:00.178 ******* 2026-01-06 01:00:41.126802 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.admin.keyring) 2026-01-06 01:00:41.126837 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-01-06 01:00:41.126850 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-01-06 01:00:41.126861 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder-backup.keyring) 2026-01-06 01:00:41.126872 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-01-06 01:00:41.127471 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.nova.keyring) 2026-01-06 01:00:41.127503 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.glance.keyring) 2026-01-06 01:00:41.127520 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.gnocchi.keyring) 2026-01-06 01:00:41.127538 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.manila.keyring) 2026-01-06 01:00:41.127556 | orchestrator | 2026-01-06 01:00:41.127575 | orchestrator | TASK [Fetch all ceph keys] ***************************************************** 2026-01-06 01:00:41.127594 | orchestrator | Tuesday 06 January 2026 00:59:24 +0000 (0:00:04.488) 0:00:04.667 ******* 2026-01-06 01:00:41.127612 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.admin.keyring) 2026-01-06 01:00:41.127631 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-01-06 01:00:41.127649 | orchestrator | ok: 
[testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-01-06 01:00:41.127668 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder-backup.keyring) 2026-01-06 01:00:41.127685 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-01-06 01:00:41.127703 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.nova.keyring) 2026-01-06 01:00:41.127721 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.glance.keyring) 2026-01-06 01:00:41.127770 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.gnocchi.keyring) 2026-01-06 01:00:41.127789 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.manila.keyring) 2026-01-06 01:00:41.127807 | orchestrator | 2026-01-06 01:00:41.127825 | orchestrator | TASK [Create share directory] ************************************************** 2026-01-06 01:00:41.127842 | orchestrator | Tuesday 06 January 2026 00:59:28 +0000 (0:00:04.492) 0:00:09.160 ******* 2026-01-06 01:00:41.127860 | orchestrator | changed: [testbed-manager -> localhost] 2026-01-06 01:00:41.127879 | orchestrator | 2026-01-06 01:00:41.127897 | orchestrator | TASK [Write ceph keys to the share directory] ********************************** 2026-01-06 01:00:41.127915 | orchestrator | Tuesday 06 January 2026 00:59:30 +0000 (0:00:01.074) 0:00:10.235 ******* 2026-01-06 01:00:41.127933 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.admin.keyring) 2026-01-06 01:00:41.127951 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2026-01-06 01:00:41.127970 | orchestrator | ok: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2026-01-06 01:00:41.127988 | orchestrator | changed: [testbed-manager -> 
localhost] => (item=ceph.client.cinder-backup.keyring) 2026-01-06 01:00:41.128007 | orchestrator | ok: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2026-01-06 01:00:41.128024 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.nova.keyring) 2026-01-06 01:00:41.128044 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.glance.keyring) 2026-01-06 01:00:41.128062 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.gnocchi.keyring) 2026-01-06 01:00:41.128081 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.manila.keyring) 2026-01-06 01:00:41.128101 | orchestrator | 2026-01-06 01:00:41.128120 | orchestrator | TASK [Check if target directories exist] *************************************** 2026-01-06 01:00:41.128141 | orchestrator | Tuesday 06 January 2026 00:59:43 +0000 (0:00:13.513) 0:00:23.748 ******* 2026-01-06 01:00:41.128160 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/infrastructure/files/ceph) 2026-01-06 01:00:41.128178 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/cinder/cinder-volume) 2026-01-06 01:00:41.128197 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/cinder/cinder-backup) 2026-01-06 01:00:41.128215 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/cinder/cinder-backup) 2026-01-06 01:00:41.128255 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/nova) 2026-01-06 01:00:41.128275 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/nova) 2026-01-06 01:00:41.128346 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/glance) 2026-01-06 01:00:41.128367 | orchestrator | ok: [testbed-manager] => 
(item=/opt/configuration/environments/kolla/files/overlays/gnocchi) 2026-01-06 01:00:41.128402 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/manila) 2026-01-06 01:00:41.128421 | orchestrator | 2026-01-06 01:00:41.128439 | orchestrator | TASK [Write ceph keys to the configuration directory] ************************** 2026-01-06 01:00:41.128457 | orchestrator | Tuesday 06 January 2026 00:59:46 +0000 (0:00:02.740) 0:00:26.489 ******* 2026-01-06 01:00:41.128477 | orchestrator | changed: [testbed-manager] => (item=ceph.client.admin.keyring) 2026-01-06 01:00:41.128496 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2026-01-06 01:00:41.128514 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2026-01-06 01:00:41.128533 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder-backup.keyring) 2026-01-06 01:00:41.128545 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2026-01-06 01:00:41.128571 | orchestrator | changed: [testbed-manager] => (item=ceph.client.nova.keyring) 2026-01-06 01:00:41.128583 | orchestrator | changed: [testbed-manager] => (item=ceph.client.glance.keyring) 2026-01-06 01:00:41.128594 | orchestrator | changed: [testbed-manager] => (item=ceph.client.gnocchi.keyring) 2026-01-06 01:00:41.128604 | orchestrator | changed: [testbed-manager] => (item=ceph.client.manila.keyring) 2026-01-06 01:00:41.128615 | orchestrator | 2026-01-06 01:00:41.128714 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 01:00:41.128741 | orchestrator | testbed-manager : ok=6  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 01:00:41.128763 | orchestrator | 2026-01-06 01:00:41.128781 | orchestrator | 2026-01-06 01:00:41.128800 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 
01:00:41.128818 | orchestrator | Tuesday 06 January 2026 00:59:52 +0000 (0:00:06.000) 0:00:32.489 ******* 2026-01-06 01:00:41.128836 | orchestrator | =============================================================================== 2026-01-06 01:00:41.128855 | orchestrator | Write ceph keys to the share directory --------------------------------- 13.51s 2026-01-06 01:00:41.128874 | orchestrator | Write ceph keys to the configuration directory -------------------------- 6.00s 2026-01-06 01:00:41.128892 | orchestrator | Fetch all ceph keys ----------------------------------------------------- 4.49s 2026-01-06 01:00:41.128912 | orchestrator | Check if ceph keys exist ------------------------------------------------ 4.49s 2026-01-06 01:00:41.128929 | orchestrator | Check if target directories exist --------------------------------------- 2.74s 2026-01-06 01:00:41.128946 | orchestrator | Create share directory -------------------------------------------------- 1.07s 2026-01-06 01:00:41.128963 | orchestrator | 2026-01-06 01:00:41.128981 | orchestrator | 2026-01-06 01:00:41.129000 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-01-06 01:00:41.129018 | orchestrator | 2026-01-06 01:00:41.129037 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-01-06 01:00:41.129055 | orchestrator | Tuesday 06 January 2026 00:58:42 +0000 (0:00:00.274) 0:00:00.274 ******* 2026-01-06 01:00:41.129074 | orchestrator | ok: [testbed-node-0] 2026-01-06 01:00:41.129095 | orchestrator | ok: [testbed-node-1] 2026-01-06 01:00:41.129113 | orchestrator | ok: [testbed-node-2] 2026-01-06 01:00:41.129132 | orchestrator | 2026-01-06 01:00:41.129149 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-01-06 01:00:41.129166 | orchestrator | Tuesday 06 January 2026 00:58:42 +0000 (0:00:00.304) 0:00:00.578 ******* 2026-01-06 01:00:41.129186 | orchestrator | 
ok: [testbed-node-0] => (item=enable_horizon_True) 2026-01-06 01:00:41.129206 | orchestrator | ok: [testbed-node-1] => (item=enable_horizon_True) 2026-01-06 01:00:41.129225 | orchestrator | ok: [testbed-node-2] => (item=enable_horizon_True) 2026-01-06 01:00:41.129244 | orchestrator | 2026-01-06 01:00:41.129263 | orchestrator | PLAY [Apply role horizon] ****************************************************** 2026-01-06 01:00:41.129280 | orchestrator | 2026-01-06 01:00:41.129386 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2026-01-06 01:00:41.129407 | orchestrator | Tuesday 06 January 2026 00:58:43 +0000 (0:00:00.489) 0:00:01.067 ******* 2026-01-06 01:00:41.129427 | orchestrator | included: /ansible/roles/horizon/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 01:00:41.129446 | orchestrator | 2026-01-06 01:00:41.129462 | orchestrator | TASK [horizon : Ensuring config directories exist] ***************************** 2026-01-06 01:00:41.129479 | orchestrator | Tuesday 06 January 2026 00:58:43 +0000 (0:00:00.523) 0:00:01.591 ******* 2026-01-06 01:00:41.129550 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-01-06 01:00:41.129595 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 
'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-01-06 01:00:41.129653 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 
'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-01-06 01:00:41.129669 | orchestrator | 2026-01-06 01:00:41.129681 | orchestrator | TASK [horizon : Set 
empty custom policy] *************************************** 2026-01-06 01:00:41.129692 | orchestrator | Tuesday 06 January 2026 00:58:45 +0000 (0:00:01.339) 0:00:02.930 ******* 2026-01-06 01:00:41.129703 | orchestrator | ok: [testbed-node-0] 2026-01-06 01:00:41.129714 | orchestrator | ok: [testbed-node-1] 2026-01-06 01:00:41.129725 | orchestrator | ok: [testbed-node-2] 2026-01-06 01:00:41.129739 | orchestrator | 2026-01-06 01:00:41.129758 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2026-01-06 01:00:41.129776 | orchestrator | Tuesday 06 January 2026 00:58:45 +0000 (0:00:00.571) 0:00:03.502 ******* 2026-01-06 01:00:41.129794 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'cloudkitty', 'enabled': False})  2026-01-06 01:00:41.129813 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'heat', 'enabled': 'no'})  2026-01-06 01:00:41.129830 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'ironic', 'enabled': False})  2026-01-06 01:00:41.129849 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'masakari', 'enabled': False})  2026-01-06 01:00:41.129866 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'mistral', 'enabled': False})  2026-01-06 01:00:41.129883 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'tacker', 'enabled': False})  2026-01-06 01:00:41.129899 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'trove', 'enabled': False})  2026-01-06 01:00:41.129915 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'watcher', 'enabled': False})  2026-01-06 01:00:41.129931 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'cloudkitty', 'enabled': False})  2026-01-06 01:00:41.129947 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'heat', 'enabled': 'no'})  2026-01-06 01:00:41.129974 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'ironic', 'enabled': False})  2026-01-06 01:00:41.129991 
| orchestrator | skipping: [testbed-node-1] => (item={'name': 'masakari', 'enabled': False})  2026-01-06 01:00:41.130007 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'mistral', 'enabled': False})  2026-01-06 01:00:41.130094 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'tacker', 'enabled': False})  2026-01-06 01:00:41.130110 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'trove', 'enabled': False})  2026-01-06 01:00:41.130127 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'watcher', 'enabled': False})  2026-01-06 01:00:41.130143 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'cloudkitty', 'enabled': False})  2026-01-06 01:00:41.130160 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'heat', 'enabled': 'no'})  2026-01-06 01:00:41.130178 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'ironic', 'enabled': False})  2026-01-06 01:00:41.130194 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'masakari', 'enabled': False})  2026-01-06 01:00:41.130224 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'mistral', 'enabled': False})  2026-01-06 01:00:41.130241 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'tacker', 'enabled': False})  2026-01-06 01:00:41.130256 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'trove', 'enabled': False})  2026-01-06 01:00:41.130273 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'watcher', 'enabled': False})  2026-01-06 01:00:41.130315 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'ceilometer', 'enabled': 'yes'}) 2026-01-06 01:00:41.130335 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'cinder', 'enabled': 'yes'}) 2026-01-06 01:00:41.130353 | orchestrator | included: 
/ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'designate', 'enabled': True}) 2026-01-06 01:00:41.130371 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'glance', 'enabled': True}) 2026-01-06 01:00:41.130389 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'keystone', 'enabled': True}) 2026-01-06 01:00:41.130407 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'magnum', 'enabled': True}) 2026-01-06 01:00:41.130424 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'manila', 'enabled': True}) 2026-01-06 01:00:41.130442 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'neutron', 'enabled': True}) 2026-01-06 01:00:41.130603 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'nova', 'enabled': True}) 2026-01-06 01:00:41.130646 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'octavia', 'enabled': True}) 2026-01-06 01:00:41.130662 | orchestrator | 2026-01-06 01:00:41.130677 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2026-01-06 01:00:41.130694 | orchestrator | Tuesday 06 January 2026 00:58:46 +0000 (0:00:00.724) 0:00:04.227 ******* 2026-01-06 01:00:41.130712 | orchestrator | ok: [testbed-node-0] 2026-01-06 01:00:41.130729 | orchestrator | ok: [testbed-node-1] 2026-01-06 01:00:41.130745 | orchestrator | ok: [testbed-node-2] 
2026-01-06 01:00:41.130762 | orchestrator |
2026-01-06 01:00:41.130778 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-01-06 01:00:41.130807 | orchestrator | Tuesday 06 January 2026 00:58:46 +0000 (0:00:00.315) 0:00:04.542 *******
2026-01-06 01:00:41.130818 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.130828 | orchestrator |
2026-01-06 01:00:41.130837 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-01-06 01:00:41.130847 | orchestrator | Tuesday 06 January 2026 00:58:46 +0000 (0:00:00.130) 0:00:04.673 *******
2026-01-06 01:00:41.130857 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.130867 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:00:41.130877 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:00:41.130886 | orchestrator |
2026-01-06 01:00:41.130896 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-01-06 01:00:41.130905 | orchestrator | Tuesday 06 January 2026 00:58:47 +0000 (0:00:00.516) 0:00:05.189 *******
2026-01-06 01:00:41.130915 | orchestrator | ok: [testbed-node-0]
2026-01-06 01:00:41.130925 | orchestrator | ok: [testbed-node-1]
2026-01-06 01:00:41.130935 | orchestrator | ok: [testbed-node-2]
2026-01-06 01:00:41.130944 | orchestrator |
2026-01-06 01:00:41.130954 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-01-06 01:00:41.130964 | orchestrator | Tuesday 06 January 2026 00:58:47 +0000 (0:00:00.332) 0:00:05.521 *******
2026-01-06 01:00:41.130974 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.130983 | orchestrator |
2026-01-06 01:00:41.130993 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-01-06 01:00:41.131003 | orchestrator | Tuesday 06 January 2026 00:58:47 +0000 (0:00:00.135) 0:00:05.657 *******
2026-01-06 01:00:41.131013 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.131022 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:00:41.131032 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:00:41.131042 | orchestrator |
2026-01-06 01:00:41.131051 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-01-06 01:00:41.131061 | orchestrator | Tuesday 06 January 2026 00:58:48 +0000 (0:00:00.328) 0:00:05.986 *******
2026-01-06 01:00:41.131071 | orchestrator | ok: [testbed-node-0]
2026-01-06 01:00:41.131081 | orchestrator | ok: [testbed-node-1]
2026-01-06 01:00:41.131093 | orchestrator | ok: [testbed-node-2]
2026-01-06 01:00:41.131110 | orchestrator |
2026-01-06 01:00:41.131126 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-01-06 01:00:41.131142 | orchestrator | Tuesday 06 January 2026 00:58:48 +0000 (0:00:00.372) 0:00:06.359 *******
2026-01-06 01:00:41.131157 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.131174 | orchestrator |
2026-01-06 01:00:41.131191 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-01-06 01:00:41.131207 | orchestrator | Tuesday 06 January 2026 00:58:48 +0000 (0:00:00.381) 0:00:06.741 *******
2026-01-06 01:00:41.131235 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.131246 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:00:41.131256 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:00:41.131265 | orchestrator |
2026-01-06 01:00:41.131275 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-01-06 01:00:41.131285 | orchestrator | Tuesday 06 January 2026 00:58:49 +0000 (0:00:00.351) 0:00:07.093 *******
2026-01-06 01:00:41.131324 | orchestrator | ok: [testbed-node-0]
2026-01-06 01:00:41.131341 | orchestrator | ok: [testbed-node-1]
2026-01-06 01:00:41.131357 | orchestrator | ok: [testbed-node-2]
2026-01-06 01:00:41.131372 | orchestrator |
2026-01-06 01:00:41.131397 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-01-06 01:00:41.131413 | orchestrator | Tuesday 06 January 2026 00:58:49 +0000 (0:00:00.366) 0:00:07.459 *******
2026-01-06 01:00:41.131429 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.131446 | orchestrator |
2026-01-06 01:00:41.131462 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-01-06 01:00:41.131479 | orchestrator | Tuesday 06 January 2026 00:58:49 +0000 (0:00:00.146) 0:00:07.606 *******
2026-01-06 01:00:41.131510 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.131528 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:00:41.131545 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:00:41.131556 | orchestrator |
2026-01-06 01:00:41.131565 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-01-06 01:00:41.131575 | orchestrator | Tuesday 06 January 2026 00:58:50 +0000 (0:00:00.305) 0:00:07.911 *******
2026-01-06 01:00:41.131585 | orchestrator | ok: [testbed-node-0]
2026-01-06 01:00:41.131595 | orchestrator | ok: [testbed-node-1]
2026-01-06 01:00:41.131605 | orchestrator | ok: [testbed-node-2]
2026-01-06 01:00:41.131614 | orchestrator |
2026-01-06 01:00:41.131624 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-01-06 01:00:41.131634 | orchestrator | Tuesday 06 January 2026 00:58:50 +0000 (0:00:00.554) 0:00:08.466 *******
2026-01-06 01:00:41.131644 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.131653 | orchestrator |
2026-01-06 01:00:41.131663 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-01-06 01:00:41.131673 | orchestrator | Tuesday 06 January 2026 00:58:50 +0000 (0:00:00.153) 0:00:08.619 *******
2026-01-06 01:00:41.131682 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.131692 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:00:41.131702 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:00:41.131711 | orchestrator |
2026-01-06 01:00:41.131721 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-01-06 01:00:41.131731 | orchestrator | Tuesday 06 January 2026 00:58:51 +0000 (0:00:00.314) 0:00:08.934 *******
2026-01-06 01:00:41.131741 | orchestrator | ok: [testbed-node-0]
2026-01-06 01:00:41.131751 | orchestrator | ok: [testbed-node-1]
2026-01-06 01:00:41.131760 | orchestrator | ok: [testbed-node-2]
2026-01-06 01:00:41.131770 | orchestrator |
2026-01-06 01:00:41.131780 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-01-06 01:00:41.131790 | orchestrator | Tuesday 06 January 2026 00:58:51 +0000 (0:00:00.375) 0:00:09.309 *******
2026-01-06 01:00:41.131799 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.131809 | orchestrator |
2026-01-06 01:00:41.131818 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-01-06 01:00:41.131828 | orchestrator | Tuesday 06 January 2026 00:58:51 +0000 (0:00:00.141) 0:00:09.450 *******
2026-01-06 01:00:41.131838 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.131847 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:00:41.131857 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:00:41.131867 | orchestrator |
2026-01-06 01:00:41.131876 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-01-06 01:00:41.131886 | orchestrator | Tuesday 06 January 2026 00:58:51 +0000 (0:00:00.311) 0:00:09.762 *******
2026-01-06 01:00:41.131896 | orchestrator | ok: [testbed-node-0]
2026-01-06 01:00:41.131905 | orchestrator | ok: [testbed-node-1]
2026-01-06 01:00:41.131915 | orchestrator | ok: [testbed-node-2]
2026-01-06 01:00:41.131925 | orchestrator |
2026-01-06 01:00:41.131934 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-01-06 01:00:41.131944 | orchestrator | Tuesday 06 January 2026 00:58:52 +0000 (0:00:00.598) 0:00:10.361 *******
2026-01-06 01:00:41.131953 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.131963 | orchestrator |
2026-01-06 01:00:41.131973 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-01-06 01:00:41.131985 | orchestrator | Tuesday 06 January 2026 00:58:52 +0000 (0:00:00.126) 0:00:10.487 *******
2026-01-06 01:00:41.132001 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.132018 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:00:41.132040 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:00:41.132061 | orchestrator |
2026-01-06 01:00:41.132077 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-01-06 01:00:41.132092 | orchestrator | Tuesday 06 January 2026 00:58:52 +0000 (0:00:00.312) 0:00:10.800 *******
2026-01-06 01:00:41.132109 | orchestrator | ok: [testbed-node-0]
2026-01-06 01:00:41.132136 | orchestrator | ok: [testbed-node-1]
2026-01-06 01:00:41.132146 | orchestrator | ok: [testbed-node-2]
2026-01-06 01:00:41.132156 | orchestrator |
2026-01-06 01:00:41.132166 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-01-06 01:00:41.132175 | orchestrator | Tuesday 06 January 2026 00:58:53 +0000 (0:00:00.364) 0:00:11.164 *******
2026-01-06 01:00:41.132185 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.132195 | orchestrator |
2026-01-06 01:00:41.132204 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-01-06 01:00:41.132214 | orchestrator | Tuesday 06 January 2026 00:58:53 +0000 (0:00:00.147) 0:00:11.311 *******
2026-01-06 01:00:41.132224 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.132233 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:00:41.132243 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:00:41.132253 | orchestrator |
2026-01-06 01:00:41.132263 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-01-06 01:00:41.132273 | orchestrator | Tuesday 06 January 2026 00:58:53 +0000 (0:00:00.298) 0:00:11.609 *******
2026-01-06 01:00:41.132283 | orchestrator | ok: [testbed-node-0]
2026-01-06 01:00:41.132317 | orchestrator | ok: [testbed-node-1]
2026-01-06 01:00:41.132327 | orchestrator | ok: [testbed-node-2]
2026-01-06 01:00:41.132337 | orchestrator |
2026-01-06 01:00:41.132356 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-01-06 01:00:41.132367 | orchestrator | Tuesday 06 January 2026 00:58:54 +0000 (0:00:00.653) 0:00:12.263 *******
2026-01-06 01:00:41.132376 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.132386 | orchestrator |
2026-01-06 01:00:41.132396 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-01-06 01:00:41.132406 | orchestrator | Tuesday 06 January 2026 00:58:54 +0000 (0:00:00.144) 0:00:12.408 *******
2026-01-06 01:00:41.132415 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.132425 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:00:41.132435 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:00:41.132444 | orchestrator |
2026-01-06 01:00:41.132461 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-01-06 01:00:41.132471 | orchestrator | Tuesday 06 January 2026 00:58:54 +0000 (0:00:00.317) 0:00:12.726 *******
2026-01-06 01:00:41.132481 | orchestrator | ok: [testbed-node-0]
2026-01-06 01:00:41.132491 | orchestrator | ok: [testbed-node-1]
2026-01-06 01:00:41.132501 | orchestrator | ok: [testbed-node-2]
2026-01-06 01:00:41.132510 | orchestrator |
2026-01-06 01:00:41.132520 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-01-06 01:00:41.132530 | orchestrator | Tuesday 06 January 2026 00:58:55 +0000 (0:00:00.341) 0:00:13.067 *******
2026-01-06 01:00:41.132539 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.132549 | orchestrator |
2026-01-06 01:00:41.132559 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-01-06 01:00:41.132568 | orchestrator | Tuesday 06 January 2026 00:58:55 +0000 (0:00:00.150) 0:00:13.218 *******
2026-01-06 01:00:41.132578 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.132588 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:00:41.132598 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:00:41.132607 | orchestrator |
2026-01-06 01:00:41.132617 | orchestrator | TASK [horizon : Copying over config.json files for services] *******************
2026-01-06 01:00:41.132626 | orchestrator | Tuesday 06 January 2026 00:58:55 +0000 (0:00:00.535) 0:00:13.753 *******
2026-01-06 01:00:41.132636 | orchestrator | changed: [testbed-node-2]
2026-01-06 01:00:41.132646 | orchestrator | changed: [testbed-node-1]
2026-01-06 01:00:41.132655 | orchestrator | changed: [testbed-node-0]
2026-01-06 01:00:41.132665 | orchestrator |
2026-01-06 01:00:41.132675 | orchestrator | TASK [horizon : Copying over horizon.conf] *************************************
2026-01-06 01:00:41.132684 | orchestrator | Tuesday 06 January 2026 00:58:57 +0000 (0:00:01.780) 0:00:15.534 *******
2026-01-06 01:00:41.132694 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/horizon.conf.j2)
2026-01-06 01:00:41.132716 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/horizon.conf.j2)
2026-01-06 01:00:41.132740 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/horizon.conf.j2)
2026-01-06 01:00:41.132760 | orchestrator |
2026-01-06 01:00:41.132775 | orchestrator | TASK [horizon : Copying over kolla-settings.py] ********************************
2026-01-06 01:00:41.132792 | orchestrator | Tuesday 06 January 2026 00:58:59 +0000 (0:00:01.962) 0:00:17.496 *******
2026-01-06 01:00:41.132808 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2)
2026-01-06 01:00:41.132824 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2)
2026-01-06 01:00:41.132840 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2)
2026-01-06 01:00:41.132855 | orchestrator |
2026-01-06 01:00:41.132869 | orchestrator | TASK [horizon : Copying over custom-settings.py] *******************************
2026-01-06 01:00:41.132885 | orchestrator | Tuesday 06 January 2026 00:59:02 +0000 (0:00:02.503) 0:00:20.000 *******
2026-01-06 01:00:41.132901 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2)
2026-01-06 01:00:41.132919 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2)
2026-01-06 01:00:41.132930 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2)
2026-01-06 01:00:41.132940 | orchestrator |
2026-01-06 01:00:41.132950 | orchestrator | TASK [horizon : Copying over existing policy file] *****************************
2026-01-06 01:00:41.132960 | orchestrator | Tuesday 06 January 2026 00:59:04 +0000 (0:00:01.980) 0:00:21.981 *******
2026-01-06 01:00:41.132969 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.132984 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:00:41.133006 | orchestrator | skipping:
[testbed-node-2] 2026-01-06 01:00:41.133026 | orchestrator | 2026-01-06 01:00:41.133042 | orchestrator | TASK [horizon : Copying over custom themes] ************************************ 2026-01-06 01:00:41.133058 | orchestrator | Tuesday 06 January 2026 00:59:04 +0000 (0:00:00.255) 0:00:22.237 ******* 2026-01-06 01:00:41.133075 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:00:41.133091 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:00:41.133101 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:00:41.133111 | orchestrator | 2026-01-06 01:00:41.133121 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2026-01-06 01:00:41.133131 | orchestrator | Tuesday 06 January 2026 00:59:04 +0000 (0:00:00.259) 0:00:22.497 ******* 2026-01-06 01:00:41.133141 | orchestrator | included: /ansible/roles/horizon/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 01:00:41.133151 | orchestrator | 2026-01-06 01:00:41.133160 | orchestrator | TASK [service-cert-copy : horizon | Copying over extra CA certificates] ******** 2026-01-06 01:00:41.133170 | orchestrator | Tuesday 06 January 2026 00:59:05 +0000 (0:00:00.671) 0:00:23.168 ******* 2026-01-06 01:00:41.133204 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-01-06 01:00:41.133236 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 
'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-01-06 01:00:41.133254 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 
'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 
'custom_member_list': []}}}}) 2026-01-06 01:00:41.133272 | orchestrator | 2026-01-06 01:00:41.133282 | orchestrator | TASK [service-cert-copy : horizon | Copying over backend internal TLS certificate] *** 2026-01-06 01:00:41.133329 | orchestrator | Tuesday 06 January 2026 00:59:06 +0000 (0:00:01.409) 0:00:24.578 ******* 2026-01-06 01:00:41.133356 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 
'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-01-06 01:00:41.133376 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:00:41.133387 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': 
True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-01-06 01:00:41.133398 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:00:41.133422 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': 
{'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-01-06 01:00:41.133444 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:00:41.133454 | orchestrator | 2026-01-06 01:00:41.133464 | orchestrator | TASK [service-cert-copy : horizon | Copying over backend internal TLS key] ***** 2026-01-06 01:00:41.133474 | orchestrator | Tuesday 06 January 2026 00:59:07 +0000 (0:00:00.694) 0:00:25.272 ******* 2026-01-06 01:00:41.133485 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 
'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-01-06 01:00:41.133495 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:00:41.133519 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': 
{'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back 
if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-01-06 01:00:41.133537 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:00:41.133548 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-01-06 01:00:41.133558 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:00:41.133568 | orchestrator | 2026-01-06 01:00:41.133578 | orchestrator | TASK [service-check-containers : horizon | Check containers] ******************* 2026-01-06 01:00:41.133587 | orchestrator | Tuesday 06 January 2026 00:59:08 +0000 (0:00:00.864) 0:00:26.136 ******* 2026-01-06 01:00:41.133611 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': 
['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-01-06 01:00:41.133635 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': 
'3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-01-06 01:00:41.133657 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 
'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-01-06 01:00:41.133669 | orchestrator | 2026-01-06 01:00:41.133679 | orchestrator | TASK [service-check-containers : horizon | Notify handlers to restart containers] *** 2026-01-06 01:00:41.133689 | orchestrator | Tuesday 06 January 2026 00:59:10 +0000 (0:00:01.853) 0:00:27.990 ******* 2026-01-06 01:00:41.133699 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 01:00:41.133709 | 
orchestrator |  "msg": "Notifying handlers" 2026-01-06 01:00:41.133719 | orchestrator | } 2026-01-06 01:00:41.133729 | orchestrator | changed: [testbed-node-1] => { 2026-01-06 01:00:41.133738 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 01:00:41.133748 | orchestrator | } 2026-01-06 01:00:41.133758 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 01:00:41.133767 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 01:00:41.133777 | orchestrator | } 2026-01-06 01:00:41.133787 | orchestrator | 2026-01-06 01:00:41.133797 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-01-06 01:00:41.133807 | orchestrator | Tuesday 06 January 2026 00:59:10 +0000 (0:00:00.374) 0:00:28.364 ******* 2026-01-06 01:00:41.133830 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ 
}'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-01-06 01:00:41.133848 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:00:41.133859 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': 
'3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-01-06 01:00:41.133875 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:00:41.133898 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2025.1', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 
'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-01-06 01:00:41.133910 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:00:41.133919 | orchestrator | 2026-01-06 01:00:41.133929 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2026-01-06 01:00:41.133939 | orchestrator | Tuesday 06 January 2026 00:59:11 +0000 (0:00:00.928) 
0:00:29.293 *******
2026-01-06 01:00:41.133949 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:00:41.133959 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:00:41.133969 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:00:41.133984 | orchestrator |
2026-01-06 01:00:41.134003 | orchestrator | TASK [horizon : include_tasks] *************************************************
2026-01-06 01:00:41.134097 | orchestrator | Tuesday 06 January 2026 00:59:11 +0000 (0:00:00.532) 0:00:29.825 *******
2026-01-06 01:00:41.134121 | orchestrator | included: /ansible/roles/horizon/tasks/bootstrap.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 01:00:41.134140 | orchestrator |
2026-01-06 01:00:41.134157 | orchestrator | TASK [horizon : Creating Horizon database] *************************************
2026-01-06 01:00:41.134174 | orchestrator | Tuesday 06 January 2026 00:59:12 +0000 (0:00:00.595) 0:00:30.421 *******
2026-01-06 01:00:41.134190 | orchestrator | changed: [testbed-node-0]
2026-01-06 01:00:41.134207 | orchestrator |
2026-01-06 01:00:41.134224 | orchestrator | TASK [horizon : Creating Horizon database user and setting permissions] ********
2026-01-06 01:00:41.134240 | orchestrator | Tuesday 06 January 2026 00:59:14 +0000 (0:00:02.355) 0:00:32.777 *******
2026-01-06 01:00:41.134265 | orchestrator | changed: [testbed-node-0]
2026-01-06 01:00:41.134284 | orchestrator |
2026-01-06 01:00:41.134326 | orchestrator | TASK [horizon : Running Horizon bootstrap container] ***************************
2026-01-06 01:00:41.134343 | orchestrator | Tuesday 06 January 2026 00:59:17 +0000 (0:00:02.192) 0:00:34.969 *******
2026-01-06 01:00:41.134374 | orchestrator | changed: [testbed-node-0]
2026-01-06 01:00:41.134390 | orchestrator |
2026-01-06 01:00:41.134405 | orchestrator | TASK [horizon : Flush handlers] ************************************************
2026-01-06 01:00:41.134420 | orchestrator | Tuesday 06 January 2026 00:59:34 +0000 (0:00:17.280) 0:00:52.250 *******
2026-01-06 01:00:41.134435 | orchestrator |
2026-01-06 01:00:41.134452 | orchestrator | TASK [horizon : Flush handlers] ************************************************
2026-01-06 01:00:41.134469 | orchestrator | Tuesday 06 January 2026 00:59:34 +0000 (0:00:00.060) 0:00:52.311 *******
2026-01-06 01:00:41.134484 | orchestrator |
2026-01-06 01:00:41.134500 | orchestrator | TASK [horizon : Flush handlers] ************************************************
2026-01-06 01:00:41.134517 | orchestrator | Tuesday 06 January 2026 00:59:34 +0000 (0:00:00.173) 0:00:52.485 *******
2026-01-06 01:00:41.134534 | orchestrator |
2026-01-06 01:00:41.134552 | orchestrator | RUNNING HANDLER [horizon : Restart horizon container] **************************
2026-01-06 01:00:41.134568 | orchestrator | Tuesday 06 January 2026 00:59:34 +0000 (0:00:00.061) 0:00:52.546 *******
2026-01-06 01:00:41.134584 | orchestrator | changed: [testbed-node-0]
2026-01-06 01:00:41.134601 | orchestrator | changed: [testbed-node-1]
2026-01-06 01:00:41.134615 | orchestrator | changed: [testbed-node-2]
2026-01-06 01:00:41.134631 | orchestrator |
2026-01-06 01:00:41.134646 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 01:00:41.134664 | orchestrator | testbed-node-0 : ok=38  changed=12  unreachable=0 failed=0 skipped=26  rescued=0 ignored=0
2026-01-06 01:00:41.134680 | orchestrator | testbed-node-1 : ok=35  changed=9  unreachable=0 failed=0 skipped=16  rescued=0 ignored=0
2026-01-06 01:00:41.134708 | orchestrator | testbed-node-2 : ok=35  changed=9  unreachable=0 failed=0 skipped=16  rescued=0 ignored=0
2026-01-06 01:00:41.134723 | orchestrator |
2026-01-06 01:00:41.134739 | orchestrator |
2026-01-06 01:00:41.134754 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 01:00:41.134770 | orchestrator | Tuesday 06 January 2026 01:00:40 +0000 (0:01:05.544) 0:01:58.091 *******
2026-01-06 01:00:41.134787 | orchestrator | ===============================================================================
2026-01-06 01:00:41.134803 | orchestrator | horizon : Restart horizon container ------------------------------------ 65.54s
2026-01-06 01:00:41.134830 | orchestrator | horizon : Running Horizon bootstrap container -------------------------- 17.28s
2026-01-06 01:00:41.134847 | orchestrator | horizon : Copying over kolla-settings.py -------------------------------- 2.50s
2026-01-06 01:00:41.134863 | orchestrator | horizon : Creating Horizon database ------------------------------------- 2.36s
2026-01-06 01:00:41.134879 | orchestrator | horizon : Creating Horizon database user and setting permissions -------- 2.19s
2026-01-06 01:00:41.134895 | orchestrator | horizon : Copying over custom-settings.py ------------------------------- 1.98s
2026-01-06 01:00:41.134910 | orchestrator | horizon : Copying over horizon.conf ------------------------------------- 1.96s
2026-01-06 01:00:41.134926 | orchestrator | service-check-containers : horizon | Check containers ------------------- 1.85s
2026-01-06 01:00:41.134942 | orchestrator | horizon : Copying over config.json files for services ------------------- 1.78s
2026-01-06 01:00:41.134958 | orchestrator | service-cert-copy : horizon | Copying over extra CA certificates -------- 1.41s
2026-01-06 01:00:41.134974 | orchestrator | horizon : Ensuring config directories exist ----------------------------- 1.34s
2026-01-06 01:00:41.134989 | orchestrator | service-check-containers : Include tasks -------------------------------- 0.93s
2026-01-06 01:00:41.135005 | orchestrator | service-cert-copy : horizon | Copying over backend internal TLS key ----- 0.86s
2026-01-06 01:00:41.135023 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.72s
2026-01-06 01:00:41.135040 | orchestrator | service-cert-copy : horizon | Copying over backend internal TLS certificate --- 0.69s
2026-01-06 01:00:41.135057 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.67s
2026-01-06 01:00:41.135088 | orchestrator | horizon : Update policy file name --------------------------------------- 0.65s
2026-01-06 01:00:41.135104 | orchestrator | horizon : Update policy file name --------------------------------------- 0.60s
2026-01-06 01:00:41.135120 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.60s
2026-01-06 01:00:41.135135 | orchestrator | horizon : Set empty custom policy --------------------------------------- 0.57s
2026-01-06 01:00:41.135150 | orchestrator | 2026-01-06 01:00:41 | INFO  | Wait 1 second(s) until the next check
2026-01-06 01:00:44.175731 | orchestrator | 2026-01-06 01:00:44 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED
2026-01-06 01:00:44.177992 | orchestrator | 2026-01-06 01:00:44 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED
2026-01-06 01:00:44.180529 | orchestrator | 2026-01-06 01:00:44 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED
2026-01-06 01:00:44.182857 | orchestrator | 2026-01-06 01:00:44 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED
2026-01-06 01:00:44.184323 | orchestrator | 2026-01-06 01:00:44 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED
2026-01-06 01:00:44.184488 | orchestrator | 2026-01-06 01:00:44 | INFO  | Wait 1 second(s) until the next check
2026-01-06 01:00:47.241063 | orchestrator | 2026-01-06 01:00:47 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED
2026-01-06 01:00:47.242532 | orchestrator | 2026-01-06 01:00:47 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED
2026-01-06 01:00:47.245021 | orchestrator | 2026-01-06 01:00:47 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED
2026-01-06 01:00:47.246218 | orchestrator | 2026-01-06 01:00:47 |
INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED
2026-01-06 01:00:47.248362 | orchestrator | 2026-01-06 01:00:47 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED
2026-01-06 01:00:47.248388 | orchestrator | 2026-01-06 01:00:47 | INFO  | Wait 1 second(s) until the next check
2026-01-06 01:00:50.304014 | orchestrator | 2026-01-06 01:00:50 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED
2026-01-06 01:00:50.304368 | orchestrator | 2026-01-06 01:00:50 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED
2026-01-06 01:00:50.304900 | orchestrator | 2026-01-06 01:00:50 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED
2026-01-06 01:00:50.306006 | orchestrator | 2026-01-06 01:00:50 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED
2026-01-06 01:00:50.306956 | orchestrator | 2026-01-06 01:00:50 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED
2026-01-06 01:00:50.307249 | orchestrator | 2026-01-06 01:00:50 | INFO  | Wait 1 second(s) until the next check
2026-01-06 01:00:53.352796 | orchestrator | 2026-01-06 01:00:53 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED
2026-01-06 01:00:53.354002 | orchestrator | 2026-01-06 01:00:53 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state STARTED
2026-01-06 01:00:53.356419 | orchestrator | 2026-01-06 01:00:53 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED
2026-01-06 01:00:53.358520 | orchestrator | 2026-01-06 01:00:53 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state STARTED
2026-01-06 01:00:53.360524 | orchestrator | 2026-01-06 01:00:53 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED
2026-01-06 01:00:53.360555 | orchestrator | 2026-01-06 01:00:53 | INFO  | Wait 1 second(s) until the next check
2026-01-06 01:00:56.409637 | orchestrator | 2026-01-06 01:00:56 | INFO  | Task f97f4b23-7614-481d-be0f-6747b044af29 is in state STARTED
2026-01-06 01:00:56.409783 | orchestrator | 2026-01-06 01:00:56 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state STARTED
2026-01-06 01:00:56.411345 | orchestrator | 2026-01-06 01:00:56 | INFO  | Task ca69c0b7-8695-491f-959e-72a8327b75c6 is in state SUCCESS
2026-01-06 01:00:56.412534 | orchestrator | 2026-01-06 01:00:56 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED
2026-01-06 01:00:56.413692 | orchestrator | 2026-01-06 01:00:56 | INFO  | Task be494a1a-3ac9-419f-978f-a977e9e7aa98 is in state SUCCESS
2026-01-06 01:00:56.414686 | orchestrator |
2026-01-06 01:00:56.414739 | orchestrator |
2026-01-06 01:00:56.414757 | orchestrator | PLAY [Apply role cephclient] ***************************************************
2026-01-06 01:00:56.414776 | orchestrator |
2026-01-06 01:00:56.414794 | orchestrator | TASK [osism.services.cephclient : Include container tasks] *********************
2026-01-06 01:00:56.414812 | orchestrator | Tuesday 06 January 2026 00:59:57 +0000 (0:00:00.287) 0:00:00.287 *******
2026-01-06 01:00:56.414830 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/cephclient/tasks/container.yml for testbed-manager
2026-01-06 01:00:56.414848 | orchestrator |
2026-01-06 01:00:56.414865 | orchestrator | TASK [osism.services.cephclient : Create required directories] *****************
2026-01-06 01:00:56.414883 | orchestrator | Tuesday 06 January 2026 00:59:57 +0000 (0:00:00.238) 0:00:00.525 *******
2026-01-06 01:00:56.414901 | orchestrator | changed: [testbed-manager] => (item=/opt/cephclient/configuration)
2026-01-06 01:00:56.414918 | orchestrator | changed: [testbed-manager] => (item=/opt/cephclient/data)
2026-01-06 01:00:56.414936 | orchestrator | ok: [testbed-manager] => (item=/opt/cephclient)
2026-01-06 01:00:56.414954 | orchestrator |
2026-01-06 01:00:56.414970 | orchestrator | TASK [osism.services.cephclient : Copy configuration files] ********************
2026-01-06 01:00:56.414987 | orchestrator | Tuesday 06 January 2026 00:59:59 +0000 (0:00:01.388) 0:00:01.913 *******
2026-01-06 01:00:56.415005 | orchestrator | changed: [testbed-manager] => (item={'src': 'ceph.conf.j2', 'dest': '/opt/cephclient/configuration/ceph.conf'})
2026-01-06 01:00:56.415023 | orchestrator |
2026-01-06 01:00:56.415040 | orchestrator | TASK [osism.services.cephclient : Copy keyring file] ***************************
2026-01-06 01:00:56.415057 | orchestrator | Tuesday 06 January 2026 01:00:00 +0000 (0:00:00.892) 0:00:03.472 *******
2026-01-06 01:00:56.415074 | orchestrator | changed: [testbed-manager]
2026-01-06 01:00:56.415090 | orchestrator |
2026-01-06 01:00:56.415107 | orchestrator | TASK [osism.services.cephclient : Copy docker-compose.yml file] ****************
2026-01-06 01:00:56.415424 | orchestrator | Tuesday 06 January 2026 01:00:01 +0000 (0:00:00.876) 0:00:04.365 *******
2026-01-06 01:00:56.415454 | orchestrator | changed: [testbed-manager]
2026-01-06 01:00:56.415472 | orchestrator |
2026-01-06 01:00:56.415490 | orchestrator | TASK [osism.services.cephclient : Manage cephclient service] *******************
2026-01-06 01:00:56.415508 | orchestrator | Tuesday 06 January 2026 01:00:02 +0000 (0:00:00.876) 0:00:05.241 *******
2026-01-06 01:00:56.415526 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage cephclient service (10 retries left).
2026-01-06 01:00:56.415543 | orchestrator | ok: [testbed-manager]
2026-01-06 01:00:56.415560 | orchestrator |
2026-01-06 01:00:56.415577 | orchestrator | TASK [osism.services.cephclient : Copy wrapper scripts] ************************
2026-01-06 01:00:56.415594 | orchestrator | Tuesday 06 January 2026 01:00:44 +0000 (0:00:42.006) 0:00:47.248 *******
2026-01-06 01:00:56.415610 | orchestrator | changed: [testbed-manager] => (item=ceph)
2026-01-06 01:00:56.415628 | orchestrator | changed: [testbed-manager] => (item=ceph-authtool)
2026-01-06 01:00:56.415645 | orchestrator | changed: [testbed-manager] => (item=rados)
2026-01-06 01:00:56.415662 | orchestrator | changed: [testbed-manager] => (item=radosgw-admin)
2026-01-06 01:00:56.415679 | orchestrator | changed: [testbed-manager] => (item=rbd)
2026-01-06 01:00:56.415696 | orchestrator |
2026-01-06 01:00:56.415712 | orchestrator | TASK [osism.services.cephclient : Remove old wrapper scripts] ******************
2026-01-06 01:00:56.415765 | orchestrator | Tuesday 06 January 2026 01:00:48 +0000 (0:00:04.132) 0:00:51.380 *******
2026-01-06 01:00:56.415781 | orchestrator | ok: [testbed-manager] => (item=crushtool)
2026-01-06 01:00:56.415798 | orchestrator |
2026-01-06 01:00:56.415816 | orchestrator | TASK [osism.services.cephclient : Include package tasks] ***********************
2026-01-06 01:00:56.415834 | orchestrator | Tuesday 06 January 2026 01:00:48 +0000 (0:00:00.486) 0:00:51.867 *******
2026-01-06 01:00:56.415851 | orchestrator | skipping: [testbed-manager]
2026-01-06 01:00:56.415869 | orchestrator |
2026-01-06 01:00:56.415887 | orchestrator | TASK [osism.services.cephclient : Include rook task] ***************************
2026-01-06 01:00:56.415907 | orchestrator | Tuesday 06 January 2026 01:00:49 +0000 (0:00:00.148) 0:00:52.016 *******
2026-01-06 01:00:56.415918 | orchestrator | skipping: [testbed-manager]
2026-01-06 01:00:56.415929 | orchestrator |
2026-01-06 01:00:56.415940 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Restart cephclient service] *******
2026-01-06 01:00:56.415951 | orchestrator | Tuesday 06 January 2026 01:00:49 +0000 (0:00:00.563) 0:00:52.579 *******
2026-01-06 01:00:56.415962 | orchestrator | changed: [testbed-manager]
2026-01-06 01:00:56.415973 | orchestrator |
2026-01-06 01:00:56.415984 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Ensure that all containers are up] ***
2026-01-06 01:00:56.415994 | orchestrator | Tuesday 06 January 2026 01:00:51 +0000 (0:00:01.408) 0:00:53.987 *******
2026-01-06 01:00:56.416005 | orchestrator | changed: [testbed-manager]
2026-01-06 01:00:56.416016 | orchestrator |
2026-01-06 01:00:56.416042 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Wait for an healthy service] ******
2026-01-06 01:00:56.416056 | orchestrator | Tuesday 06 January 2026 01:00:51 +0000 (0:00:00.846) 0:00:54.834 *******
2026-01-06 01:00:56.416068 | orchestrator | changed: [testbed-manager]
2026-01-06 01:00:56.416082 | orchestrator |
2026-01-06 01:00:56.416095 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Copy bash completion scripts] *****
2026-01-06 01:00:56.416107 | orchestrator | Tuesday 06 January 2026 01:00:52 +0000 (0:00:00.604) 0:00:55.439 *******
2026-01-06 01:00:56.416120 | orchestrator | ok: [testbed-manager] => (item=ceph)
2026-01-06 01:00:56.416133 | orchestrator | ok: [testbed-manager] => (item=rados)
2026-01-06 01:00:56.416146 | orchestrator | ok: [testbed-manager] => (item=radosgw-admin)
2026-01-06 01:00:56.416159 | orchestrator | ok: [testbed-manager] => (item=rbd)
2026-01-06 01:00:56.416173 | orchestrator |
2026-01-06 01:00:56.416185 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 01:00:56.416200 | orchestrator | testbed-manager : ok=12  changed=8  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-01-06 01:00:56.416214 | orchestrator |
2026-01-06 01:00:56.416227 | orchestrator | 2026-01-06
01:00:56.416259 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 01:00:56.416272 | orchestrator | Tuesday 06 January 2026 01:00:54 +0000 (0:00:01.550) 0:00:56.989 ******* 2026-01-06 01:00:56.416395 | orchestrator | =============================================================================== 2026-01-06 01:00:56.416415 | orchestrator | osism.services.cephclient : Manage cephclient service ------------------ 42.01s 2026-01-06 01:00:56.416432 | orchestrator | osism.services.cephclient : Copy wrapper scripts ------------------------ 4.13s 2026-01-06 01:00:56.416451 | orchestrator | osism.services.cephclient : Copy configuration files -------------------- 1.56s 2026-01-06 01:00:56.416468 | orchestrator | osism.services.cephclient : Copy bash completion scripts ---------------- 1.55s 2026-01-06 01:00:56.416487 | orchestrator | osism.services.cephclient : Restart cephclient service ------------------ 1.41s 2026-01-06 01:00:56.416504 | orchestrator | osism.services.cephclient : Create required directories ----------------- 1.39s 2026-01-06 01:00:56.416520 | orchestrator | osism.services.cephclient : Copy keyring file --------------------------- 0.89s 2026-01-06 01:00:56.416537 | orchestrator | osism.services.cephclient : Copy docker-compose.yml file ---------------- 0.88s 2026-01-06 01:00:56.416553 | orchestrator | osism.services.cephclient : Ensure that all containers are up ----------- 0.85s 2026-01-06 01:00:56.416585 | orchestrator | osism.services.cephclient : Wait for an healthy service ----------------- 0.60s 2026-01-06 01:00:56.416604 | orchestrator | osism.services.cephclient : Include rook task --------------------------- 0.56s 2026-01-06 01:00:56.416621 | orchestrator | osism.services.cephclient : Remove old wrapper scripts ------------------ 0.49s 2026-01-06 01:00:56.416641 | orchestrator | osism.services.cephclient : Include container tasks --------------------- 0.24s 2026-01-06 01:00:56.416660 | 
orchestrator | osism.services.cephclient : Include package tasks ----------------------- 0.15s 2026-01-06 01:00:56.416679 | orchestrator | 2026-01-06 01:00:56.416693 | orchestrator | 2026-01-06 01:00:56.416704 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-01-06 01:00:56.416715 | orchestrator | 2026-01-06 01:00:56.416726 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-01-06 01:00:56.416736 | orchestrator | Tuesday 06 January 2026 00:59:49 +0000 (0:00:00.347) 0:00:00.347 ******* 2026-01-06 01:00:56.416747 | orchestrator | ok: [testbed-node-0] 2026-01-06 01:00:56.416758 | orchestrator | ok: [testbed-node-1] 2026-01-06 01:00:56.416769 | orchestrator | ok: [testbed-node-2] 2026-01-06 01:00:56.416780 | orchestrator | 2026-01-06 01:00:56.416792 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-01-06 01:00:56.416803 | orchestrator | Tuesday 06 January 2026 00:59:49 +0000 (0:00:00.323) 0:00:00.671 ******* 2026-01-06 01:00:56.416814 | orchestrator | ok: [testbed-node-0] => (item=enable_designate_True) 2026-01-06 01:00:56.416825 | orchestrator | ok: [testbed-node-1] => (item=enable_designate_True) 2026-01-06 01:00:56.416836 | orchestrator | ok: [testbed-node-2] => (item=enable_designate_True) 2026-01-06 01:00:56.416847 | orchestrator | 2026-01-06 01:00:56.416857 | orchestrator | PLAY [Apply role designate] **************************************************** 2026-01-06 01:00:56.416868 | orchestrator | 2026-01-06 01:00:56.416879 | orchestrator | TASK [designate : include_tasks] *********************************************** 2026-01-06 01:00:56.416890 | orchestrator | Tuesday 06 January 2026 00:59:50 +0000 (0:00:00.473) 0:00:01.144 ******* 2026-01-06 01:00:56.416901 | orchestrator | included: /ansible/roles/designate/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 01:00:56.416913 | 
orchestrator | 2026-01-06 01:00:56.416924 | orchestrator | TASK [service-ks-register : designate | Creating/deleting services] ************ 2026-01-06 01:00:56.416933 | orchestrator | Tuesday 06 January 2026 00:59:50 +0000 (0:00:00.574) 0:00:01.719 ******* 2026-01-06 01:00:56.416943 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (5 retries left). 2026-01-06 01:00:56.416953 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (4 retries left). 2026-01-06 01:00:56.416962 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (3 retries left). 2026-01-06 01:00:56.416972 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (2 retries left). 2026-01-06 01:00:56.416981 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (1 retries left). 2026-01-06 01:00:56.417036 | orchestrator | failed: [testbed-node-0] (item=designate (dns)) => {"action": "openstack.cloud.catalog_service", "ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Designate DNS Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9001"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9001"}], "name": "designate", "type": "dns"}, "module_stderr": "Failed to discover available identity versions when contacting https://api-int.testbed.osism.xyz:5000. 
Attempting to parse version from URL.\nTraceback (most recent call last):\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 136, in _do_create_plugin\n disc = self.get_discovery(\n ^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 703, in get_discovery\n return discover.get_discovery(\n ^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 1742, in get_discovery\n disc = Discover(session, url, authenticated=authenticated)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 585, in __init__\n self._data = get_version_data(\n ^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 114, in get_version_data\n resp = session.get(url, headers=headers, authenticated=authenticated)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1320, in get\n return self.request(url, 'GET', **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1118, in request\n raise exceptions.from_response(resp, method, url)\nkeystoneauth1.exceptions.http.ServiceUnavailable: Service Unavailable (HTTP 503)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/tmp/ansible-tmp-1767661254.5654275-3361-104178616451387/AnsiballZ_catalog_service.py\", line 107, in \n _ansiballz_main()\n File \"/tmp/ansible-tmp-1767661254.5654275-3361-104178616451387/AnsiballZ_catalog_service.py\", line 99, in _ansiballz_main\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n File \"/tmp/ansible-tmp-1767661254.5654275-3361-104178616451387/AnsiballZ_catalog_service.py\", line 47, in 
invoke_module\n runpy.run_module(mod_name='ansible_collections.openstack.cloud.plugins.modules.catalog_service', init_globals=dict(_module_fqn='ansible_collections.openstack.cloud.plugins.modules.catalog_service', _modlib_path=modlib_path),\n File \"\", line 226, in run_module\n File \"\", line 98, in _run_module_code\n File \"\", line 88, in _run_code\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_cz8n8qhn/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 211, in \n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_cz8n8qhn/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 207, in main\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_cz8n8qhn/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/module_utils/openstack.py\", line 417, in __call__\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_cz8n8qhn/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 113, in run\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_cz8n8qhn/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 175, in _find\n File \"/opt/ansible/lib/python3.12/site-packages/openstack/service_description.py\", line 91, in __get__\n proxy = self._make_proxy(instance)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/openstack/service_description.py\", line 289, in _make_proxy\n found_version = temp_adapter.get_api_major_version()\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/adapter.py\", line 403, in get_api_major_version\n return self.session.get_api_major_version(auth or self.auth, **kwargs)\n 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1478, in get_api_major_version\n return auth.get_api_major_version(self, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 573, in get_api_major_version\n data = get_endpoint_data(discover_versions=discover_versions)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 296, in get_endpoint_data\n service_catalog = self.get_access(session).service_catalog\n ^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 139, in get_access\n self.auth_ref = self.get_auth_ref(session)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 221, in get_auth_ref\n plugin = self._do_create_plugin(session)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 163, in _do_create_plugin\n raise exceptions.DiscoveryFailure(\nkeystoneauth1.exceptions.discovery.DiscoveryFailure: Could not find versioned identity endpoints when attempting to authenticate. Please check that your auth_url is correct. 
Service Unavailable (HTTP 503)\n", "module_stdout": "", "msg": "MODULE FAILURE: No start of json char found\nSee stdout/stderr for the exact error", "rc": 1} 2026-01-06 01:00:56.417069 | orchestrator | 2026-01-06 01:00:56.417080 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 01:00:56.417090 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0 2026-01-06 01:00:56.417100 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 01:00:56.417111 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 01:00:56.417121 | orchestrator | 2026-01-06 01:00:56.417131 | orchestrator | 2026-01-06 01:00:56.417141 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 01:00:56.417151 | orchestrator | Tuesday 06 January 2026 01:00:55 +0000 (0:01:04.951) 0:01:06.670 ******* 2026-01-06 01:00:56.417161 | orchestrator | =============================================================================== 2026-01-06 01:00:56.417171 | orchestrator | service-ks-register : designate | Creating/deleting services ----------- 64.95s 2026-01-06 01:00:56.417180 | orchestrator | designate : include_tasks ----------------------------------------------- 0.57s 2026-01-06 01:00:56.417190 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.47s 2026-01-06 01:00:56.417200 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.32s 2026-01-06 01:00:56.417210 | orchestrator | 2026-01-06 01:00:56 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:56.417220 | orchestrator | 2026-01-06 01:00:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:00:59.470695 | orchestrator | 2026-01-06 01:00:59 | INFO  | Task 
f97f4b23-7614-481d-be0f-6747b044af29 is in state STARTED 2026-01-06 01:00:59.470939 | orchestrator | 2026-01-06 01:00:59 | INFO  | Task e0be9c97-ba93-4002-9818-fb5df445a77d is in state SUCCESS 2026-01-06 01:00:59.471025 | orchestrator | 2026-01-06 01:00:59.473366 | orchestrator | 2026-01-06 01:00:59.473427 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-01-06 01:00:59.473441 | orchestrator | 2026-01-06 01:00:59.473453 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-01-06 01:00:59.473465 | orchestrator | Tuesday 06 January 2026 00:59:49 +0000 (0:00:00.312) 0:00:00.312 ******* 2026-01-06 01:00:59.473477 | orchestrator | ok: [testbed-node-0] 2026-01-06 01:00:59.473489 | orchestrator | ok: [testbed-node-1] 2026-01-06 01:00:59.473500 | orchestrator | ok: [testbed-node-2] 2026-01-06 01:00:59.473512 | orchestrator | 2026-01-06 01:00:59.473551 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-01-06 01:00:59.473579 | orchestrator | Tuesday 06 January 2026 00:59:49 +0000 (0:00:00.396) 0:00:00.709 ******* 2026-01-06 01:00:59.473591 | orchestrator | ok: [testbed-node-0] => (item=enable_barbican_True) 2026-01-06 01:00:59.473602 | orchestrator | ok: [testbed-node-1] => (item=enable_barbican_True) 2026-01-06 01:00:59.473613 | orchestrator | ok: [testbed-node-2] => (item=enable_barbican_True) 2026-01-06 01:00:59.473625 | orchestrator | 2026-01-06 01:00:59.473636 | orchestrator | PLAY [Apply role barbican] ***************************************************** 2026-01-06 01:00:59.473647 | orchestrator | 2026-01-06 01:00:59.473658 | orchestrator | TASK [barbican : include_tasks] ************************************************ 2026-01-06 01:00:59.473669 | orchestrator | Tuesday 06 January 2026 00:59:50 +0000 (0:00:00.530) 0:00:01.239 ******* 2026-01-06 01:00:59.473680 | orchestrator | included: 
/ansible/roles/barbican/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 01:00:59.473692 | orchestrator | 2026-01-06 01:00:59.473703 | orchestrator | TASK [service-ks-register : barbican | Creating/deleting services] ************* 2026-01-06 01:00:59.473714 | orchestrator | Tuesday 06 January 2026 00:59:51 +0000 (0:00:00.607) 0:00:01.847 ******* 2026-01-06 01:00:59.473726 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (5 retries left). 2026-01-06 01:00:59.473737 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (4 retries left). 2026-01-06 01:00:59.473748 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (3 retries left). 2026-01-06 01:00:59.473759 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (2 retries left). 2026-01-06 01:00:59.473770 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (1 retries left). 2026-01-06 01:00:59.473824 | orchestrator | failed: [testbed-node-0] (item=barbican (key-manager)) => {"action": "openstack.cloud.catalog_service", "ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Barbican Key Management Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9311"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9311"}], "name": "barbican", "type": "key-manager"}, "module_stderr": "Failed to discover available identity versions when contacting https://api-int.testbed.osism.xyz:5000. 
Attempting to parse version from URL.\nTraceback (most recent call last):\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 136, in _do_create_plugin\n disc = self.get_discovery(\n ^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 703, in get_discovery\n return discover.get_discovery(\n ^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 1742, in get_discovery\n disc = Discover(session, url, authenticated=authenticated)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 585, in __init__\n self._data = get_version_data(\n ^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 114, in get_version_data\n resp = session.get(url, headers=headers, authenticated=authenticated)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1320, in get\n return self.request(url, 'GET', **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1118, in request\n raise exceptions.from_response(resp, method, url)\nkeystoneauth1.exceptions.http.ServiceUnavailable: Service Unavailable (HTTP 503)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/tmp/ansible-tmp-1767661254.7447593-3373-213731077079813/AnsiballZ_catalog_service.py\", line 107, in \n _ansiballz_main()\n File \"/tmp/ansible-tmp-1767661254.7447593-3373-213731077079813/AnsiballZ_catalog_service.py\", line 99, in _ansiballz_main\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n File \"/tmp/ansible-tmp-1767661254.7447593-3373-213731077079813/AnsiballZ_catalog_service.py\", line 47, in 
invoke_module\n runpy.run_module(mod_name='ansible_collections.openstack.cloud.plugins.modules.catalog_service', init_globals=dict(_module_fqn='ansible_collections.openstack.cloud.plugins.modules.catalog_service', _modlib_path=modlib_path),\n File \"\", line 226, in run_module\n File \"\", line 98, in _run_module_code\n File \"\", line 88, in _run_code\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_2p_9rkfu/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 211, in \n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_2p_9rkfu/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 207, in main\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_2p_9rkfu/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/module_utils/openstack.py\", line 417, in __call__\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_2p_9rkfu/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 113, in run\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_2p_9rkfu/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 175, in _find\n File \"/opt/ansible/lib/python3.12/site-packages/openstack/service_description.py\", line 91, in __get__\n proxy = self._make_proxy(instance)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/openstack/service_description.py\", line 289, in _make_proxy\n found_version = temp_adapter.get_api_major_version()\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/adapter.py\", line 403, in get_api_major_version\n return self.session.get_api_major_version(auth or self.auth, **kwargs)\n 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1478, in get_api_major_version\n return auth.get_api_major_version(self, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 573, in get_api_major_version\n data = get_endpoint_data(discover_versions=discover_versions)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 296, in get_endpoint_data\n service_catalog = self.get_access(session).service_catalog\n ^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 139, in get_access\n self.auth_ref = self.get_auth_ref(session)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 221, in get_auth_ref\n plugin = self._do_create_plugin(session)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 163, in _do_create_plugin\n raise exceptions.DiscoveryFailure(\nkeystoneauth1.exceptions.discovery.DiscoveryFailure: Could not find versioned identity endpoints when attempting to authenticate. Please check that your auth_url is correct. 
Service Unavailable (HTTP 503)\n", "module_stdout": "", "msg": "MODULE FAILURE: No start of json char found\nSee stdout/stderr for the exact error", "rc": 1} 2026-01-06 01:00:59.473858 | orchestrator | 2026-01-06 01:00:59.473869 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 01:00:59.473881 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0 2026-01-06 01:00:59.473894 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 01:00:59.473913 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 01:00:59.473925 | orchestrator | 2026-01-06 01:00:59.473936 | orchestrator | 2026-01-06 01:00:59.473947 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 01:00:59.473958 | orchestrator | Tuesday 06 January 2026 01:00:56 +0000 (0:01:05.035) 0:01:06.882 ******* 2026-01-06 01:00:59.473970 | orchestrator | =============================================================================== 2026-01-06 01:00:59.473980 | orchestrator | service-ks-register : barbican | Creating/deleting services ------------ 65.04s 2026-01-06 01:00:59.473991 | orchestrator | barbican : include_tasks ------------------------------------------------ 0.61s 2026-01-06 01:00:59.474002 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.53s 2026-01-06 01:00:59.474014 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.40s 2026-01-06 01:00:59.476430 | orchestrator | 2026-01-06 01:00:59 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:00:59.480739 | orchestrator | 2026-01-06 01:00:59 | INFO  | Task 94e547ca-7edb-411a-8799-c6e8d7bfdfb9 is in state STARTED 2026-01-06 01:00:59.481321 | orchestrator | 2026-01-06 
01:00:59 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state STARTED 2026-01-06 01:00:59.482249 | orchestrator | 2026-01-06 01:00:59 | INFO  | Task 298dec91-5199-45fa-8812-29354b4b0d1f is in state STARTED 2026-01-06 01:00:59.482840 | orchestrator | 2026-01-06 01:00:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:01:02.520041 | orchestrator | 2026-01-06 01:01:02 | INFO  | Task f97f4b23-7614-481d-be0f-6747b044af29 is in state STARTED 2026-01-06 01:01:02.520617 | orchestrator | 2026-01-06 01:01:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:01:02.521727 | orchestrator | 2026-01-06 01:01:02 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:01:02.522762 | orchestrator | 2026-01-06 01:01:02 | INFO  | Task 94e547ca-7edb-411a-8799-c6e8d7bfdfb9 is in state STARTED 2026-01-06 01:01:02.523889 | orchestrator | 2026-01-06 01:01:02.523926 | orchestrator | 2026-01-06 01:01:02 | INFO  | Task 4a38bb35-07c5-4299-ab53-8f226c4d2a3b is in state SUCCESS 2026-01-06 01:01:02.524508 | orchestrator | 2026-01-06 01:01:02.524527 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-01-06 01:01:02.524537 | orchestrator | 2026-01-06 01:01:02.524544 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-01-06 01:01:02.524552 | orchestrator | Tuesday 06 January 2026 00:59:49 +0000 (0:00:00.356) 0:00:00.356 ******* 2026-01-06 01:01:02.524559 | orchestrator | ok: [testbed-node-0] 2026-01-06 01:01:02.524568 | orchestrator | ok: [testbed-node-1] 2026-01-06 01:01:02.524575 | orchestrator | ok: [testbed-node-2] 2026-01-06 01:01:02.524582 | orchestrator | ok: [testbed-node-3] 2026-01-06 01:01:02.524589 | orchestrator | ok: [testbed-node-4] 2026-01-06 01:01:02.524596 | orchestrator | ok: [testbed-node-5] 2026-01-06 01:01:02.524602 | orchestrator | 2026-01-06 01:01:02.524609 | orchestrator | TASK [Group 
hosts based on enabled services] *********************************** 2026-01-06 01:01:02.524642 | orchestrator | Tuesday 06 January 2026 00:59:50 +0000 (0:00:00.793) 0:00:01.150 ******* 2026-01-06 01:01:02.524649 | orchestrator | ok: [testbed-node-0] => (item=enable_neutron_True) 2026-01-06 01:01:02.524656 | orchestrator | ok: [testbed-node-1] => (item=enable_neutron_True) 2026-01-06 01:01:02.524663 | orchestrator | ok: [testbed-node-2] => (item=enable_neutron_True) 2026-01-06 01:01:02.524670 | orchestrator | ok: [testbed-node-3] => (item=enable_neutron_True) 2026-01-06 01:01:02.524677 | orchestrator | ok: [testbed-node-4] => (item=enable_neutron_True) 2026-01-06 01:01:02.524683 | orchestrator | ok: [testbed-node-5] => (item=enable_neutron_True) 2026-01-06 01:01:02.524690 | orchestrator | 2026-01-06 01:01:02.524697 | orchestrator | PLAY [Apply role neutron] ****************************************************** 2026-01-06 01:01:02.524703 | orchestrator | 2026-01-06 01:01:02.524710 | orchestrator | TASK [neutron : include_tasks] ************************************************* 2026-01-06 01:01:02.524717 | orchestrator | Tuesday 06 January 2026 00:59:50 +0000 (0:00:00.646) 0:00:01.796 ******* 2026-01-06 01:01:02.524728 | orchestrator | included: /ansible/roles/neutron/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 01:01:02.524741 | orchestrator | 2026-01-06 01:01:02.524752 | orchestrator | TASK [neutron : Get container facts] ******************************************* 2026-01-06 01:01:02.524764 | orchestrator | Tuesday 06 January 2026 00:59:52 +0000 (0:00:01.332) 0:00:03.129 ******* 2026-01-06 01:01:02.524775 | orchestrator | ok: [testbed-node-1] 2026-01-06 01:01:02.524787 | orchestrator | ok: [testbed-node-0] 2026-01-06 01:01:02.524800 | orchestrator | ok: [testbed-node-2] 2026-01-06 01:01:02.524812 | orchestrator | ok: [testbed-node-3] 2026-01-06 01:01:02.524824 | orchestrator | ok: 
[testbed-node-4] 2026-01-06 01:01:02.524836 | orchestrator | ok: [testbed-node-5] 2026-01-06 01:01:02.524849 | orchestrator | 2026-01-06 01:01:02.524857 | orchestrator | TASK [neutron : Get container volume facts] ************************************ 2026-01-06 01:01:02.524864 | orchestrator | Tuesday 06 January 2026 00:59:53 +0000 (0:00:01.423) 0:00:04.552 ******* 2026-01-06 01:01:02.524871 | orchestrator | ok: [testbed-node-0] 2026-01-06 01:01:02.524877 | orchestrator | ok: [testbed-node-1] 2026-01-06 01:01:02.524885 | orchestrator | ok: [testbed-node-2] 2026-01-06 01:01:02.524896 | orchestrator | ok: [testbed-node-3] 2026-01-06 01:01:02.524906 | orchestrator | ok: [testbed-node-4] 2026-01-06 01:01:02.524917 | orchestrator | ok: [testbed-node-5] 2026-01-06 01:01:02.524927 | orchestrator | 2026-01-06 01:01:02.524939 | orchestrator | TASK [neutron : Check for ML2/OVN presence] ************************************ 2026-01-06 01:01:02.524950 | orchestrator | Tuesday 06 January 2026 00:59:54 +0000 (0:00:01.128) 0:00:05.680 ******* 2026-01-06 01:01:02.524981 | orchestrator | ok: [testbed-node-0] => { 2026-01-06 01:01:02.524993 | orchestrator |  "changed": false, 2026-01-06 01:01:02.525005 | orchestrator |  "msg": "All assertions passed" 2026-01-06 01:01:02.525017 | orchestrator | } 2026-01-06 01:01:02.525025 | orchestrator | ok: [testbed-node-1] => { 2026-01-06 01:01:02.525031 | orchestrator |  "changed": false, 2026-01-06 01:01:02.525038 | orchestrator |  "msg": "All assertions passed" 2026-01-06 01:01:02.525045 | orchestrator | } 2026-01-06 01:01:02.525052 | orchestrator | ok: [testbed-node-2] => { 2026-01-06 01:01:02.525059 | orchestrator |  "changed": false, 2026-01-06 01:01:02.525065 | orchestrator |  "msg": "All assertions passed" 2026-01-06 01:01:02.525072 | orchestrator | } 2026-01-06 01:01:02.525079 | orchestrator | ok: [testbed-node-3] => { 2026-01-06 01:01:02.525085 | orchestrator |  "changed": false, 2026-01-06 01:01:02.525092 | orchestrator |  "msg": "All 
assertions passed" 2026-01-06 01:01:02.525099 | orchestrator | } 2026-01-06 01:01:02.525105 | orchestrator | ok: [testbed-node-4] => { 2026-01-06 01:01:02.525112 | orchestrator |  "changed": false, 2026-01-06 01:01:02.525119 | orchestrator |  "msg": "All assertions passed" 2026-01-06 01:01:02.525125 | orchestrator | } 2026-01-06 01:01:02.525133 | orchestrator | ok: [testbed-node-5] => { 2026-01-06 01:01:02.525141 | orchestrator |  "changed": false, 2026-01-06 01:01:02.525157 | orchestrator |  "msg": "All assertions passed" 2026-01-06 01:01:02.525169 | orchestrator | } 2026-01-06 01:01:02.525180 | orchestrator | 2026-01-06 01:01:02.525192 | orchestrator | TASK [neutron : Check for ML2/OVS presence] ************************************ 2026-01-06 01:01:02.525204 | orchestrator | Tuesday 06 January 2026 00:59:55 +0000 (0:00:00.848) 0:00:06.528 ******* 2026-01-06 01:01:02.525215 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:01:02.525227 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:01:02.525238 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:01:02.525250 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:01:02.525262 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:01:02.525295 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:01:02.525307 | orchestrator | 2026-01-06 01:01:02.525318 | orchestrator | TASK [service-ks-register : neutron | Creating/deleting services] ************** 2026-01-06 01:01:02.525330 | orchestrator | Tuesday 06 January 2026 00:59:56 +0000 (0:00:00.650) 0:00:07.178 ******* 2026-01-06 01:01:02.525342 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (5 retries left). 2026-01-06 01:01:02.525356 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (4 retries left). 2026-01-06 01:01:02.525368 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (3 retries left). 
2026-01-06 01:01:02.525380 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (2 retries left). 2026-01-06 01:01:02.525393 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (1 retries left). 2026-01-06 01:01:02.525445 | orchestrator | failed: [testbed-node-0] (item=neutron (network)) => {"action": "openstack.cloud.catalog_service", "ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Openstack Networking", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9696"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9696"}], "name": "neutron", "type": "network"}, "module_stderr": "Failed to discover available identity versions when contacting https://api-int.testbed.osism.xyz:5000. Attempting to parse version from URL.\nTraceback (most recent call last):\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 136, in _do_create_plugin\n disc = self.get_discovery(\n ^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 703, in get_discovery\n return discover.get_discovery(\n ^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 1742, in get_discovery\n disc = Discover(session, url, authenticated=authenticated)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 585, in __init__\n self._data = get_version_data(\n ^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 114, in get_version_data\n resp = session.get(url, headers=headers, authenticated=authenticated)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1320, in get\n return 
self.request(url, 'GET', **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1118, in request\n raise exceptions.from_response(resp, method, url)\nkeystoneauth1.exceptions.http.ServiceUnavailable: Service Unavailable (HTTP 503)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/tmp/ansible-tmp-1767661259.4892554-3418-81118532323568/AnsiballZ_catalog_service.py\", line 107, in \n _ansiballz_main()\n File \"/tmp/ansible-tmp-1767661259.4892554-3418-81118532323568/AnsiballZ_catalog_service.py\", line 99, in _ansiballz_main\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n File \"/tmp/ansible-tmp-1767661259.4892554-3418-81118532323568/AnsiballZ_catalog_service.py\", line 47, in invoke_module\n runpy.run_module(mod_name='ansible_collections.openstack.cloud.plugins.modules.catalog_service', init_globals=dict(_module_fqn='ansible_collections.openstack.cloud.plugins.modules.catalog_service', _modlib_path=modlib_path),\n File \"\", line 226, in run_module\n File \"\", line 98, in _run_module_code\n File \"\", line 88, in _run_code\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_oeoytfzd/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 211, in \n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_oeoytfzd/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 207, in main\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_oeoytfzd/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/module_utils/openstack.py\", line 417, in __call__\n File 
\"/tmp/ansible_openstack.cloud.catalog_service_payload_oeoytfzd/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 113, in run\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_oeoytfzd/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 175, in _find\n File \"/opt/ansible/lib/python3.12/site-packages/openstack/service_description.py\", line 91, in __get__\n proxy = self._make_proxy(instance)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/openstack/service_description.py\", line 289, in _make_proxy\n found_version = temp_adapter.get_api_major_version()\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/adapter.py\", line 403, in get_api_major_version\n return self.session.get_api_major_version(auth or self.auth, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1478, in get_api_major_version\n return auth.get_api_major_version(self, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 573, in get_api_major_version\n data = get_endpoint_data(discover_versions=discover_versions)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 296, in get_endpoint_data\n service_catalog = self.get_access(session).service_catalog\n ^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 139, in get_access\n self.auth_ref = self.get_auth_ref(session)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 221, 
in get_auth_ref\n plugin = self._do_create_plugin(session)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 163, in _do_create_plugin\n raise exceptions.DiscoveryFailure(\nkeystoneauth1.exceptions.discovery.DiscoveryFailure: Could not find versioned identity endpoints when attempting to authenticate. Please check that your auth_url is correct. Service Unavailable (HTTP 503)\n", "module_stdout": "", "msg": "MODULE FAILURE: No start of json char found\nSee stdout/stderr for the exact error", "rc": 1} 2026-01-06 01:01:02.525467 | orchestrator | 2026-01-06 01:01:02.525476 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 01:01:02.525490 | orchestrator | testbed-node-0 : ok=6  changed=0 unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-01-06 01:01:02.525590 | orchestrator | testbed-node-1 : ok=6  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 01:01:02.525601 | orchestrator | testbed-node-2 : ok=6  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 01:01:02.525608 | orchestrator | testbed-node-3 : ok=6  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 01:01:02.525615 | orchestrator | testbed-node-4 : ok=6  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 01:01:02.525622 | orchestrator | testbed-node-5 : ok=6  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-01-06 01:01:02.525628 | orchestrator | 2026-01-06 01:01:02.525635 | orchestrator | 2026-01-06 01:01:02.525642 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 01:01:02.525649 | orchestrator | Tuesday 06 January 2026 01:01:00 +0000 (0:01:04.393) 0:01:11.572 ******* 2026-01-06 01:01:02.525656 | orchestrator | =============================================================================== 
2026-01-06 01:01:02.525662 | orchestrator | service-ks-register : neutron | Creating/deleting services ------------- 64.39s 2026-01-06 01:01:02.525669 | orchestrator | neutron : Get container facts ------------------------------------------- 1.42s 2026-01-06 01:01:02.525676 | orchestrator | neutron : include_tasks ------------------------------------------------- 1.33s 2026-01-06 01:01:02.525682 | orchestrator | neutron : Get container volume facts ------------------------------------ 1.13s 2026-01-06 01:01:02.525689 | orchestrator | neutron : Check for ML2/OVN presence ------------------------------------ 0.85s 2026-01-06 01:01:02.525695 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.79s 2026-01-06 01:01:02.525757 | orchestrator | neutron : Check for ML2/OVS presence ------------------------------------ 0.65s 2026-01-06 01:01:02.525768 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.65s 2026-01-06 01:01:02.525774 | orchestrator | 2026-01-06 01:01:02 | INFO  | Task 298dec91-5199-45fa-8812-29354b4b0d1f is in state STARTED 2026-01-06 01:01:02.525785 | orchestrator | 2026-01-06 01:01:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:01:05.569307 | orchestrator | 2026-01-06 01:01:05 | INFO  | Task f97f4b23-7614-481d-be0f-6747b044af29 is in state STARTED 2026-01-06 01:01:05.573641 | orchestrator | 2026-01-06 01:01:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:01:05.573907 | orchestrator | 2026-01-06 01:01:05 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:01:05.575967 | orchestrator | 2026-01-06 01:01:05 | INFO  | Task 94e547ca-7edb-411a-8799-c6e8d7bfdfb9 is in state STARTED 2026-01-06 01:01:05.577514 | orchestrator | 2026-01-06 01:01:05 | INFO  | Task 298dec91-5199-45fa-8812-29354b4b0d1f is in state STARTED 2026-01-06 01:01:05.577559 | orchestrator | 2026-01-06 01:01:05 | INFO  | Wait 
1 second(s) until the next check 2026-01-06 01:02:09.783337 | orchestrator | 2026-01-06 01:02:09 | INFO  | Task f97f4b23-7614-481d-be0f-6747b044af29 is in state STARTED 2026-01-06 01:02:09.787590 | orchestrator | 2026-01-06 01:02:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:02:09.790837 | orchestrator | 2026-01-06 01:02:09 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:02:09.793275 | orchestrator | 2026-01-06 01:02:09 | INFO  | Task 94e547ca-7edb-411a-8799-c6e8d7bfdfb9 is in state SUCCESS 2026-01-06 01:02:09.793900 | orchestrator | 2026-01-06 01:02:09.793953 | orchestrator | 2026-01-06 01:02:09.793970 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-01-06 01:02:09.793986 | orchestrator | 2026-01-06 01:02:09.794000 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-01-06 01:02:09.794014 | orchestrator | Tuesday 06 January 2026 01:01:00 +0000 (0:00:00.264) 0:00:00.264 ******* 2026-01-06 01:02:09.794094 | orchestrator | ok: [testbed-node-0] 2026-01-06 01:02:09.794109 | orchestrator | ok: [testbed-node-1] 2026-01-06 01:02:09.794121 | orchestrator | ok: [testbed-node-2] 2026-01-06 01:02:09.794134 | orchestrator | 2026-01-06 01:02:09.794180 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-01-06 01:02:09.794312 | orchestrator | Tuesday 06 January 2026 01:01:01 +0000 (0:00:00.360) 0:00:00.624 ******* 2026-01-06 01:02:09.794329 | orchestrator | ok: [testbed-node-0] => (item=enable_magnum_True) 2026-01-06 01:02:09.794343 | orchestrator | ok: [testbed-node-1] => (item=enable_magnum_True) 2026-01-06 01:02:09.794356 | orchestrator | ok: [testbed-node-2] => (item=enable_magnum_True) 2026-01-06 01:02:09.794370 | orchestrator | 2026-01-06 01:02:09.794384 | orchestrator | PLAY [Apply role magnum] 
******************************************************* 2026-01-06 01:02:09.794397 | orchestrator | 2026-01-06 01:02:09.794411 | orchestrator | TASK [magnum : include_tasks] ************************************************** 2026-01-06 01:02:09.794423 | orchestrator | Tuesday 06 January 2026 01:01:01 +0000 (0:00:00.510) 0:00:01.134 ******* 2026-01-06 01:02:09.794432 | orchestrator | included: /ansible/roles/magnum/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 01:02:09.794442 | orchestrator | 2026-01-06 01:02:09.794450 | orchestrator | TASK [service-ks-register : magnum | Creating/deleting services] *************** 2026-01-06 01:02:09.794459 | orchestrator | Tuesday 06 January 2026 01:01:02 +0000 (0:00:00.653) 0:00:01.788 ******* 2026-01-06 01:02:09.794469 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (5 retries left). 2026-01-06 01:02:09.794480 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (4 retries left). 2026-01-06 01:02:09.794489 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (3 retries left). 2026-01-06 01:02:09.794499 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (2 retries left). 2026-01-06 01:02:09.794509 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (1 retries left). 
2026-01-06 01:02:09.794556 | orchestrator | failed: [testbed-node-0] (item=magnum (container-infra)) => {"action": "openstack.cloud.catalog_service", "ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Container Infrastructure Management Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9511/v1"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9511/v1"}], "name": "magnum", "type": "container-infra"}, "module_stderr": "Failed to discover available identity versions when contacting https://api-int.testbed.osism.xyz:5000. Attempting to parse version from URL.\nTraceback (most recent call last):\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 136, in _do_create_plugin\n disc = self.get_discovery(\n ^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 703, in get_discovery\n return discover.get_discovery(\n ^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 1742, in get_discovery\n disc = Discover(session, url, authenticated=authenticated)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 585, in __init__\n self._data = get_version_data(\n ^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 114, in get_version_data\n resp = session.get(url, headers=headers, authenticated=authenticated)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1320, in get\n return self.request(url, 'GET', **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1118, in request\n raise exceptions.from_response(resp, method, 
url)\nkeystoneauth1.exceptions.http.ServiceUnavailable: Service Unavailable (HTTP 503)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/tmp/ansible-tmp-1767661325.1519895-3822-47987241435302/AnsiballZ_catalog_service.py\", line 107, in \n _ansiballz_main()\n File \"/tmp/ansible-tmp-1767661325.1519895-3822-47987241435302/AnsiballZ_catalog_service.py\", line 99, in _ansiballz_main\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n File \"/tmp/ansible-tmp-1767661325.1519895-3822-47987241435302/AnsiballZ_catalog_service.py\", line 47, in invoke_module\n runpy.run_module(mod_name='ansible_collections.openstack.cloud.plugins.modules.catalog_service', init_globals=dict(_module_fqn='ansible_collections.openstack.cloud.plugins.modules.catalog_service', _modlib_path=modlib_path),\n File \"\", line 226, in run_module\n File \"\", line 98, in _run_module_code\n File \"\", line 88, in _run_code\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_utmser5b/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 211, in \n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_utmser5b/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 207, in main\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_utmser5b/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/module_utils/openstack.py\", line 417, in __call__\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_utmser5b/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 113, in run\n File 
\"/tmp/ansible_openstack.cloud.catalog_service_payload_utmser5b/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 175, in _find\n File \"/opt/ansible/lib/python3.12/site-packages/openstack/service_description.py\", line 91, in __get__\n proxy = self._make_proxy(instance)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/openstack/service_description.py\", line 289, in _make_proxy\n found_version = temp_adapter.get_api_major_version()\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/adapter.py\", line 403, in get_api_major_version\n return self.session.get_api_major_version(auth or self.auth, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1478, in get_api_major_version\n return auth.get_api_major_version(self, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 573, in get_api_major_version\n data = get_endpoint_data(discover_versions=discover_versions)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 296, in get_endpoint_data\n service_catalog = self.get_access(session).service_catalog\n ^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 139, in get_access\n self.auth_ref = self.get_auth_ref(session)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 221, in get_auth_ref\n plugin = self._do_create_plugin(session)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 163, in 
_do_create_plugin\n raise exceptions.DiscoveryFailure(\nkeystoneauth1.exceptions.discovery.DiscoveryFailure: Could not find versioned identity endpoints when attempting to authenticate. Please check that your auth_url is correct. Service Unavailable (HTTP 503)\n", "module_stdout": "", "msg": "MODULE FAILURE: No start of json char found\nSee stdout/stderr for the exact error", "rc": 1} 2026-01-06 01:02:09.794617 | orchestrator | 2026-01-06 01:02:09.794627 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 01:02:09.794635 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0 2026-01-06 01:02:09.794645 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 01:02:09.794655 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 01:02:09.794663 | orchestrator | 2026-01-06 01:02:09.794671 | orchestrator | 2026-01-06 01:02:09.794679 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 01:02:09.794687 | orchestrator | Tuesday 06 January 2026 01:02:06 +0000 (0:01:04.070) 0:01:05.858 ******* 2026-01-06 01:02:09.794695 | orchestrator | =============================================================================== 2026-01-06 01:02:09.794703 | orchestrator | service-ks-register : magnum | Creating/deleting services -------------- 64.07s 2026-01-06 01:02:09.794721 | orchestrator | magnum : include_tasks -------------------------------------------------- 0.65s 2026-01-06 01:02:09.794729 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.51s 2026-01-06 01:02:09.794737 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.36s 2026-01-06 01:02:09.795737 | orchestrator | 2026-01-06 01:02:09 | INFO  | Task 
928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:02:09.798264 | orchestrator | 2026-01-06 01:02:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:02:09.800420 | orchestrator | 2026-01-06 01:02:09 | INFO  | Task 298dec91-5199-45fa-8812-29354b4b0d1f is in state SUCCESS 2026-01-06 01:02:09.800814 | orchestrator | 2026-01-06 01:02:09.800840 | orchestrator | 2026-01-06 01:02:09.800848 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-01-06 01:02:09.800856 | orchestrator | 2026-01-06 01:02:09.800885 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-01-06 01:02:09.800892 | orchestrator | Tuesday 06 January 2026 01:01:00 +0000 (0:00:00.267) 0:00:00.267 ******* 2026-01-06 01:02:09.800899 | orchestrator | ok: [testbed-node-0] 2026-01-06 01:02:09.800907 | orchestrator | ok: [testbed-node-1] 2026-01-06 01:02:09.800913 | orchestrator | ok: [testbed-node-2] 2026-01-06 01:02:09.800920 | orchestrator | 2026-01-06 01:02:09.800926 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-01-06 01:02:09.800933 | orchestrator | Tuesday 06 January 2026 01:01:01 +0000 (0:00:00.377) 0:00:00.645 ******* 2026-01-06 01:02:09.800940 | orchestrator | ok: [testbed-node-0] => (item=enable_placement_True) 2026-01-06 01:02:09.800949 | orchestrator | ok: [testbed-node-1] => (item=enable_placement_True) 2026-01-06 01:02:09.800960 | orchestrator | ok: [testbed-node-2] => (item=enable_placement_True) 2026-01-06 01:02:09.800967 | orchestrator | 2026-01-06 01:02:09.800973 | orchestrator | PLAY [Apply role placement] **************************************************** 2026-01-06 01:02:09.800979 | orchestrator | 2026-01-06 01:02:09.800986 | orchestrator | TASK [placement : include_tasks] *********************************************** 2026-01-06 01:02:09.800992 | orchestrator | Tuesday 06 
January 2026 01:01:01 +0000 (0:00:00.581) 0:00:01.226 ******* 2026-01-06 01:02:09.800998 | orchestrator | included: /ansible/roles/placement/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-01-06 01:02:09.801006 | orchestrator | 2026-01-06 01:02:09.801012 | orchestrator | TASK [service-ks-register : placement | Creating/deleting services] ************ 2026-01-06 01:02:09.801018 | orchestrator | Tuesday 06 January 2026 01:01:02 +0000 (0:00:00.585) 0:00:01.811 ******* 2026-01-06 01:02:09.801025 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (5 retries left). 2026-01-06 01:02:09.801031 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (4 retries left). 2026-01-06 01:02:09.801037 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (3 retries left). 2026-01-06 01:02:09.801044 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (2 retries left). 2026-01-06 01:02:09.801050 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (1 retries left). 2026-01-06 01:02:09.801086 | orchestrator | failed: [testbed-node-0] (item=placement (placement)) => {"action": "openstack.cloud.catalog_service", "ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Placement Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:8780"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:8780"}], "name": "placement", "type": "placement"}, "module_stderr": "Failed to discover available identity versions when contacting https://api-int.testbed.osism.xyz:5000. 
Attempting to parse version from URL.\nTraceback (most recent call last):\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 136, in _do_create_plugin\n disc = self.get_discovery(\n ^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 703, in get_discovery\n return discover.get_discovery(\n ^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 1742, in get_discovery\n disc = Discover(session, url, authenticated=authenticated)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 585, in __init__\n self._data = get_version_data(\n ^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/discover.py\", line 114, in get_version_data\n resp = session.get(url, headers=headers, authenticated=authenticated)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1320, in get\n return self.request(url, 'GET', **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1118, in request\n raise exceptions.from_response(resp, method, url)\nkeystoneauth1.exceptions.http.ServiceUnavailable: Service Unavailable (HTTP 503)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/tmp/ansible-tmp-1767661325.2090936-3833-61181664793335/AnsiballZ_catalog_service.py\", line 107, in \n _ansiballz_main()\n File \"/tmp/ansible-tmp-1767661325.2090936-3833-61181664793335/AnsiballZ_catalog_service.py\", line 99, in _ansiballz_main\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n File \"/tmp/ansible-tmp-1767661325.2090936-3833-61181664793335/AnsiballZ_catalog_service.py\", line 47, in 
invoke_module\n runpy.run_module(mod_name='ansible_collections.openstack.cloud.plugins.modules.catalog_service', init_globals=dict(_module_fqn='ansible_collections.openstack.cloud.plugins.modules.catalog_service', _modlib_path=modlib_path),\n File \"\", line 226, in run_module\n File \"\", line 98, in _run_module_code\n File \"\", line 88, in _run_code\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_9uwcjezm/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 211, in \n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_9uwcjezm/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 207, in main\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_9uwcjezm/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/module_utils/openstack.py\", line 417, in __call__\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_9uwcjezm/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 113, in run\n File \"/tmp/ansible_openstack.cloud.catalog_service_payload_9uwcjezm/ansible_openstack.cloud.catalog_service_payload.zip/ansible_collections/openstack/cloud/plugins/modules/catalog_service.py\", line 175, in _find\n File \"/opt/ansible/lib/python3.12/site-packages/openstack/service_description.py\", line 91, in __get__\n proxy = self._make_proxy(instance)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/openstack/service_description.py\", line 289, in _make_proxy\n found_version = temp_adapter.get_api_major_version()\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/adapter.py\", line 403, in get_api_major_version\n return self.session.get_api_major_version(auth or self.auth, **kwargs)\n 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/session.py\", line 1478, in get_api_major_version\n return auth.get_api_major_version(self, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 573, in get_api_major_version\n data = get_endpoint_data(discover_versions=discover_versions)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 296, in get_endpoint_data\n service_catalog = self.get_access(session).service_catalog\n ^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/base.py\", line 139, in get_access\n self.auth_ref = self.get_auth_ref(session)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 221, in get_auth_ref\n plugin = self._do_create_plugin(session)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/opt/ansible/lib/python3.12/site-packages/keystoneauth1/identity/generic/base.py\", line 163, in _do_create_plugin\n raise exceptions.DiscoveryFailure(\nkeystoneauth1.exceptions.discovery.DiscoveryFailure: Could not find versioned identity endpoints when attempting to authenticate. Please check that your auth_url is correct. 
Service Unavailable (HTTP 503)\n", "module_stdout": "", "msg": "MODULE FAILURE: No start of json char found\nSee stdout/stderr for the exact error", "rc": 1} 2026-01-06 01:02:09.801120 | orchestrator | 2026-01-06 01:02:09.801138 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 01:02:09.801149 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0 2026-01-06 01:02:09.801160 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 01:02:09.801171 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-01-06 01:02:09.801182 | orchestrator | 2026-01-06 01:02:09.801192 | orchestrator | 2026-01-06 01:02:09.801253 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 01:02:09.801267 | orchestrator | Tuesday 06 January 2026 01:02:06 +0000 (0:01:04.204) 0:01:06.016 ******* 2026-01-06 01:02:09.801278 | orchestrator | =============================================================================== 2026-01-06 01:02:09.801287 | orchestrator | service-ks-register : placement | Creating/deleting services ----------- 64.20s 2026-01-06 01:02:09.801293 | orchestrator | placement : include_tasks ----------------------------------------------- 0.59s 2026-01-06 01:02:09.801300 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.58s 2026-01-06 01:02:09.801306 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.38s 2026-01-06 01:02:09.801312 | orchestrator | 2026-01-06 01:02:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:02:12.837769 | orchestrator | 2026-01-06 01:02:12 | INFO  | Task f97f4b23-7614-481d-be0f-6747b044af29 is in state STARTED 2026-01-06 01:02:12.838396 | orchestrator | 2026-01-06 01:02:12 | INFO  | Task 
de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:02:12.839354 | orchestrator | 2026-01-06 01:02:12 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:02:12.840900 | orchestrator | 2026-01-06 01:02:12 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:02:12.841698 | orchestrator | 2026-01-06 01:02:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:02:12.841744 | orchestrator | 2026-01-06 01:02:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:02:25.308943 | orchestrator | 2026-01-06 01:02:25 | INFO  | Task f97f4b23-7614-481d-be0f-6747b044af29 is in state SUCCESS 2026-01-06 01:02:25.311956 | orchestrator | 2026-01-06 01:02:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:02:25.312608 | orchestrator | 2026-01-06 01:02:25 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:02:25.316676 | orchestrator | 2026-01-06 01:02:25 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:02:25.320049 | orchestrator | 2026-01-06 01:02:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:02:25.320364 | orchestrator | 2026-01-06 01:02:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:02:28.356799 | orchestrator | 2026-01-06 01:02:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:02:28.357281 | orchestrator | 2026-01-06 01:02:28 | INFO  | Task
c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:02:28.357907 | orchestrator | 2026-01-06 01:02:28 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:02:28.358603 | orchestrator | 2026-01-06 01:02:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:02:28.360342 | orchestrator | 2026-01-06 01:02:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:03:32.407467 | orchestrator | 2026-01-06 01:03:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:03:32.409043 | orchestrator | 2026-01-06 01:03:32 | INFO  | Task
c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:03:32.411964 | orchestrator | 2026-01-06 01:03:32 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:03:32.413632 | orchestrator | 2026-01-06 01:03:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:03:32.413698 | orchestrator | 2026-01-06 01:03:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:03:35.455351 | orchestrator | 2026-01-06 01:03:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:03:35.456646 | orchestrator | 2026-01-06 01:03:35 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:03:35.457822 | orchestrator | 2026-01-06 01:03:35 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:03:35.458988 | orchestrator | 2026-01-06 01:03:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:03:35.459028 | orchestrator | 2026-01-06 01:03:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:03:38.500740 | orchestrator | 2026-01-06 01:03:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:03:38.502172 | orchestrator | 2026-01-06 01:03:38 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:03:38.503928 | orchestrator | 2026-01-06 01:03:38 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:03:38.505792 | orchestrator | 2026-01-06 01:03:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:03:38.506065 | orchestrator | 2026-01-06 01:03:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:03:41.570220 | orchestrator | 2026-01-06 01:03:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:03:41.570328 | orchestrator | 2026-01-06 01:03:41 | INFO  | Task 
c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:03:41.570345 | orchestrator | 2026-01-06 01:03:41 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:03:41.570357 | orchestrator | 2026-01-06 01:03:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:03:41.570369 | orchestrator | 2026-01-06 01:03:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:03:44.605024 | orchestrator | 2026-01-06 01:03:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:03:44.605221 | orchestrator | 2026-01-06 01:03:44 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:03:44.606268 | orchestrator | 2026-01-06 01:03:44 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:03:44.608032 | orchestrator | 2026-01-06 01:03:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:03:44.608084 | orchestrator | 2026-01-06 01:03:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:03:47.648208 | orchestrator | 2026-01-06 01:03:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:03:47.649301 | orchestrator | 2026-01-06 01:03:47 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:03:47.650746 | orchestrator | 2026-01-06 01:03:47 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:03:47.652043 | orchestrator | 2026-01-06 01:03:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:03:47.652080 | orchestrator | 2026-01-06 01:03:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:03:50.693729 | orchestrator | 2026-01-06 01:03:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:03:50.694394 | orchestrator | 2026-01-06 01:03:50 | INFO  | Task 
c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:03:50.695809 | orchestrator | 2026-01-06 01:03:50 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:03:50.696958 | orchestrator | 2026-01-06 01:03:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:03:50.697039 | orchestrator | 2026-01-06 01:03:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:03:53.744806 | orchestrator | 2026-01-06 01:03:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:03:53.747221 | orchestrator | 2026-01-06 01:03:53 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:03:53.748868 | orchestrator | 2026-01-06 01:03:53 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:03:53.750654 | orchestrator | 2026-01-06 01:03:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:03:53.750698 | orchestrator | 2026-01-06 01:03:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:03:56.803248 | orchestrator | 2026-01-06 01:03:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:03:56.806541 | orchestrator | 2026-01-06 01:03:56 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:03:56.810572 | orchestrator | 2026-01-06 01:03:56 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:03:56.812627 | orchestrator | 2026-01-06 01:03:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:03:56.812680 | orchestrator | 2026-01-06 01:03:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:03:59.856780 | orchestrator | 2026-01-06 01:03:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:03:59.858203 | orchestrator | 2026-01-06 01:03:59 | INFO  | Task 
c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:03:59.860354 | orchestrator | 2026-01-06 01:03:59 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:03:59.862685 | orchestrator | 2026-01-06 01:03:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:03:59.862730 | orchestrator | 2026-01-06 01:03:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:02.913879 | orchestrator | 2026-01-06 01:04:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:02.915702 | orchestrator | 2026-01-06 01:04:02 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:02.917831 | orchestrator | 2026-01-06 01:04:02 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:02.920517 | orchestrator | 2026-01-06 01:04:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:02.920673 | orchestrator | 2026-01-06 01:04:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:05.968990 | orchestrator | 2026-01-06 01:04:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:05.969075 | orchestrator | 2026-01-06 01:04:05 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:05.969082 | orchestrator | 2026-01-06 01:04:05 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:05.970679 | orchestrator | 2026-01-06 01:04:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:05.970699 | orchestrator | 2026-01-06 01:04:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:09.017703 | orchestrator | 2026-01-06 01:04:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:09.018906 | orchestrator | 2026-01-06 01:04:09 | INFO  | Task 
c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:09.020274 | orchestrator | 2026-01-06 01:04:09 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:09.021383 | orchestrator | 2026-01-06 01:04:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:09.021455 | orchestrator | 2026-01-06 01:04:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:12.063182 | orchestrator | 2026-01-06 01:04:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:12.064223 | orchestrator | 2026-01-06 01:04:12 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:12.064932 | orchestrator | 2026-01-06 01:04:12 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:12.066399 | orchestrator | 2026-01-06 01:04:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:12.066524 | orchestrator | 2026-01-06 01:04:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:15.119689 | orchestrator | 2026-01-06 01:04:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:15.121637 | orchestrator | 2026-01-06 01:04:15 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:15.124016 | orchestrator | 2026-01-06 01:04:15 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:15.126335 | orchestrator | 2026-01-06 01:04:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:15.126394 | orchestrator | 2026-01-06 01:04:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:18.172336 | orchestrator | 2026-01-06 01:04:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:18.179382 | orchestrator | 2026-01-06 01:04:18 | INFO  | Task 
c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:18.180654 | orchestrator | 2026-01-06 01:04:18 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:18.182477 | orchestrator | 2026-01-06 01:04:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:18.182514 | orchestrator | 2026-01-06 01:04:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:21.234138 | orchestrator | 2026-01-06 01:04:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:21.235407 | orchestrator | 2026-01-06 01:04:21 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:21.237042 | orchestrator | 2026-01-06 01:04:21 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:21.238654 | orchestrator | 2026-01-06 01:04:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:21.238739 | orchestrator | 2026-01-06 01:04:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:24.286113 | orchestrator | 2026-01-06 01:04:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:24.288528 | orchestrator | 2026-01-06 01:04:24 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:24.293203 | orchestrator | 2026-01-06 01:04:24 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:24.295249 | orchestrator | 2026-01-06 01:04:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:24.295661 | orchestrator | 2026-01-06 01:04:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:27.347876 | orchestrator | 2026-01-06 01:04:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:27.349821 | orchestrator | 2026-01-06 01:04:27 | INFO  | Task 
c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:27.351726 | orchestrator | 2026-01-06 01:04:27 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:27.353560 | orchestrator | 2026-01-06 01:04:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:27.353609 | orchestrator | 2026-01-06 01:04:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:30.401381 | orchestrator | 2026-01-06 01:04:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:30.402753 | orchestrator | 2026-01-06 01:04:30 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:30.404396 | orchestrator | 2026-01-06 01:04:30 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:30.405536 | orchestrator | 2026-01-06 01:04:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:30.405582 | orchestrator | 2026-01-06 01:04:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:33.452126 | orchestrator | 2026-01-06 01:04:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:33.454956 | orchestrator | 2026-01-06 01:04:33 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:33.454993 | orchestrator | 2026-01-06 01:04:33 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:33.455227 | orchestrator | 2026-01-06 01:04:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:33.455655 | orchestrator | 2026-01-06 01:04:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:36.497804 | orchestrator | 2026-01-06 01:04:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:36.501031 | orchestrator | 2026-01-06 01:04:36 | INFO  | Task 
c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:36.504139 | orchestrator | 2026-01-06 01:04:36 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:36.506205 | orchestrator | 2026-01-06 01:04:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:36.506569 | orchestrator | 2026-01-06 01:04:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:39.551200 | orchestrator | 2026-01-06 01:04:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:39.551475 | orchestrator | 2026-01-06 01:04:39 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:39.553199 | orchestrator | 2026-01-06 01:04:39 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:39.554617 | orchestrator | 2026-01-06 01:04:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:39.555044 | orchestrator | 2026-01-06 01:04:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:42.605883 | orchestrator | 2026-01-06 01:04:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:42.607392 | orchestrator | 2026-01-06 01:04:42 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:42.609830 | orchestrator | 2026-01-06 01:04:42 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:42.612196 | orchestrator | 2026-01-06 01:04:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:42.612268 | orchestrator | 2026-01-06 01:04:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:45.668290 | orchestrator | 2026-01-06 01:04:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:45.668399 | orchestrator | 2026-01-06 01:04:45 | INFO  | Task 
c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:45.668415 | orchestrator | 2026-01-06 01:04:45 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:45.668426 | orchestrator | 2026-01-06 01:04:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:45.668438 | orchestrator | 2026-01-06 01:04:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:48.719115 | orchestrator | 2026-01-06 01:04:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:48.719337 | orchestrator | 2026-01-06 01:04:48 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:48.720909 | orchestrator | 2026-01-06 01:04:48 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:48.721871 | orchestrator | 2026-01-06 01:04:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:48.721900 | orchestrator | 2026-01-06 01:04:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:51.788177 | orchestrator | 2026-01-06 01:04:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:51.788270 | orchestrator | 2026-01-06 01:04:51 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:04:51.791479 | orchestrator | 2026-01-06 01:04:51 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED 2026-01-06 01:04:51.794139 | orchestrator | 2026-01-06 01:04:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:51.794201 | orchestrator | 2026-01-06 01:04:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:04:54.840880 | orchestrator | 2026-01-06 01:04:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:04:54.842620 | orchestrator | 2026-01-06 01:04:54 | INFO  | Task 
c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED
2026-01-06 01:04:54.844591 | orchestrator | 2026-01-06 01:04:54 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state STARTED
2026-01-06 01:04:54.845603 | orchestrator | 2026-01-06 01:04:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 01:04:54.845633 | orchestrator | 2026-01-06 01:04:54 | INFO  | Wait 1 second(s) until the next check
2026-01-06 01:04:57.894481 | orchestrator | 2026-01-06 01:04:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 01:04:57.896493 | orchestrator | 2026-01-06 01:04:57 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED
2026-01-06 01:04:57.902405 | orchestrator | 2026-01-06 01:04:57 | INFO  | Task 928d32b0-3f6a-4b86-970b-dfdcd10bbfe4 is in state SUCCESS
2026-01-06 01:04:57.903407 | orchestrator |
2026-01-06 01:04:57.903447 | orchestrator | [WARNING]: Collection community.general does not support Ansible version
2026-01-06 01:04:57.903459 | orchestrator | 2.16.14
2026-01-06 01:04:57.903470 | orchestrator |
2026-01-06 01:04:57.903480 | orchestrator | PLAY [Bootstraph ceph dashboard] ***********************************************
2026-01-06 01:04:57.903491 | orchestrator |
2026-01-06 01:04:57.903502 | orchestrator | TASK [Disable the ceph dashboard] **********************************************
2026-01-06 01:04:57.903521 | orchestrator | Tuesday 06 January 2026 01:00:59 +0000 (0:00:00.272) 0:00:00.273 *******
2026-01-06 01:04:57.903538 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.903555 | orchestrator |
2026-01-06 01:04:57.903571 | orchestrator | TASK [Set mgr/dashboard/ssl to false] ******************************************
2026-01-06 01:04:57.903587 | orchestrator | Tuesday 06 January 2026 01:01:00 +0000 (0:00:01.586) 0:00:01.859 *******
2026-01-06 01:04:57.903605 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.903622 | orchestrator |
2026-01-06 01:04:57.903639 | orchestrator | TASK [Set mgr/dashboard/server_port to 7000] ***********************************
2026-01-06 01:04:57.903656 | orchestrator | Tuesday 06 January 2026 01:01:01 +0000 (0:00:01.050) 0:00:02.910 *******
2026-01-06 01:04:57.903668 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.903677 | orchestrator |
2026-01-06 01:04:57.903687 | orchestrator | TASK [Set mgr/dashboard/server_addr to 0.0.0.0] ********************************
2026-01-06 01:04:57.903698 | orchestrator | Tuesday 06 January 2026 01:01:02 +0000 (0:00:01.013) 0:00:03.924 *******
2026-01-06 01:04:57.903708 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.903718 | orchestrator |
2026-01-06 01:04:57.903728 | orchestrator | TASK [Set mgr/dashboard/standby_behaviour to error] ****************************
2026-01-06 01:04:57.903737 | orchestrator | Tuesday 06 January 2026 01:01:03 +0000 (0:00:01.239) 0:00:05.163 *******
2026-01-06 01:04:57.903747 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.903757 | orchestrator |
2026-01-06 01:04:57.903767 | orchestrator | TASK [Set mgr/dashboard/standby_error_status_code to 404] **********************
2026-01-06 01:04:57.903776 | orchestrator | Tuesday 06 January 2026 01:01:04 +0000 (0:00:01.015) 0:00:06.178 *******
2026-01-06 01:04:57.903786 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.903796 | orchestrator |
2026-01-06 01:04:57.903806 | orchestrator | TASK [Enable the ceph dashboard] ***********************************************
2026-01-06 01:04:57.903815 | orchestrator | Tuesday 06 January 2026 01:01:05 +0000 (0:00:00.959) 0:00:07.138 *******
2026-01-06 01:04:57.903825 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.903835 | orchestrator |
2026-01-06 01:04:57.903845 | orchestrator | TASK [Write ceph_dashboard_password to temporary file] *************************
2026-01-06 01:04:57.903881 | orchestrator | Tuesday 06 January 2026 01:01:07 +0000 (0:00:01.141) 0:00:08.279 *******
2026-01-06 01:04:57.903892 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.903902 | orchestrator |
2026-01-06 01:04:57.903911 | orchestrator | TASK [Create admin user] *******************************************************
2026-01-06 01:04:57.903921 | orchestrator | Tuesday 06 January 2026 01:01:08 +0000 (0:00:01.083) 0:00:09.363 *******
2026-01-06 01:04:57.903931 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.903941 | orchestrator |
2026-01-06 01:04:57.903950 | orchestrator | TASK [Remove temporary file for ceph_dashboard_password] ***********************
2026-01-06 01:04:57.903960 | orchestrator | Tuesday 06 January 2026 01:01:57 +0000 (0:00:49.648) 0:00:59.011 *******
2026-01-06 01:04:57.903970 | orchestrator | skipping: [testbed-manager]
2026-01-06 01:04:57.903980 | orchestrator |
2026-01-06 01:04:57.903990 | orchestrator | PLAY [Restart ceph manager services] *******************************************
2026-01-06 01:04:57.904000 | orchestrator |
2026-01-06 01:04:57.904010 | orchestrator | TASK [Restart ceph manager service] ********************************************
2026-01-06 01:04:57.904021 | orchestrator | Tuesday 06 January 2026 01:01:57 +0000 (0:00:00.165) 0:00:59.177 *******
2026-01-06 01:04:57.904034 | orchestrator | changed: [testbed-node-0]
2026-01-06 01:04:57.904160 | orchestrator |
2026-01-06 01:04:57.904256 | orchestrator | PLAY [Restart ceph manager services] *******************************************
2026-01-06 01:04:57.904387 | orchestrator |
2026-01-06 01:04:57.904404 | orchestrator | TASK [Restart ceph manager service] ********************************************
2026-01-06 01:04:57.904416 | orchestrator | Tuesday 06 January 2026 01:01:59 +0000 (0:00:01.619) 0:01:00.796 *******
2026-01-06 01:04:57.904429 | orchestrator | changed: [testbed-node-1]
2026-01-06 01:04:57.904438 | orchestrator |
2026-01-06 01:04:57.904448 | orchestrator | PLAY [Restart ceph manager services] *******************************************
2026-01-06 01:04:57.904458 | orchestrator |
2026-01-06 01:04:57.904467 | orchestrator | TASK [Restart ceph manager service] ********************************************
2026-01-06 01:04:57.904477 | orchestrator | Tuesday 06 January 2026 01:02:10 +0000 (0:00:11.312) 0:01:12.108 *******
2026-01-06 01:04:57.904486 | orchestrator | changed: [testbed-node-2]
2026-01-06 01:04:57.904496 | orchestrator |
2026-01-06 01:04:57.904506 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 01:04:57.904517 | orchestrator | testbed-manager : ok=9  changed=9  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-01-06 01:04:57.904528 | orchestrator | testbed-node-0 : ok=1  changed=1  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 01:04:57.904538 | orchestrator | testbed-node-1 : ok=1  changed=1  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 01:04:57.904546 | orchestrator | testbed-node-2 : ok=1  changed=1  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-01-06 01:04:57.904554 | orchestrator |
2026-01-06 01:04:57.904562 | orchestrator |
2026-01-06 01:04:57.904570 | orchestrator |
2026-01-06 01:04:57.904578 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 01:04:57.904586 | orchestrator | Tuesday 06 January 2026 01:02:22 +0000 (0:00:11.297) 0:01:23.406 *******
2026-01-06 01:04:57.904594 | orchestrator | ===============================================================================
2026-01-06 01:04:57.904602 | orchestrator | Create admin user ------------------------------------------------------ 49.65s
2026-01-06 01:04:57.904624 | orchestrator | Restart ceph manager service ------------------------------------------- 24.23s
2026-01-06 01:04:57.904632 | orchestrator | Disable the ceph dashboard ---------------------------------------------- 1.59s
2026-01-06 01:04:57.904640 | orchestrator | Set mgr/dashboard/server_addr to 0.0.0.0 -------------------------------- 1.24s
2026-01-06 01:04:57.904648 | orchestrator | Enable the ceph dashboard ----------------------------------------------- 1.14s
2026-01-06 01:04:57.904656 | orchestrator | Write ceph_dashboard_password to temporary file ------------------------- 1.08s
2026-01-06 01:04:57.904673 | orchestrator | Set mgr/dashboard/ssl to false ------------------------------------------ 1.05s
2026-01-06 01:04:57.904681 | orchestrator | Set mgr/dashboard/standby_behaviour to error ---------------------------- 1.02s
2026-01-06 01:04:57.904689 | orchestrator | Set mgr/dashboard/server_port to 7000 ----------------------------------- 1.01s
2026-01-06 01:04:57.904719 | orchestrator | Set mgr/dashboard/standby_error_status_code to 404 ---------------------- 0.96s
2026-01-06 01:04:57.904728 | orchestrator | Remove temporary file for ceph_dashboard_password ----------------------- 0.17s
2026-01-06 01:04:57.904737 | orchestrator |
2026-01-06 01:04:57.904751 | orchestrator |
2026-01-06 01:04:57.904759 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-01-06 01:04:57.904767 | orchestrator |
2026-01-06 01:04:57.904775 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-01-06 01:04:57.904783 | orchestrator | Tuesday 06 January 2026 01:02:11 +0000 (0:00:00.274) 0:00:00.274 *******
2026-01-06 01:04:57.904791 | orchestrator | ok: [testbed-manager]
2026-01-06 01:04:57.904799 | orchestrator | ok: [testbed-node-0]
2026-01-06 01:04:57.904826 | orchestrator | ok: [testbed-node-1]
2026-01-06 01:04:57.904835 | orchestrator | ok: [testbed-node-2]
2026-01-06 01:04:57.904843 | orchestrator | ok: [testbed-node-3]
2026-01-06 01:04:57.904917 | orchestrator | ok: [testbed-node-4]
2026-01-06 01:04:57.904957 | orchestrator | ok: [testbed-node-5]
2026-01-06 01:04:57.904965 | orchestrator |
2026-01-06 01:04:57.904974 | 
orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-01-06 01:04:57.904988 | orchestrator | Tuesday 06 January 2026 01:02:12 +0000 (0:00:00.797) 0:00:01.072 *******
2026-01-06 01:04:57.904999 | orchestrator | ok: [testbed-manager] => (item=enable_prometheus_True)
2026-01-06 01:04:57.905012 | orchestrator | ok: [testbed-node-0] => (item=enable_prometheus_True)
2026-01-06 01:04:57.905020 | orchestrator | ok: [testbed-node-1] => (item=enable_prometheus_True)
2026-01-06 01:04:57.905028 | orchestrator | ok: [testbed-node-2] => (item=enable_prometheus_True)
2026-01-06 01:04:57.905161 | orchestrator | ok: [testbed-node-3] => (item=enable_prometheus_True)
2026-01-06 01:04:57.905171 | orchestrator | ok: [testbed-node-4] => (item=enable_prometheus_True)
2026-01-06 01:04:57.905179 | orchestrator | ok: [testbed-node-5] => (item=enable_prometheus_True)
2026-01-06 01:04:57.905194 | orchestrator |
2026-01-06 01:04:57.905228 | orchestrator | PLAY [Apply role prometheus] ***************************************************
2026-01-06 01:04:57.905240 | orchestrator |
2026-01-06 01:04:57.905252 | orchestrator | TASK [prometheus : include_tasks] **********************************************
2026-01-06 01:04:57.905260 | orchestrator | Tuesday 06 January 2026 01:02:13 +0000 (0:00:00.826) 0:00:01.899 *******
2026-01-06 01:04:57.905268 | orchestrator | included: /ansible/roles/prometheus/tasks/deploy.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-01-06 01:04:57.905277 | orchestrator |
2026-01-06 01:04:57.905285 | orchestrator | TASK [prometheus : Ensuring config directories exist] **************************
2026-01-06 01:04:57.905293 | orchestrator | Tuesday 06 January 2026 01:02:14 +0000 (0:00:01.457) 0:00:03.357 *******
2026-01-06 01:04:57.905304 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 
'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905318 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-01-06 01:04:57.905344 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905355 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905363 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905373 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905382 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 
'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905390 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905398 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905412 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905427 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 
'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905437 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905446 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905455 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', 
'/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905464 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905472 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905486 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905495 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 
'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905510 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 01:04:57.905520 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', 
'/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905528 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905537 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905545 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905558 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': 
['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905572 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905581 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905589 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905598 | orchestrator | changed: [testbed-node-2] => (item={'key': 
'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905606 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905614 | orchestrator | 2026-01-06 01:04:57.905623 | orchestrator | TASK [prometheus : include_tasks] ********************************************** 2026-01-06 01:04:57.905631 | orchestrator | Tuesday 06 January 2026 01:02:17 +0000 (0:00:02.738) 0:00:06.095 ******* 2026-01-06 01:04:57.905639 | orchestrator | included: /ansible/roles/prometheus/tasks/copy-certs.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-01-06 01:04:57.905652 | orchestrator | 2026-01-06 01:04:57.905661 | orchestrator | TASK [service-cert-copy : prometheus | Copying over extra CA certificates] ***** 2026-01-06 01:04:57.905668 | orchestrator | Tuesday 06 January 2026 01:02:18 +0000 (0:00:01.432) 0:00:07.528 ******* 2026-01-06 01:04:57.905677 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-01-06 01:04:57.905692 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905701 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905710 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905718 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905726 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905740 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': 
['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905748 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.905756 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905770 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905778 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 
'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905787 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905795 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905804 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', 
'/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905817 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.905826 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.905834 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.906475 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.906503 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.906512 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.906520 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.906540 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 
'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 01:04:57.906549 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.906557 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', 
'/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.906575 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.906583 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.906592 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.906600 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 
'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.906615 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.906623 | orchestrator | 2026-01-06 01:04:57.906631 | orchestrator | TASK [service-cert-copy : prometheus | Copying over backend internal TLS certificate] *** 2026-01-06 01:04:57.906639 | orchestrator | Tuesday 06 January 2026 01:02:24 +0000 (0:00:05.942) 0:00:13.470 ******* 2026-01-06 01:04:57.906648 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.906657 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': 
True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.906670 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-01-06 01:04:57.906680 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 
01:04:57.906689 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.906702 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.906710 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.906718 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.906727 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.906739 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.906747 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.906755 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 
'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.906768 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:04:57.906777 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.906785 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:04:57.906794 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.906802 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 
'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.906810 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.906819 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.906833 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 
'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 01:04:57.906847 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.906855 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:04:57.906864 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.906872 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.906880 | orchestrator | skipping: [testbed-manager] 2026-01-06 01:04:57.906888 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.906897 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.906905 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.906920 | orchestrator | skipping: 
[testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.906928 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:04:57.906947 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.906971 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.906980 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:04:57.906989 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': 
['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.906998 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.907006 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:04:57.907014 | orchestrator | 2026-01-06 01:04:57.907023 | orchestrator | TASK [service-cert-copy : prometheus | Copying over backend internal TLS key] *** 2026-01-06 01:04:57.907031 | orchestrator | Tuesday 06 January 2026 01:02:26 +0000 (0:00:02.374) 0:00:15.845 ******* 2026-01-06 01:04:57.907076 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 
'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-01-06 01:04:57.907096 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.907113 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.907123 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.907140 | orchestrator 
| skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.907150 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.907160 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.907171 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': 
['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 01:04:57.907188 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.907203 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.907217 | orchestrator | skipping: [testbed-manager] => (item={'key': 
'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.907227 | orchestrator | skipping: [testbed-manager] 2026-01-06 01:04:57.907238 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.907248 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.907258 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:04:57.907268 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.907278 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.907288 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.907308 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.907319 | orchestrator | skipping: 
[testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.907331 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.907340 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.907348 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:04:57.907357 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 
'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.907365 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.907374 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:04:57.907382 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.907396 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:04:57.907408 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.907417 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.907425 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.907437 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.907446 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:04:57.907454 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.907463 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.907471 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:04:57.907479 | orchestrator | 2026-01-06 01:04:57.907488 | orchestrator | TASK [prometheus : Copying over config.json files] ***************************** 2026-01-06 01:04:57.907496 | orchestrator | Tuesday 06 January 2026 01:02:29 +0000 (0:00:02.668) 0:00:18.513 ******* 2026-01-06 01:04:57.907507 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.907533 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 
'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.907554 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-01-06 01:04:57.907574 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.907589 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.907604 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.907618 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.907637 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 
'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.907646 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.907660 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.907669 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.907681 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.907690 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.907698 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.907707 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.907721 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.907733 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.907742 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.907750 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.907798 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 01:04:57.907808 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.907817 | orchestrator | changed: [testbed-node-5] => (item={'key': 
'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.907831 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.907844 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.907852 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.907861 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.907873 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.907881 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.907890 | orchestrator | changed: [testbed-node-2] => (item={'key': 
'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.907903 | orchestrator | 2026-01-06 01:04:57.907911 | orchestrator | TASK [prometheus : Find custom prometheus alert rules files] ******************* 2026-01-06 01:04:57.907919 | orchestrator | Tuesday 06 January 2026 01:02:35 +0000 (0:00:05.682) 0:00:24.196 ******* 2026-01-06 01:04:57.907927 | orchestrator | ok: [testbed-manager -> localhost] 2026-01-06 01:04:57.907935 | orchestrator | 2026-01-06 01:04:57.907943 | orchestrator | TASK [prometheus : Copying over custom prometheus alert rules files] *********** 2026-01-06 01:04:57.907951 | orchestrator | Tuesday 06 January 2026 01:02:36 +0000 (0:00:01.051) 0:00:25.247 ******* 2026-01-06 01:04:57.907959 | orchestrator | skipping: [testbed-manager] 2026-01-06 01:04:57.907967 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:04:57.907975 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:04:57.907983 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:04:57.907991 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:04:57.907999 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:04:57.908006 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:04:57.908014 | orchestrator | 2026-01-06 01:04:57.908022 | orchestrator | TASK [prometheus : Find prometheus common config overrides] ******************** 2026-01-06 01:04:57.908030 | orchestrator | Tuesday 06 January 2026 01:02:36 +0000 (0:00:00.622) 0:00:25.870 ******* 2026-01-06 01:04:57.908061 | orchestrator | ok: [testbed-manager -> localhost] 2026-01-06 01:04:57.908070 
| orchestrator | 2026-01-06 01:04:57.908078 | orchestrator | TASK [prometheus : Find prometheus host config overrides] ********************** 2026-01-06 01:04:57.908086 | orchestrator | Tuesday 06 January 2026 01:02:37 +0000 (0:00:00.662) 0:00:26.532 ******* 2026-01-06 01:04:57.908094 | orchestrator | [WARNING]: Skipped 2026-01-06 01:04:57.908103 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908112 | orchestrator | manager/prometheus.yml.d' path due to this access issue: 2026-01-06 01:04:57.908119 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908127 | orchestrator | manager/prometheus.yml.d' is not a directory 2026-01-06 01:04:57.908136 | orchestrator | ok: [testbed-manager -> localhost] 2026-01-06 01:04:57.908148 | orchestrator | [WARNING]: Skipped 2026-01-06 01:04:57.908156 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908164 | orchestrator | node-0/prometheus.yml.d' path due to this access issue: 2026-01-06 01:04:57.908172 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908180 | orchestrator | node-0/prometheus.yml.d' is not a directory 2026-01-06 01:04:57.908188 | orchestrator | [WARNING]: Skipped 2026-01-06 01:04:57.908197 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908205 | orchestrator | node-1/prometheus.yml.d' path due to this access issue: 2026-01-06 01:04:57.908212 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908220 | orchestrator | node-1/prometheus.yml.d' is not a directory 2026-01-06 01:04:57.908228 | orchestrator | [WARNING]: Skipped 2026-01-06 01:04:57.908237 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 
01:04:57.908244 | orchestrator | node-2/prometheus.yml.d' path due to this access issue: 2026-01-06 01:04:57.908252 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908261 | orchestrator | node-2/prometheus.yml.d' is not a directory 2026-01-06 01:04:57.908269 | orchestrator | [WARNING]: Skipped 2026-01-06 01:04:57.908277 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908290 | orchestrator | node-3/prometheus.yml.d' path due to this access issue: 2026-01-06 01:04:57.908298 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908306 | orchestrator | node-3/prometheus.yml.d' is not a directory 2026-01-06 01:04:57.908314 | orchestrator | [WARNING]: Skipped 2026-01-06 01:04:57.908326 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908335 | orchestrator | node-4/prometheus.yml.d' path due to this access issue: 2026-01-06 01:04:57.908343 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908350 | orchestrator | node-4/prometheus.yml.d' is not a directory 2026-01-06 01:04:57.908358 | orchestrator | [WARNING]: Skipped 2026-01-06 01:04:57.908366 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908374 | orchestrator | node-5/prometheus.yml.d' path due to this access issue: 2026-01-06 01:04:57.908382 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-01-06 01:04:57.908390 | orchestrator | node-5/prometheus.yml.d' is not a directory 2026-01-06 01:04:57.908398 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-01-06 01:04:57.908406 | orchestrator | ok: [testbed-node-1 -> localhost] 2026-01-06 01:04:57.908414 | orchestrator | ok: [testbed-node-2 -> localhost] 
2026-01-06 01:04:57.908422 | orchestrator | ok: [testbed-node-3 -> localhost] 2026-01-06 01:04:57.908430 | orchestrator | ok: [testbed-node-4 -> localhost] 2026-01-06 01:04:57.908438 | orchestrator | ok: [testbed-node-5 -> localhost] 2026-01-06 01:04:57.908446 | orchestrator | 2026-01-06 01:04:57.908454 | orchestrator | TASK [prometheus : Copying over prometheus config file] ************************ 2026-01-06 01:04:57.908462 | orchestrator | Tuesday 06 January 2026 01:02:39 +0000 (0:00:01.533) 0:00:28.066 ******* 2026-01-06 01:04:57.908470 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2026-01-06 01:04:57.908479 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:04:57.908487 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2026-01-06 01:04:57.908495 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:04:57.908503 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2026-01-06 01:04:57.908511 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:04:57.908519 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2026-01-06 01:04:57.908527 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2026-01-06 01:04:57.908535 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:04:57.908543 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:04:57.908551 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2026-01-06 01:04:57.908559 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:04:57.908567 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2) 2026-01-06 01:04:57.908575 | orchestrator | 2026-01-06 01:04:57.908583 | orchestrator | TASK 
[prometheus : Copying over prometheus web config file] ******************** 2026-01-06 01:04:57.908591 | orchestrator | Tuesday 06 January 2026 01:02:54 +0000 (0:00:14.839) 0:00:42.906 ******* 2026-01-06 01:04:57.908599 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2026-01-06 01:04:57.908607 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:04:57.908615 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2026-01-06 01:04:57.908623 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2026-01-06 01:04:57.908631 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:04:57.908639 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:04:57.908652 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2026-01-06 01:04:57.908660 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:04:57.908673 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2026-01-06 01:04:57.908681 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:04:57.908689 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2026-01-06 01:04:57.908697 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:04:57.908705 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2) 2026-01-06 01:04:57.908713 | orchestrator | 2026-01-06 01:04:57.908721 | orchestrator | TASK [prometheus : Copying over prometheus alertmanager config file] *********** 2026-01-06 01:04:57.908729 | orchestrator | Tuesday 06 January 2026 01:02:57 +0000 (0:00:03.412) 0:00:46.319 ******* 2026-01-06 01:04:57.908737 | orchestrator | skipping: [testbed-node-0] => 
(item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2026-01-06 01:04:57.908745 | orchestrator | skipping: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2026-01-06 01:04:57.908753 | orchestrator | skipping: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2026-01-06 01:04:57.908761 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:04:57.908769 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:04:57.908777 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:04:57.908785 | orchestrator | skipping: [testbed-node-3] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2026-01-06 01:04:57.908793 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:04:57.908805 | orchestrator | skipping: [testbed-node-4] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2026-01-06 01:04:57.908813 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:04:57.908821 | orchestrator | skipping: [testbed-node-5] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2026-01-06 01:04:57.908829 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:04:57.908837 | orchestrator | changed: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml) 2026-01-06 01:04:57.908845 | orchestrator | 2026-01-06 01:04:57.908853 | orchestrator | TASK [prometheus : Find custom Alertmanager alert notification templates] ****** 2026-01-06 01:04:57.908861 | orchestrator | Tuesday 06 January 2026 01:02:59 +0000 (0:00:01.634) 0:00:47.954 ******* 2026-01-06 01:04:57.908869 | orchestrator | ok: [testbed-manager -> localhost] 2026-01-06 01:04:57.908877 | orchestrator | 2026-01-06 
01:04:57.908885 | orchestrator | TASK [prometheus : Copying over custom Alertmanager alert notification templates] *** 2026-01-06 01:04:57.908893 | orchestrator | Tuesday 06 January 2026 01:02:59 +0000 (0:00:00.726) 0:00:48.681 ******* 2026-01-06 01:04:57.908902 | orchestrator | skipping: [testbed-manager] 2026-01-06 01:04:57.908910 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:04:57.908918 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:04:57.908925 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:04:57.908933 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:04:57.908941 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:04:57.908949 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:04:57.908957 | orchestrator | 2026-01-06 01:04:57.908965 | orchestrator | TASK [prometheus : Copying over my.cnf for mysqld_exporter] ******************** 2026-01-06 01:04:57.908973 | orchestrator | Tuesday 06 January 2026 01:03:00 +0000 (0:00:00.714) 0:00:49.395 ******* 2026-01-06 01:04:57.908981 | orchestrator | skipping: [testbed-manager] 2026-01-06 01:04:57.908989 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:04:57.908997 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:04:57.909010 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:04:57.909018 | orchestrator | changed: [testbed-node-0] 2026-01-06 01:04:57.909026 | orchestrator | changed: [testbed-node-1] 2026-01-06 01:04:57.909034 | orchestrator | changed: [testbed-node-2] 2026-01-06 01:04:57.909096 | orchestrator | 2026-01-06 01:04:57.909104 | orchestrator | TASK [prometheus : Copying cloud config file for openstack exporter] *********** 2026-01-06 01:04:57.909113 | orchestrator | Tuesday 06 January 2026 01:03:02 +0000 (0:00:02.172) 0:00:51.567 ******* 2026-01-06 01:04:57.909120 | orchestrator | skipping: [testbed-manager] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-01-06 01:04:57.909128 | orchestrator | skipping: 
[testbed-manager] 2026-01-06 01:04:57.909136 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-01-06 01:04:57.909144 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:04:57.909152 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-01-06 01:04:57.909160 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-01-06 01:04:57.909168 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:04:57.909176 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:04:57.909184 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-01-06 01:04:57.909192 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:04:57.909200 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-01-06 01:04:57.909208 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:04:57.909216 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-01-06 01:04:57.909223 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:04:57.909231 | orchestrator | 2026-01-06 01:04:57.909239 | orchestrator | TASK [prometheus : Copying config file for blackbox exporter] ****************** 2026-01-06 01:04:57.909247 | orchestrator | Tuesday 06 January 2026 01:03:04 +0000 (0:00:01.684) 0:00:53.252 ******* 2026-01-06 01:04:57.909421 | orchestrator | skipping: [testbed-node-0]2026-01-06 01:04:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:04:57.909436 | orchestrator | => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2026-01-06 01:04:57.909445 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:04:57.909458 | orchestrator | skipping: [testbed-node-1] => 
(item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2026-01-06 01:04:57.909469 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:04:57.909477 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2026-01-06 01:04:57.909485 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2026-01-06 01:04:57.909493 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:04:57.909501 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:04:57.909509 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2026-01-06 01:04:57.909517 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:04:57.909529 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2026-01-06 01:04:57.909537 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:04:57.909545 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2) 2026-01-06 01:04:57.909553 | orchestrator | 2026-01-06 01:04:57.909561 | orchestrator | TASK [prometheus : Find extra prometheus server config files] ****************** 2026-01-06 01:04:57.909574 | orchestrator | Tuesday 06 January 2026 01:03:06 +0000 (0:00:01.695) 0:00:54.948 ******* 2026-01-06 01:04:57.909583 | orchestrator | [WARNING]: Skipped 2026-01-06 01:04:57.909591 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/extras/' path 2026-01-06 01:04:57.909606 | orchestrator | due to this access issue: 2026-01-06 01:04:57.909613 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/extras/' is 2026-01-06 01:04:57.909620 | orchestrator | not a directory 2026-01-06 01:04:57.909627 | orchestrator | ok: [testbed-manager -> 
localhost] 2026-01-06 01:04:57.909634 | orchestrator | 2026-01-06 01:04:57.909640 | orchestrator | TASK [prometheus : Create subdirectories for extra config files] *************** 2026-01-06 01:04:57.909647 | orchestrator | Tuesday 06 January 2026 01:03:07 +0000 (0:00:01.130) 0:00:56.078 ******* 2026-01-06 01:04:57.909654 | orchestrator | skipping: [testbed-manager] 2026-01-06 01:04:57.909660 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:04:57.909667 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:04:57.909674 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:04:57.909680 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:04:57.909687 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:04:57.909694 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:04:57.909700 | orchestrator | 2026-01-06 01:04:57.909707 | orchestrator | TASK [prometheus : Template extra prometheus server config files] ************** 2026-01-06 01:04:57.909714 | orchestrator | Tuesday 06 January 2026 01:03:08 +0000 (0:00:00.948) 0:00:57.027 ******* 2026-01-06 01:04:57.909720 | orchestrator | skipping: [testbed-manager] 2026-01-06 01:04:57.909727 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:04:57.909733 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:04:57.909740 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:04:57.909746 | orchestrator | skipping: [testbed-node-3] 2026-01-06 01:04:57.909753 | orchestrator | skipping: [testbed-node-4] 2026-01-06 01:04:57.909760 | orchestrator | skipping: [testbed-node-5] 2026-01-06 01:04:57.909766 | orchestrator | 2026-01-06 01:04:57.909773 | orchestrator | TASK [service-check-containers : prometheus | Check containers] **************** 2026-01-06 01:04:57.909780 | orchestrator | Tuesday 06 January 2026 01:03:08 +0000 (0:00:00.717) 0:00:57.744 ******* 2026-01-06 01:04:57.909787 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': 
{'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.909794 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.909807 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.909815 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 
'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-01-06 01:04:57.909831 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.909838 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.909845 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.909853 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.909860 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.909871 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.909878 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 
'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-01-06 01:04:57.909889 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.909900 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.909907 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.909914 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.909921 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.909928 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.909941 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': 
{'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.909955 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.909962 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.909973 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.909980 | orchestrator | changed: 
[testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-01-06 01:04:57.909988 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.909995 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.910007 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-01-06 01:04:57.910121 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.910138 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.910145 | orchestrator | changed: 
[testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.910152 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-01-06 01:04:57.910159 | orchestrator | 2026-01-06 01:04:57.910168 | orchestrator | TASK [service-check-containers : prometheus | Notify handlers to restart containers] *** 2026-01-06 01:04:57.910180 | orchestrator | Tuesday 06 January 2026 01:03:13 +0000 (0:00:04.506) 0:01:02.250 ******* 2026-01-06 01:04:57.910189 | orchestrator | changed: [testbed-manager] => { 2026-01-06 01:04:57.910200 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 01:04:57.910209 | orchestrator | } 2026-01-06 01:04:57.910218 | orchestrator | changed: [testbed-node-0] => { 2026-01-06 01:04:57.910227 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 01:04:57.910236 | orchestrator | } 2026-01-06 01:04:57.910245 | orchestrator | changed: [testbed-node-1] => { 2026-01-06 01:04:57.910254 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 01:04:57.910264 | orchestrator | } 2026-01-06 01:04:57.910275 | orchestrator | changed: [testbed-node-2] => { 2026-01-06 
01:04:57.910285 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 01:04:57.910295 | orchestrator | } 2026-01-06 01:04:57.910305 | orchestrator | changed: [testbed-node-3] => { 2026-01-06 01:04:57.910311 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 01:04:57.910318 | orchestrator | } 2026-01-06 01:04:57.910324 | orchestrator | changed: [testbed-node-4] => { 2026-01-06 01:04:57.910330 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 01:04:57.910336 | orchestrator | } 2026-01-06 01:04:57.910342 | orchestrator | changed: [testbed-node-5] => { 2026-01-06 01:04:57.910349 | orchestrator |  "msg": "Notifying handlers" 2026-01-06 01:04:57.910360 | orchestrator | } 2026-01-06 01:04:57.910366 | orchestrator | 2026-01-06 01:04:57.910373 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-01-06 01:04:57.910379 | orchestrator | Tuesday 06 January 2026 01:03:14 +0000 (0:00:00.918) 0:01:03.168 ******* 2026-01-06 01:04:57.910393 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-server:2025.1', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET 
/-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-01-06 01:04:57.910401 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.910411 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.910419 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2025.1', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 
'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 01:04:57.910426 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.910437 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-01-06 01:04:57.910443 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  
2026-01-06 01:04:57.910453 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.910460 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-01-06 01:04:57.910470 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-01-06 01:04:57.910477 | orchestrator | skipping: [testbed-manager] 2026-01-06 01:04:57.910484 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-01-06 01:04:57.910490 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 01:04:57.910497 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 01:04:57.910508 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-01-06 01:04:57.910518 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 01:04:57.910525 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:04:57.910532 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:04:57.910538 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-01-06 01:04:57.910545 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 01:04:57.910555 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 01:04:57.910562 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-01-06 01:04:57.910569 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-01-06 01:04:57.910579 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:04:57.910586 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-01-06 01:04:57.910592 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-01-06 01:04:57.910603 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-01-06 01:04:57.910609 | orchestrator | skipping: [testbed-node-3]
2026-01-06 01:04:57.910616 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-01-06 01:04:57.910626 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-01-06 01:04:57.910633 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-01-06 01:04:57.910640 | orchestrator | skipping: [testbed-node-4]
2026-01-06 01:04:57.910646 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2025.1', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-01-06 01:04:57.910657 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2025.1', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro',
'/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-01-06 01:04:57.910664 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2025.1', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-01-06 01:04:57.910671 | orchestrator | skipping: [testbed-node-5]
2026-01-06 01:04:57.910677 | orchestrator |
2026-01-06 01:04:57.910683 | orchestrator | TASK [prometheus : Creating prometheus database user and setting permissions] ***
2026-01-06 01:04:57.910690 | orchestrator | Tuesday 06 January 2026 01:03:16 +0000 (0:00:01.966) 0:01:05.135 *******
2026-01-06 01:04:57.910696 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)
2026-01-06 01:04:57.910703 | orchestrator | skipping: [testbed-manager]
2026-01-06 01:04:57.910709 | orchestrator |
2026-01-06 01:04:57.910715 | orchestrator | TASK [prometheus : Flush handlers] *********************************************
2026-01-06 01:04:57.910722 | orchestrator | Tuesday 06 January 2026 01:03:17 +0000 (0:00:01.179) 0:01:06.315 *******
2026-01-06 01:04:57.910728 | orchestrator |
2026-01-06 01:04:57.910734 | orchestrator | TASK [prometheus : Flush handlers] *********************************************
2026-01-06 01:04:57.910741 | orchestrator | Tuesday 06 January 2026 01:03:17 +0000 (0:00:00.070) 0:01:06.385 *******
2026-01-06 01:04:57.910747 | orchestrator |
2026-01-06 01:04:57.910757 | orchestrator | TASK [prometheus : Flush handlers] *********************************************
2026-01-06 01:04:57.910763 | orchestrator | Tuesday 06 January 2026 01:03:17 +0000 (0:00:00.064) 0:01:06.449 *******
2026-01-06 01:04:57.910769 | orchestrator |
2026-01-06 01:04:57.910776 | orchestrator | TASK [prometheus : Flush handlers] *********************************************
2026-01-06 01:04:57.910782 | orchestrator | Tuesday 06 January 2026 01:03:17 +0000 (0:00:00.064) 0:01:06.514 *******
2026-01-06 01:04:57.910788 | orchestrator |
2026-01-06 01:04:57.910794 | orchestrator | TASK [prometheus : Flush handlers] *********************************************
2026-01-06 01:04:57.910801 | orchestrator | Tuesday 06 January 2026 01:03:17 +0000 (0:00:00.077) 0:01:06.591 *******
2026-01-06 01:04:57.910807 | orchestrator |
2026-01-06 01:04:57.910813 | orchestrator | TASK [prometheus : Flush handlers] *********************************************
2026-01-06 01:04:57.910819 | orchestrator | Tuesday 06 January 2026 01:03:17 +0000 (0:00:00.065) 0:01:06.656 *******
2026-01-06 01:04:57.910825 | orchestrator |
2026-01-06 01:04:57.910832 | orchestrator | TASK [prometheus : Flush handlers] *********************************************
2026-01-06 01:04:57.910838 | orchestrator | Tuesday 06 January 2026 01:03:18 +0000 (0:00:00.285) 0:01:06.942 *******
2026-01-06 01:04:57.910844 | orchestrator |
2026-01-06 01:04:57.910850 | orchestrator | RUNNING HANDLER [prometheus : Restart prometheus-server container] *************
2026-01-06 01:04:57.910857 | orchestrator | Tuesday 06 January 2026 01:03:18 +0000 (0:00:00.083) 0:01:07.026 *******
2026-01-06 01:04:57.910863 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.910870 | orchestrator |
2026-01-06 01:04:57.910876 | orchestrator | RUNNING HANDLER [prometheus : Restart prometheus-node-exporter container] ******
2026-01-06 01:04:57.910883 | orchestrator | Tuesday 06 January 2026 01:03:40 +0000 (0:00:22.678) 0:01:29.705 *******
2026-01-06 01:04:57.910893 | orchestrator | changed: [testbed-node-0]
2026-01-06 01:04:57.910899 | orchestrator | changed: [testbed-node-1]
2026-01-06 01:04:57.910906 | orchestrator | changed: [testbed-node-2]
2026-01-06 01:04:57.910912 | orchestrator | changed: [testbed-node-4]
2026-01-06 01:04:57.910921 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.910928 | orchestrator | changed: [testbed-node-3]
2026-01-06 01:04:57.910934 | orchestrator | changed: [testbed-node-5]
2026-01-06 01:04:57.910940 | orchestrator |
2026-01-06 01:04:57.910947 | orchestrator | RUNNING HANDLER [prometheus : Restart prometheus-mysqld-exporter container] ****
2026-01-06 01:04:57.910954 | orchestrator | Tuesday 06 January 2026 01:03:54 +0000 (0:00:13.376) 0:01:43.081 *******
2026-01-06 01:04:57.910960 | orchestrator | changed: [testbed-node-2]
2026-01-06 01:04:57.910966 | orchestrator | changed: [testbed-node-0]
2026-01-06 01:04:57.910972 | orchestrator | changed: [testbed-node-1]
2026-01-06 01:04:57.910979 | orchestrator |
2026-01-06 01:04:57.910985 | orchestrator | RUNNING HANDLER [prometheus : Restart prometheus-memcached-exporter container] ***
2026-01-06 01:04:57.910991 | orchestrator | Tuesday 06 January 2026 01:04:04 +0000 (0:00:10.640) 0:01:53.722 *******
2026-01-06 01:04:57.910997 | orchestrator | changed: [testbed-node-1]
2026-01-06 01:04:57.911004 | orchestrator | changed: [testbed-node-0]
2026-01-06 01:04:57.911010 | orchestrator | changed: [testbed-node-2]
2026-01-06 01:04:57.911016 | orchestrator |
2026-01-06 01:04:57.911023 | orchestrator | RUNNING HANDLER [prometheus : Restart prometheus-cadvisor container] ***********
2026-01-06 01:04:57.911029 | orchestrator | Tuesday 06 January 2026 01:04:10 +0000 (0:00:05.167) 0:01:58.889 *******
2026-01-06 01:04:57.911058 | orchestrator | changed: [testbed-node-1]
2026-01-06 01:04:57.911064 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.911070 | orchestrator | changed: [testbed-node-0]
2026-01-06 01:04:57.911077 | orchestrator | changed: [testbed-node-3]
2026-01-06 01:04:57.911083 | orchestrator | changed: [testbed-node-5]
2026-01-06 01:04:57.911089 | orchestrator | changed: [testbed-node-2]
2026-01-06 01:04:57.911095 | orchestrator | changed: [testbed-node-4]
2026-01-06 01:04:57.911101 | orchestrator |
2026-01-06 01:04:57.911107 | orchestrator | RUNNING HANDLER [prometheus : Restart prometheus-alertmanager container] *******
2026-01-06 01:04:57.911114 | orchestrator | Tuesday 06 January 2026 01:04:24 +0000 (0:00:14.560) 0:02:13.450 *******
2026-01-06 01:04:57.911120 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.911126 | orchestrator |
2026-01-06 01:04:57.911133 | orchestrator | RUNNING HANDLER [prometheus : Restart prometheus-elasticsearch-exporter container] ***
2026-01-06 01:04:57.911139 | orchestrator | Tuesday 06 January 2026 01:04:33 +0000 (0:00:09.058) 0:02:22.509 *******
2026-01-06 01:04:57.911145 | orchestrator | changed: [testbed-node-2]
2026-01-06 01:04:57.911152 | orchestrator | changed: [testbed-node-0]
2026-01-06 01:04:57.911158 | orchestrator | changed: [testbed-node-1]
2026-01-06 01:04:57.911164 | orchestrator |
2026-01-06 01:04:57.911170 | orchestrator | RUNNING HANDLER [prometheus : Restart prometheus-blackbox-exporter container] ***
2026-01-06 01:04:57.911176 | orchestrator | Tuesday 06 January 2026 01:04:38 +0000 (0:00:05.043) 0:02:27.553 *******
2026-01-06 01:04:57.911182 | orchestrator | changed: [testbed-manager]
2026-01-06 01:04:57.911189 | orchestrator |
2026-01-06 01:04:57.911195 | orchestrator | RUNNING HANDLER [prometheus : Restart prometheus-libvirt-exporter container] ***
2026-01-06 01:04:57.911201 | orchestrator | Tuesday 06 January 2026 01:04:44 +0000 (0:00:05.584) 0:02:33.137 *******
2026-01-06 01:04:57.911207 | orchestrator | changed: [testbed-node-4]
2026-01-06 01:04:57.911213 | orchestrator | changed: [testbed-node-3]
2026-01-06 01:04:57.911220 | orchestrator | changed: [testbed-node-5]
2026-01-06 01:04:57.911226 | orchestrator |
2026-01-06 01:04:57.911232 | orchestrator | PLAY RECAP *********************************************************************
2026-01-06 01:04:57.911239 | orchestrator | testbed-manager : ok=23  changed=14  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0
2026-01-06 01:04:57.911245 | orchestrator | testbed-node-0 : ok=16  changed=11  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0
2026-01-06 01:04:57.911258 | orchestrator | testbed-node-1 : ok=16  changed=11  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0
2026-01-06 01:04:57.911264 | orchestrator | testbed-node-2 : ok=16  changed=11  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0
2026-01-06 01:04:57.911275 | orchestrator | testbed-node-3 : ok=13  changed=8  unreachable=0 failed=0 skipped=13  rescued=0 ignored=0
2026-01-06 01:04:57.911281 | orchestrator | testbed-node-4 : ok=13  changed=8  unreachable=0 failed=0 skipped=13  rescued=0 ignored=0
2026-01-06 01:04:57.911288 | orchestrator | testbed-node-5 : ok=13  changed=8  unreachable=0 failed=0 skipped=13  rescued=0 ignored=0
2026-01-06 01:04:57.911294 | orchestrator |
2026-01-06 01:04:57.911300 | orchestrator |
2026-01-06 01:04:57.911307 | orchestrator | TASKS RECAP ********************************************************************
2026-01-06 01:04:57.911313 | orchestrator | Tuesday 06 January 2026 01:04:54 +0000 (0:00:10.398) 0:02:43.536 *******
2026-01-06 01:04:57.911319 | orchestrator | ===============================================================================
2026-01-06 01:04:57.911325 | orchestrator | prometheus : Restart prometheus-server container ----------------------- 22.68s
2026-01-06 01:04:57.911332 | orchestrator | prometheus : Copying over prometheus config file ----------------------- 14.84s
2026-01-06 01:04:57.911338 | orchestrator | prometheus : Restart prometheus-cadvisor container --------------------- 14.56s
2026-01-06 01:04:57.911344 | orchestrator | prometheus : Restart prometheus-node-exporter container ---------------- 13.38s
2026-01-06 01:04:57.911350 | orchestrator | prometheus : Restart prometheus-mysqld-exporter container -------------- 10.64s
2026-01-06 01:04:57.911357 | orchestrator | prometheus : Restart prometheus-libvirt-exporter container ------------- 10.40s
2026-01-06 01:04:57.911363 | orchestrator | prometheus : Restart prometheus-alertmanager container ------------------ 9.06s
2026-01-06 01:04:57.911373 | orchestrator | service-cert-copy : prometheus | Copying over extra CA certificates ----- 5.94s
2026-01-06 01:04:57.911379 | orchestrator | prometheus : Copying over config.json files ----------------------------- 5.68s
2026-01-06 01:04:57.911385 | orchestrator | prometheus : Restart prometheus-blackbox-exporter container ------------- 5.58s
2026-01-06 01:04:57.911392 | orchestrator | prometheus : Restart prometheus-memcached-exporter container ------------ 5.17s
2026-01-06 01:04:57.911398 | orchestrator | prometheus : Restart prometheus-elasticsearch-exporter container -------- 5.04s
2026-01-06 01:04:57.911404 | orchestrator | service-check-containers : prometheus | Check containers ---------------- 4.51s
2026-01-06 01:04:57.911411 | orchestrator | prometheus : Copying over prometheus web config file -------------------- 3.41s
2026-01-06 01:04:57.911417 | orchestrator | prometheus : Ensuring config directories exist -------------------------- 2.74s
2026-01-06 01:04:57.911423 | orchestrator | service-cert-copy : prometheus | Copying over backend internal TLS key --- 2.67s
2026-01-06 01:04:57.911429 | orchestrator | service-cert-copy : prometheus | Copying over backend internal TLS certificate --- 2.37s
2026-01-06 01:04:57.911436 | orchestrator | prometheus : Copying over my.cnf for mysqld_exporter -------------------- 2.17s
2026-01-06 01:04:57.911442 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.97s
2026-01-06 01:04:57.911448 | orchestrator | prometheus : Copying config file for blackbox exporter ------------------ 1.70s
2026-01-06 01:04:57.911454 | orchestrator | 2026-01-06 01:04:57 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in
state STARTED
2026-01-06 01:04:57.911461 | orchestrator | 2026-01-06 01:04:57 | INFO  | Wait 1 second(s) until the next check
2026-01-06 01:05:00.961573 | orchestrator | 2026-01-06 01:05:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 01:05:00.962354 | orchestrator | 2026-01-06 01:05:00 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED
2026-01-06 01:05:00.964319 | orchestrator | 2026-01-06 01:05:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 01:05:00.965681 | orchestrator | 2026-01-06 01:05:00 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state STARTED
2026-01-06 01:05:00.965729 | orchestrator | 2026-01-06 01:05:00 | INFO  | Wait 1 second(s) until the next check
[... the same four-task polling cycle repeats roughly every 3 seconds from 01:05:04 through 01:06:29, with all four tasks remaining in state STARTED ...]
2026-01-06 01:06:32.570836 | orchestrator | 2026-01-06 01:06:32 | INFO  | Task
de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:06:32.572180 | orchestrator | 2026-01-06 01:06:32 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:06:32.573786 | orchestrator | 2026-01-06 01:06:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:06:32.575519 | orchestrator | 2026-01-06 01:06:32 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state STARTED 2026-01-06 01:06:32.575558 | orchestrator | 2026-01-06 01:06:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:06:35.625682 | orchestrator | 2026-01-06 01:06:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:06:35.627123 | orchestrator | 2026-01-06 01:06:35 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:06:35.629719 | orchestrator | 2026-01-06 01:06:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:06:35.631681 | orchestrator | 2026-01-06 01:06:35 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state STARTED 2026-01-06 01:06:35.631769 | orchestrator | 2026-01-06 01:06:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:06:38.674777 | orchestrator | 2026-01-06 01:06:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:06:38.676136 | orchestrator | 2026-01-06 01:06:38 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:06:38.678279 | orchestrator | 2026-01-06 01:06:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:06:38.680230 | orchestrator | 2026-01-06 01:06:38 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state STARTED 2026-01-06 01:06:38.680282 | orchestrator | 2026-01-06 01:06:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:06:41.728295 | orchestrator | 2026-01-06 01:06:41 | INFO  | Task 
de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:06:41.730924 | orchestrator | 2026-01-06 01:06:41 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:06:41.733221 | orchestrator | 2026-01-06 01:06:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:06:41.734563 | orchestrator | 2026-01-06 01:06:41 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state STARTED 2026-01-06 01:06:41.734605 | orchestrator | 2026-01-06 01:06:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:06:44.782890 | orchestrator | 2026-01-06 01:06:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:06:44.784811 | orchestrator | 2026-01-06 01:06:44 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:06:44.786687 | orchestrator | 2026-01-06 01:06:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:06:44.788448 | orchestrator | 2026-01-06 01:06:44 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state STARTED 2026-01-06 01:06:44.788572 | orchestrator | 2026-01-06 01:06:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:06:47.829575 | orchestrator | 2026-01-06 01:06:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:06:47.831431 | orchestrator | 2026-01-06 01:06:47 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:06:47.833213 | orchestrator | 2026-01-06 01:06:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:06:47.835086 | orchestrator | 2026-01-06 01:06:47 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state STARTED 2026-01-06 01:06:47.835259 | orchestrator | 2026-01-06 01:06:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:06:50.889574 | orchestrator | 2026-01-06 01:06:50 | INFO  | Task 
de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:06:50.891296 | orchestrator | 2026-01-06 01:06:50 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:06:50.893316 | orchestrator | 2026-01-06 01:06:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:06:50.895408 | orchestrator | 2026-01-06 01:06:50 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state STARTED 2026-01-06 01:06:50.895485 | orchestrator | 2026-01-06 01:06:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:06:53.940676 | orchestrator | 2026-01-06 01:06:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:06:53.942128 | orchestrator | 2026-01-06 01:06:53 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:06:53.943663 | orchestrator | 2026-01-06 01:06:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:06:53.945488 | orchestrator | 2026-01-06 01:06:53 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state STARTED 2026-01-06 01:06:53.945539 | orchestrator | 2026-01-06 01:06:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:06:56.994170 | orchestrator | 2026-01-06 01:06:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:06:56.997562 | orchestrator | 2026-01-06 01:06:56 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:06:56.999831 | orchestrator | 2026-01-06 01:06:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:06:57.002541 | orchestrator | 2026-01-06 01:06:57 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state STARTED 2026-01-06 01:06:57.002593 | orchestrator | 2026-01-06 01:06:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:00.048369 | orchestrator | 2026-01-06 01:07:00 | INFO  | Task 
de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:00.050987 | orchestrator | 2026-01-06 01:07:00 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:07:00.051807 | orchestrator | 2026-01-06 01:07:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:00.054206 | orchestrator | 2026-01-06 01:07:00 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state STARTED 2026-01-06 01:07:00.054263 | orchestrator | 2026-01-06 01:07:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:03.103683 | orchestrator | 2026-01-06 01:07:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:03.105299 | orchestrator | 2026-01-06 01:07:03 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:07:03.107711 | orchestrator | 2026-01-06 01:07:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:03.112185 | orchestrator | 2026-01-06 01:07:03 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state STARTED 2026-01-06 01:07:03.112289 | orchestrator | 2026-01-06 01:07:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:06.156114 | orchestrator | 2026-01-06 01:07:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:06.158475 | orchestrator | 2026-01-06 01:07:06 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:07:06.159995 | orchestrator | 2026-01-06 01:07:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:06.162518 | orchestrator | 2026-01-06 01:07:06 | INFO  | Task 3789ba16-e0b0-4215-8f8c-fbd78fd28ebf is in state SUCCESS 2026-01-06 01:07:06.162598 | orchestrator | 2026-01-06 01:07:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:06.164438 | orchestrator | 2026-01-06 01:07:06.164521 | orchestrator | 2026-01-06 
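The block above is a client polling four task IDs every few seconds until each reaches a terminal state (here, one task moves from STARTED to SUCCESS). A minimal sketch of such a wait loop in Python; `fetch_state`, the state names, and the function name are assumptions modeled on the log output, not the actual OSISM client API:

```python
import time

# Terminal states are an assumption based on the STARTED/SUCCESS values in the log.
TERMINAL_STATES = {"SUCCESS", "FAILURE"}

def wait_for_tasks(task_ids, fetch_state, interval=1.0, sleep=time.sleep):
    """Poll fetch_state(task_id) until every task reaches a terminal state.

    Returns a dict mapping task id -> final state. `sleep` is injectable
    so tests can skip the real delay.
    """
    pending = set(task_ids)
    final = {}
    while pending:
        # sorted() copies the set, so it is safe to discard while iterating.
        for task_id in sorted(pending):
            state = fetch_state(task_id)
            print(f"Task {task_id} is in state {state}")
            if state in TERMINAL_STATES:
                final[task_id] = state
                pending.discard(task_id)
        if pending:
            print(f"Wait {interval:g} second(s) until the next check")
            sleep(interval)
    return final
```

In the log, the loop sleeps for one second between checks, but the roughly three-second spacing of the timestamps suggests each round of status calls itself takes a couple of seconds.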
2026-01-06 01:07:06.164545 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-01-06 01:07:06.164558 | orchestrator |
2026-01-06 01:07:06.164570 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-01-06 01:07:06.164582 | orchestrator | Tuesday 06 January 2026 01:04:59 +0000 (0:00:00.284) 0:00:00.284 *******
2026-01-06 01:07:06.164593 | orchestrator | ok: [testbed-node-0]
2026-01-06 01:07:06.164605 | orchestrator | ok: [testbed-node-1]
2026-01-06 01:07:06.164616 | orchestrator | ok: [testbed-node-2]
2026-01-06 01:07:06.164627 | orchestrator |
2026-01-06 01:07:06.164638 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-01-06 01:07:06.164649 | orchestrator | Tuesday 06 January 2026 01:04:59 +0000 (0:00:00.305) 0:00:00.589 *******
2026-01-06 01:07:06.164660 | orchestrator | ok: [testbed-node-0] => (item=enable_grafana_True)
2026-01-06 01:07:06.164672 | orchestrator | ok: [testbed-node-1] => (item=enable_grafana_True)
2026-01-06 01:07:06.164682 | orchestrator | ok: [testbed-node-2] => (item=enable_grafana_True)
2026-01-06 01:07:06.164693 | orchestrator |
2026-01-06 01:07:06.164704 | orchestrator | PLAY [Apply role grafana] ******************************************************
2026-01-06 01:07:06.164715 | orchestrator |
2026-01-06 01:07:06.164726 | orchestrator | TASK [grafana : include_tasks] *************************************************
2026-01-06 01:07:06.164737 | orchestrator | Tuesday 06 January 2026 01:05:00 +0000 (0:00:00.428) 0:00:01.018 *******
2026-01-06 01:07:06.164748 | orchestrator | included: /ansible/roles/grafana/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 01:07:06.164760 | orchestrator |
2026-01-06 01:07:06.164771 | orchestrator | TASK [grafana : Ensuring config directories exist] *****************************
2026-01-06 01:07:06.164782 | orchestrator | Tuesday 06 January 2026 01:05:00 +0000 (0:00:00.532) 0:00:01.550 *******
2026-01-06 01:07:06.164797 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.164840 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.164870 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.164882 | orchestrator |
2026-01-06 01:07:06.164894 | orchestrator | TASK [grafana : Check if extra configuration file exists] **********************
2026-01-06 01:07:06.164905 | orchestrator | Tuesday 06 January 2026 01:05:01 +0000 (0:00:00.756) 0:00:02.307 *******
2026-01-06 01:07:06.164952 | orchestrator | ok: [testbed-node-0 -> localhost]
2026-01-06 01:07:06.164965 | orchestrator |
2026-01-06 01:07:06.164976 | orchestrator | TASK [grafana : include_tasks] *************************************************
2026-01-06 01:07:06.164989 | orchestrator | Tuesday 06 January 2026 01:05:02 +0000 (0:00:00.830) 0:00:03.138 *******
2026-01-06 01:07:06.165003 | orchestrator | included: /ansible/roles/grafana/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-01-06 01:07:06.165017 | orchestrator |
2026-01-06 01:07:06.165031 | orchestrator | TASK [service-cert-copy : grafana | Copying over extra CA certificates] ********
2026-01-06 01:07:06.165060 | orchestrator | Tuesday 06 January 2026 01:05:03 +0000 (0:00:00.743) 0:00:03.881 *******
2026-01-06 01:07:06.165075 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165089 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165111 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165125 | orchestrator |
2026-01-06 01:07:06.165138 | orchestrator | TASK [service-cert-copy : grafana | Copying over backend internal TLS certificate] ***
2026-01-06 01:07:06.165151 | orchestrator | Tuesday 06 January 2026 01:05:04 +0000 (0:00:01.365) 0:00:05.247 *******
2026-01-06 01:07:06.165165 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165179 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:07:06.165200 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165214 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:07:06.165236 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165250 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:07:06.165264 | orchestrator |
2026-01-06 01:07:06.165278 | orchestrator | TASK [service-cert-copy : grafana | Copying over backend internal TLS key] *****
2026-01-06 01:07:06.165290 | orchestrator | Tuesday 06 January 2026 01:05:04 +0000 (0:00:00.493) 0:00:05.740 *******
2026-01-06 01:07:06.165302 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165321 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:07:06.165332 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165343 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:07:06.165355 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165366 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:07:06.165377 | orchestrator |
2026-01-06 01:07:06.165393 | orchestrator | TASK [grafana : Copying over config.json files] ********************************
2026-01-06 01:07:06.165405 | orchestrator | Tuesday 06 January 2026 01:05:05 +0000 (0:00:00.918) 0:00:06.659 *******
2026-01-06 01:07:06.165424 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165436 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165454 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165466 | orchestrator |
2026-01-06 01:07:06.165477 | orchestrator | TASK [grafana : Copying over grafana.ini] **************************************
2026-01-06 01:07:06.165488 | orchestrator | Tuesday 06 January 2026 01:05:07 +0000 (0:00:01.330) 0:00:07.990 *******
2026-01-06 01:07:06.165500 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165511 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165528 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.165540 | orchestrator |
2026-01-06 01:07:06.165551 | orchestrator | TASK [grafana : Copying over extra configuration file] *************************
2026-01-06 01:07:06.165567 | orchestrator | Tuesday 06 January 2026 01:05:08 +0000 (0:00:01.378) 0:00:09.368 *******
2026-01-06 01:07:06.165579 | orchestrator | skipping: [testbed-node-0]
2026-01-06 01:07:06.165590 | orchestrator | skipping: [testbed-node-1]
2026-01-06 01:07:06.165601 | orchestrator | skipping: [testbed-node-2]
2026-01-06 01:07:06.165612 | orchestrator |
2026-01-06 01:07:06.165623 | orchestrator | TASK [grafana : Configuring Prometheus as data source for Grafana] *************
2026-01-06 01:07:06.165634 | orchestrator | Tuesday 06 January 2026 01:05:09 +0000 (0:00:00.480) 0:00:09.849 *******
2026-01-06 01:07:06.165645 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2)
2026-01-06 01:07:06.165663 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2)
2026-01-06 01:07:06.165674 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2)
2026-01-06 01:07:06.165685 | orchestrator |
2026-01-06 01:07:06.165696 | orchestrator | TASK [grafana : Configuring dashboards provisioning] ***************************
2026-01-06 01:07:06.165707 | orchestrator | Tuesday 06 January 2026 01:05:10 +0000 (0:00:01.314) 0:00:11.163 *******
2026-01-06 01:07:06.165718 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml)
2026-01-06 01:07:06.165729 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml)
2026-01-06 01:07:06.165740 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml)
2026-01-06 01:07:06.165752 | orchestrator |
2026-01-06 01:07:06.165762 | orchestrator | TASK [grafana : Check if the folder for custom grafana dashboards exists] ******
2026-01-06 01:07:06.165773 | orchestrator | Tuesday 06 January 2026 01:05:11 +0000 (0:00:01.261) 0:00:12.424 *******
2026-01-06 01:07:06.165784 | orchestrator | ok: [testbed-node-0 -> localhost]
2026-01-06 01:07:06.165795 | orchestrator |
2026-01-06 01:07:06.165806 | orchestrator | TASK [grafana : Remove templated Grafana dashboards] ***************************
2026-01-06 01:07:06.165817 | orchestrator | Tuesday 06 January 2026 01:05:12 +0000 (0:00:00.726) 0:00:13.151 *******
2026-01-06 01:07:06.165828 | orchestrator | ok: [testbed-node-0]
2026-01-06 01:07:06.165839 | orchestrator | ok: [testbed-node-1]
2026-01-06 01:07:06.165850 | orchestrator | ok: [testbed-node-2]
2026-01-06 01:07:06.165861 | orchestrator |
2026-01-06 01:07:06.165872 | orchestrator | TASK [grafana : Copying over custom dashboards] ********************************
2026-01-06 01:07:06.165883 | orchestrator | Tuesday 06 January 2026 01:05:13 +0000 (0:00:00.737) 0:00:13.888 *******
2026-01-06 01:07:06.165894 | orchestrator | changed: [testbed-node-0]
2026-01-06 01:07:06.165905 | orchestrator | changed: [testbed-node-1]
2026-01-06 01:07:06.165941 | orchestrator | changed: [testbed-node-2]
2026-01-06 01:07:06.165952 | orchestrator |
2026-01-06 01:07:06.165963 | orchestrator | TASK [service-check-containers : grafana | Check containers] *******************
2026-01-06 01:07:06.165974 | orchestrator | Tuesday 06 January 2026 01:05:14 +0000 (0:00:01.433) 0:00:15.321 *******
2026-01-06 01:07:06.165985 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.166002 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.166088 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})
2026-01-06 01:07:06.166104 | orchestrator |
2026-01-06 01:07:06.166115 | orchestrator | TASK [service-check-containers : grafana | Notify handlers to restart containers] ***
2026-01-06 01:07:06.166126 | orchestrator | Tuesday 06 January 2026 01:05:15 +0000 (0:00:01.206) 0:00:16.528 *******
2026-01-06 01:07:06.166148 | orchestrator | changed: [testbed-node-0] => {
2026-01-06 01:07:06.166159 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 01:07:06.166171 | orchestrator | }
2026-01-06 01:07:06.166182 | orchestrator | changed: [testbed-node-1] => {
2026-01-06 01:07:06.166193 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 01:07:06.166204 | orchestrator | }
2026-01-06 01:07:06.166215 | orchestrator | changed: [testbed-node-2] => {
2026-01-06 01:07:06.166226 | orchestrator |  "msg": "Notifying handlers"
2026-01-06 01:07:06.166237 | orchestrator | }
2026-01-06 01:07:06.166248 | orchestrator |
2026-01-06 01:07:06.166258 | orchestrator | TASK [service-check-containers : Include tasks] ********************************
2026-01-06 01:07:06.166270 | orchestrator | Tuesday 06 January 2026 01:05:16 +0000 (0:00:00.327)
0:00:16.855 ******* 2026-01-06 01:07:06.166281 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 01:07:06.166292 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:07:06.166304 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 01:07:06.166315 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:07:06.166327 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2025.1', 'volumes': 
['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-01-06 01:07:06.166346 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:07:06.166357 | orchestrator | 2026-01-06 01:07:06.166373 | orchestrator | TASK [grafana : Creating grafana database] ************************************* 2026-01-06 01:07:06.166384 | orchestrator | Tuesday 06 January 2026 01:05:16 +0000 (0:00:00.730) 0:00:17.585 ******* 2026-01-06 01:07:06.166395 | orchestrator | changed: [testbed-node-0] 2026-01-06 01:07:06.166406 | orchestrator | 2026-01-06 01:07:06.166417 | orchestrator | TASK [grafana : Creating grafana database user and setting permissions] ******** 2026-01-06 01:07:06.166428 | orchestrator | Tuesday 06 January 2026 01:05:19 +0000 (0:00:02.326) 0:00:19.912 ******* 2026-01-06 01:07:06.166439 | orchestrator | changed: [testbed-node-0] 2026-01-06 01:07:06.166450 | orchestrator | 2026-01-06 01:07:06.166461 | orchestrator | TASK [grafana : Flush handlers] ************************************************ 2026-01-06 01:07:06.166472 | orchestrator | Tuesday 06 January 2026 01:05:21 +0000 (0:00:02.228) 0:00:22.140 ******* 2026-01-06 01:07:06.166483 | orchestrator | 2026-01-06 01:07:06.166494 | orchestrator | TASK [grafana : Flush handlers] ************************************************ 2026-01-06 01:07:06.166505 | orchestrator | Tuesday 06 January 2026 01:05:21 +0000 (0:00:00.066) 0:00:22.207 ******* 2026-01-06 01:07:06.166516 | orchestrator | 2026-01-06 01:07:06.166527 | orchestrator | TASK 
[grafana : Flush handlers] ************************************************ 2026-01-06 01:07:06.166543 | orchestrator | Tuesday 06 January 2026 01:05:21 +0000 (0:00:00.062) 0:00:22.269 ******* 2026-01-06 01:07:06.166554 | orchestrator | 2026-01-06 01:07:06.166566 | orchestrator | RUNNING HANDLER [grafana : Restart first grafana container] ******************** 2026-01-06 01:07:06.166576 | orchestrator | Tuesday 06 January 2026 01:05:21 +0000 (0:00:00.068) 0:00:22.338 ******* 2026-01-06 01:07:06.166587 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:07:06.166598 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:07:06.166609 | orchestrator | changed: [testbed-node-0] 2026-01-06 01:07:06.166620 | orchestrator | 2026-01-06 01:07:06.166631 | orchestrator | RUNNING HANDLER [grafana : Waiting for grafana to start on first node] ********* 2026-01-06 01:07:06.166642 | orchestrator | Tuesday 06 January 2026 01:05:28 +0000 (0:00:06.753) 0:00:29.091 ******* 2026-01-06 01:07:06.166653 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:07:06.166664 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:07:06.166675 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Waiting for grafana to start on first node (12 retries left). 2026-01-06 01:07:06.166686 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Waiting for grafana to start on first node (11 retries left). 2026-01-06 01:07:06.166697 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Waiting for grafana to start on first node (10 retries left). 2026-01-06 01:07:06.166709 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Waiting for grafana to start on first node (9 retries left). 
2026-01-06 01:07:06.166720 | orchestrator | ok: [testbed-node-0] 2026-01-06 01:07:06.166731 | orchestrator | 2026-01-06 01:07:06.166741 | orchestrator | RUNNING HANDLER [grafana : Restart remaining grafana containers] *************** 2026-01-06 01:07:06.166752 | orchestrator | Tuesday 06 January 2026 01:06:18 +0000 (0:00:50.584) 0:01:19.676 ******* 2026-01-06 01:07:06.166768 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:07:06.166787 | orchestrator | changed: [testbed-node-2] 2026-01-06 01:07:06.166805 | orchestrator | changed: [testbed-node-1] 2026-01-06 01:07:06.166824 | orchestrator | 2026-01-06 01:07:06.166843 | orchestrator | TASK [grafana : Wait for grafana application ready] **************************** 2026-01-06 01:07:06.166862 | orchestrator | Tuesday 06 January 2026 01:06:57 +0000 (0:00:39.010) 0:01:58.686 ******* 2026-01-06 01:07:06.166879 | orchestrator | ok: [testbed-node-0] 2026-01-06 01:07:06.166890 | orchestrator | 2026-01-06 01:07:06.166901 | orchestrator | TASK [grafana : Remove old grafana docker volume] ****************************** 2026-01-06 01:07:06.166970 | orchestrator | Tuesday 06 January 2026 01:07:00 +0000 (0:00:02.151) 0:02:00.838 ******* 2026-01-06 01:07:06.166993 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:07:06.167005 | orchestrator | skipping: [testbed-node-1] 2026-01-06 01:07:06.167016 | orchestrator | skipping: [testbed-node-2] 2026-01-06 01:07:06.167027 | orchestrator | 2026-01-06 01:07:06.167037 | orchestrator | TASK [grafana : Enable grafana datasources] ************************************ 2026-01-06 01:07:06.167048 | orchestrator | Tuesday 06 January 2026 01:07:00 +0000 (0:00:00.320) 0:02:01.159 ******* 2026-01-06 01:07:06.167061 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'influxdb', 'value': {'enabled': False, 'data': {'isDefault': True, 'database': 'telegraf', 'name': 'telegraf', 'type': 'influxdb', 'url': 'https://api-int.testbed.osism.xyz:8086', 'access': 'proxy', 'basicAuth': 
False}}})  2026-01-06 01:07:06.167075 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'enabled': True, 'data': {'name': 'opensearch', 'type': 'grafana-opensearch-datasource', 'access': 'proxy', 'url': 'https://api-int.testbed.osism.xyz:9200', 'jsonData': {'flavor': 'OpenSearch', 'database': 'flog-*', 'version': '2.11.1', 'timeField': '@timestamp', 'logLevelField': 'log_level'}}}}) 2026-01-06 01:07:06.167089 | orchestrator | 2026-01-06 01:07:06.167100 | orchestrator | TASK [grafana : Disable Getting Started panel] ********************************* 2026-01-06 01:07:06.167111 | orchestrator | Tuesday 06 January 2026 01:07:02 +0000 (0:00:02.338) 0:02:03.497 ******* 2026-01-06 01:07:06.167122 | orchestrator | skipping: [testbed-node-0] 2026-01-06 01:07:06.167133 | orchestrator | 2026-01-06 01:07:06.167144 | orchestrator | PLAY RECAP ********************************************************************* 2026-01-06 01:07:06.167156 | orchestrator | testbed-node-0 : ok=22  changed=13  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-01-06 01:07:06.167168 | orchestrator | testbed-node-1 : ok=15  changed=10  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-01-06 01:07:06.167191 | orchestrator | testbed-node-2 : ok=15  changed=10  unreachable=0 failed=0 skipped=7  rescued=0 ignored=0 2026-01-06 01:07:06.167203 | orchestrator | 2026-01-06 01:07:06.167214 | orchestrator | 2026-01-06 01:07:06.167225 | orchestrator | TASKS RECAP ******************************************************************** 2026-01-06 01:07:06.167236 | orchestrator | Tuesday 06 January 2026 01:07:02 +0000 (0:00:00.271) 0:02:03.769 ******* 2026-01-06 01:07:06.167247 | orchestrator | =============================================================================== 2026-01-06 01:07:06.167258 | orchestrator | grafana : Waiting for grafana to start on first node ------------------- 50.58s 2026-01-06 01:07:06.167270 | orchestrator | grafana : Restart remaining 
grafana containers ------------------------- 39.01s 2026-01-06 01:07:06.167280 | orchestrator | grafana : Restart first grafana container ------------------------------- 6.75s 2026-01-06 01:07:06.167291 | orchestrator | grafana : Enable grafana datasources ------------------------------------ 2.34s 2026-01-06 01:07:06.167302 | orchestrator | grafana : Creating grafana database ------------------------------------- 2.33s 2026-01-06 01:07:06.167313 | orchestrator | grafana : Creating grafana database user and setting permissions -------- 2.23s 2026-01-06 01:07:06.167331 | orchestrator | grafana : Wait for grafana application ready ---------------------------- 2.15s 2026-01-06 01:07:06.167342 | orchestrator | grafana : Copying over custom dashboards -------------------------------- 1.43s 2026-01-06 01:07:06.167353 | orchestrator | grafana : Copying over grafana.ini -------------------------------------- 1.38s 2026-01-06 01:07:06.167364 | orchestrator | service-cert-copy : grafana | Copying over extra CA certificates -------- 1.37s 2026-01-06 01:07:06.167375 | orchestrator | grafana : Copying over config.json files -------------------------------- 1.33s 2026-01-06 01:07:06.167386 | orchestrator | grafana : Configuring Prometheus as data source for Grafana ------------- 1.31s 2026-01-06 01:07:06.167397 | orchestrator | grafana : Configuring dashboards provisioning --------------------------- 1.26s 2026-01-06 01:07:06.167408 | orchestrator | service-check-containers : grafana | Check containers ------------------- 1.21s 2026-01-06 01:07:06.167426 | orchestrator | service-cert-copy : grafana | Copying over backend internal TLS key ----- 0.92s 2026-01-06 01:07:06.167437 | orchestrator | grafana : Check if extra configuration file exists ---------------------- 0.83s 2026-01-06 01:07:06.167448 | orchestrator | grafana : Ensuring config directories exist ----------------------------- 0.76s 2026-01-06 01:07:06.167459 | orchestrator | grafana : include_tasks 
------------------------------------------------- 0.74s 2026-01-06 01:07:06.167470 | orchestrator | grafana : Remove templated Grafana dashboards --------------------------- 0.74s 2026-01-06 01:07:06.167481 | orchestrator | service-check-containers : Include tasks -------------------------------- 0.73s 2026-01-06 01:07:09.208875 | orchestrator | 2026-01-06 01:07:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:09.210692 | orchestrator | 2026-01-06 01:07:09 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:07:09.213101 | orchestrator | 2026-01-06 01:07:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:09.213153 | orchestrator | 2026-01-06 01:07:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:12.264373 | orchestrator | 2026-01-06 01:07:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:12.267010 | orchestrator | 2026-01-06 01:07:12 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:07:12.269005 | orchestrator | 2026-01-06 01:07:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:12.269063 | orchestrator | 2026-01-06 01:07:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:15.312400 | orchestrator | 2026-01-06 01:07:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:15.314108 | orchestrator | 2026-01-06 01:07:15 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:07:15.316031 | orchestrator | 2026-01-06 01:07:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:15.316073 | orchestrator | 2026-01-06 01:07:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:18.367081 | orchestrator | 2026-01-06 01:07:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 
2026-01-06 01:07:18.367894 | orchestrator | 2026-01-06 01:07:18 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:07:18.370693 | orchestrator | 2026-01-06 01:07:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:18.370750 | orchestrator | 2026-01-06 01:07:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:21.436017 | orchestrator | 2026-01-06 01:07:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:21.437560 | orchestrator | 2026-01-06 01:07:21 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:07:21.440892 | orchestrator | 2026-01-06 01:07:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:21.441114 | orchestrator | 2026-01-06 01:07:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:24.493285 | orchestrator | 2026-01-06 01:07:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:24.494691 | orchestrator | 2026-01-06 01:07:24 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:07:24.496413 | orchestrator | 2026-01-06 01:07:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:24.496455 | orchestrator | 2026-01-06 01:07:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:27.545039 | orchestrator | 2026-01-06 01:07:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:27.548705 | orchestrator | 2026-01-06 01:07:27 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:07:27.552285 | orchestrator | 2026-01-06 01:07:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:27.552364 | orchestrator | 2026-01-06 01:07:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:30.607378 | orchestrator | 2026-01-06 
01:07:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:30.609728 | orchestrator | 2026-01-06 01:07:30 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:07:30.613438 | orchestrator | 2026-01-06 01:07:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:30.613517 | orchestrator | 2026-01-06 01:07:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:33.663563 | orchestrator | 2026-01-06 01:07:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:33.665028 | orchestrator | 2026-01-06 01:07:33 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state STARTED 2026-01-06 01:07:33.668112 | orchestrator | 2026-01-06 01:07:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:33.668210 | orchestrator | 2026-01-06 01:07:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:36.728220 | orchestrator | 2026-01-06 01:07:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:36.728873 | orchestrator | 2026-01-06 01:07:36 | INFO  | Task c416f343-888d-4fb1-a014-f1db6380800a is in state SUCCESS 2026-01-06 01:07:36.730659 | orchestrator | 2026-01-06 01:07:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:36.730685 | orchestrator | 2026-01-06 01:07:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:39.783827 | orchestrator | 2026-01-06 01:07:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:39.784692 | orchestrator | 2026-01-06 01:07:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:39.784780 | orchestrator | 2026-01-06 01:07:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:42.833049 | orchestrator | 2026-01-06 01:07:42 | INFO  | Task 
de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:42.834351 | orchestrator | 2026-01-06 01:07:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:42.834487 | orchestrator | 2026-01-06 01:07:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:45.884269 | orchestrator | 2026-01-06 01:07:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:45.886454 | orchestrator | 2026-01-06 01:07:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:45.886548 | orchestrator | 2026-01-06 01:07:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:48.945653 | orchestrator | 2026-01-06 01:07:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:48.946603 | orchestrator | 2026-01-06 01:07:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:48.946644 | orchestrator | 2026-01-06 01:07:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:51.990324 | orchestrator | 2026-01-06 01:07:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:51.992113 | orchestrator | 2026-01-06 01:07:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:51.992173 | orchestrator | 2026-01-06 01:07:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:55.051738 | orchestrator | 2026-01-06 01:07:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:07:55.052642 | orchestrator | 2026-01-06 01:07:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:55.052685 | orchestrator | 2026-01-06 01:07:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:07:58.095219 | orchestrator | 2026-01-06 01:07:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 
01:07:58.096903 | orchestrator | 2026-01-06 01:07:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:07:58.096988 | orchestrator | 2026-01-06 01:07:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:01.142385 | orchestrator | 2026-01-06 01:08:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:01.143976 | orchestrator | 2026-01-06 01:08:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:01.144127 | orchestrator | 2026-01-06 01:08:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:04.191431 | orchestrator | 2026-01-06 01:08:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:04.194214 | orchestrator | 2026-01-06 01:08:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:04.194677 | orchestrator | 2026-01-06 01:08:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:07.240687 | orchestrator | 2026-01-06 01:08:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:07.243491 | orchestrator | 2026-01-06 01:08:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:07.243557 | orchestrator | 2026-01-06 01:08:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:10.289752 | orchestrator | 2026-01-06 01:08:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:10.291239 | orchestrator | 2026-01-06 01:08:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:10.291368 | orchestrator | 2026-01-06 01:08:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:13.338441 | orchestrator | 2026-01-06 01:08:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:13.339931 | orchestrator | 2026-01-06 01:08:13 | INFO  | Task 
7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:13.339991 | orchestrator | 2026-01-06 01:08:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:16.389421 | orchestrator | 2026-01-06 01:08:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:16.391349 | orchestrator | 2026-01-06 01:08:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:16.391393 | orchestrator | 2026-01-06 01:08:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:19.434090 | orchestrator | 2026-01-06 01:08:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:19.436175 | orchestrator | 2026-01-06 01:08:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:19.436259 | orchestrator | 2026-01-06 01:08:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:22.484728 | orchestrator | 2026-01-06 01:08:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:22.487375 | orchestrator | 2026-01-06 01:08:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:22.487460 | orchestrator | 2026-01-06 01:08:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:25.538633 | orchestrator | 2026-01-06 01:08:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:25.539158 | orchestrator | 2026-01-06 01:08:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:25.539249 | orchestrator | 2026-01-06 01:08:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:28.582139 | orchestrator | 2026-01-06 01:08:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:28.583972 | orchestrator | 2026-01-06 01:08:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:08:28.584030 | orchestrator | 2026-01-06 01:08:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:31.642135 | orchestrator | 2026-01-06 01:08:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:31.642767 | orchestrator | 2026-01-06 01:08:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:31.643038 | orchestrator | 2026-01-06 01:08:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:34.689962 | orchestrator | 2026-01-06 01:08:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:34.691480 | orchestrator | 2026-01-06 01:08:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:34.691520 | orchestrator | 2026-01-06 01:08:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:37.741618 | orchestrator | 2026-01-06 01:08:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:37.743354 | orchestrator | 2026-01-06 01:08:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:37.743393 | orchestrator | 2026-01-06 01:08:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:40.787012 | orchestrator | 2026-01-06 01:08:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:40.789016 | orchestrator | 2026-01-06 01:08:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:40.789074 | orchestrator | 2026-01-06 01:08:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:08:43.835034 | orchestrator | 2026-01-06 01:08:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:08:43.838313 | orchestrator | 2026-01-06 01:08:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:08:43.838391 | orchestrator | 2026-01-06 01:08:43 | INFO  | Wait 1 second(s) 
until the next check
2026-01-06 01:08:46.887194 | orchestrator | 2026-01-06 01:08:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 01:08:46.889785 | orchestrator | 2026-01-06 01:08:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 01:08:46.890005 | orchestrator | 2026-01-06 01:08:46 | INFO  | Wait 1 second(s) until the next check
[identical status checks for both tasks repeated every ~3 seconds from 01:08:49 through 01:13:58; both tasks remained in state STARTED throughout]
2026-01-06 01:14:01.151690 | orchestrator | 2026-01-06 01:14:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 01:14:01.153033 | orchestrator | 2026-01-06 01:14:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 01:14:01.153178 | orchestrator | 2026-01-06 01:14:01 | INFO  | Wait 1 second(s)
until the next check 2026-01-06 01:14:04.205159 | orchestrator | 2026-01-06 01:14:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:04.206797 | orchestrator | 2026-01-06 01:14:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:04.206883 | orchestrator | 2026-01-06 01:14:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:07.254384 | orchestrator | 2026-01-06 01:14:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:07.256574 | orchestrator | 2026-01-06 01:14:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:07.256642 | orchestrator | 2026-01-06 01:14:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:10.308223 | orchestrator | 2026-01-06 01:14:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:10.309922 | orchestrator | 2026-01-06 01:14:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:10.310332 | orchestrator | 2026-01-06 01:14:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:13.356969 | orchestrator | 2026-01-06 01:14:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:13.358047 | orchestrator | 2026-01-06 01:14:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:13.358396 | orchestrator | 2026-01-06 01:14:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:16.412408 | orchestrator | 2026-01-06 01:14:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:16.414102 | orchestrator | 2026-01-06 01:14:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:16.414140 | orchestrator | 2026-01-06 01:14:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:19.458889 | orchestrator | 2026-01-06 
01:14:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:19.459120 | orchestrator | 2026-01-06 01:14:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:19.459334 | orchestrator | 2026-01-06 01:14:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:22.500611 | orchestrator | 2026-01-06 01:14:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:22.502422 | orchestrator | 2026-01-06 01:14:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:22.502554 | orchestrator | 2026-01-06 01:14:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:25.540783 | orchestrator | 2026-01-06 01:14:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:25.541354 | orchestrator | 2026-01-06 01:14:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:25.541505 | orchestrator | 2026-01-06 01:14:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:28.593005 | orchestrator | 2026-01-06 01:14:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:28.594308 | orchestrator | 2026-01-06 01:14:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:28.594355 | orchestrator | 2026-01-06 01:14:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:31.649802 | orchestrator | 2026-01-06 01:14:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:31.650304 | orchestrator | 2026-01-06 01:14:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:31.650460 | orchestrator | 2026-01-06 01:14:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:34.692382 | orchestrator | 2026-01-06 01:14:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:14:34.693540 | orchestrator | 2026-01-06 01:14:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:34.693595 | orchestrator | 2026-01-06 01:14:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:37.740173 | orchestrator | 2026-01-06 01:14:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:37.741620 | orchestrator | 2026-01-06 01:14:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:37.741656 | orchestrator | 2026-01-06 01:14:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:40.789500 | orchestrator | 2026-01-06 01:14:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:40.792083 | orchestrator | 2026-01-06 01:14:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:40.792163 | orchestrator | 2026-01-06 01:14:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:43.835615 | orchestrator | 2026-01-06 01:14:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:43.837601 | orchestrator | 2026-01-06 01:14:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:43.837696 | orchestrator | 2026-01-06 01:14:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:46.890931 | orchestrator | 2026-01-06 01:14:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:46.893002 | orchestrator | 2026-01-06 01:14:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:46.893067 | orchestrator | 2026-01-06 01:14:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:49.942168 | orchestrator | 2026-01-06 01:14:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:49.943420 | orchestrator | 2026-01-06 01:14:49 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:49.943486 | orchestrator | 2026-01-06 01:14:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:52.987877 | orchestrator | 2026-01-06 01:14:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:52.988039 | orchestrator | 2026-01-06 01:14:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:52.988052 | orchestrator | 2026-01-06 01:14:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:56.036154 | orchestrator | 2026-01-06 01:14:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:56.038822 | orchestrator | 2026-01-06 01:14:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:56.038916 | orchestrator | 2026-01-06 01:14:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:14:59.086403 | orchestrator | 2026-01-06 01:14:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:14:59.087823 | orchestrator | 2026-01-06 01:14:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:14:59.087876 | orchestrator | 2026-01-06 01:14:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:02.137433 | orchestrator | 2026-01-06 01:15:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:02.138667 | orchestrator | 2026-01-06 01:15:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:02.138713 | orchestrator | 2026-01-06 01:15:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:05.179364 | orchestrator | 2026-01-06 01:15:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:05.179695 | orchestrator | 2026-01-06 01:15:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:15:05.179726 | orchestrator | 2026-01-06 01:15:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:08.231455 | orchestrator | 2026-01-06 01:15:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:08.233834 | orchestrator | 2026-01-06 01:15:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:08.234354 | orchestrator | 2026-01-06 01:15:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:11.279250 | orchestrator | 2026-01-06 01:15:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:11.280269 | orchestrator | 2026-01-06 01:15:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:11.280327 | orchestrator | 2026-01-06 01:15:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:14.329500 | orchestrator | 2026-01-06 01:15:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:14.331608 | orchestrator | 2026-01-06 01:15:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:14.331703 | orchestrator | 2026-01-06 01:15:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:17.382138 | orchestrator | 2026-01-06 01:15:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:17.383026 | orchestrator | 2026-01-06 01:15:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:17.383059 | orchestrator | 2026-01-06 01:15:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:20.425596 | orchestrator | 2026-01-06 01:15:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:20.426891 | orchestrator | 2026-01-06 01:15:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:20.426924 | orchestrator | 2026-01-06 01:15:20 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 01:15:23.471032 | orchestrator | 2026-01-06 01:15:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:23.472603 | orchestrator | 2026-01-06 01:15:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:23.472662 | orchestrator | 2026-01-06 01:15:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:26.521972 | orchestrator | 2026-01-06 01:15:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:26.523887 | orchestrator | 2026-01-06 01:15:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:26.524031 | orchestrator | 2026-01-06 01:15:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:29.573951 | orchestrator | 2026-01-06 01:15:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:29.574967 | orchestrator | 2026-01-06 01:15:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:29.575088 | orchestrator | 2026-01-06 01:15:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:32.619485 | orchestrator | 2026-01-06 01:15:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:32.621122 | orchestrator | 2026-01-06 01:15:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:32.621362 | orchestrator | 2026-01-06 01:15:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:35.676028 | orchestrator | 2026-01-06 01:15:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:35.677657 | orchestrator | 2026-01-06 01:15:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:35.677992 | orchestrator | 2026-01-06 01:15:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:38.726096 | orchestrator | 2026-01-06 
01:15:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:38.727145 | orchestrator | 2026-01-06 01:15:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:38.727189 | orchestrator | 2026-01-06 01:15:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:41.776290 | orchestrator | 2026-01-06 01:15:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:41.777857 | orchestrator | 2026-01-06 01:15:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:41.777906 | orchestrator | 2026-01-06 01:15:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:44.828674 | orchestrator | 2026-01-06 01:15:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:44.829810 | orchestrator | 2026-01-06 01:15:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:44.829954 | orchestrator | 2026-01-06 01:15:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:47.880380 | orchestrator | 2026-01-06 01:15:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:47.883077 | orchestrator | 2026-01-06 01:15:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:47.883119 | orchestrator | 2026-01-06 01:15:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:50.925768 | orchestrator | 2026-01-06 01:15:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:50.926445 | orchestrator | 2026-01-06 01:15:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:50.926544 | orchestrator | 2026-01-06 01:15:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:53.976220 | orchestrator | 2026-01-06 01:15:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:15:53.978464 | orchestrator | 2026-01-06 01:15:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:53.978556 | orchestrator | 2026-01-06 01:15:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:15:57.035218 | orchestrator | 2026-01-06 01:15:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:15:57.035675 | orchestrator | 2026-01-06 01:15:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:15:57.035714 | orchestrator | 2026-01-06 01:15:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:00.081683 | orchestrator | 2026-01-06 01:16:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:00.085189 | orchestrator | 2026-01-06 01:16:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:00.085248 | orchestrator | 2026-01-06 01:16:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:03.137520 | orchestrator | 2026-01-06 01:16:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:03.140962 | orchestrator | 2026-01-06 01:16:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:03.141042 | orchestrator | 2026-01-06 01:16:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:06.194273 | orchestrator | 2026-01-06 01:16:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:06.197366 | orchestrator | 2026-01-06 01:16:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:06.197460 | orchestrator | 2026-01-06 01:16:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:09.244264 | orchestrator | 2026-01-06 01:16:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:09.245912 | orchestrator | 2026-01-06 01:16:09 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:09.246079 | orchestrator | 2026-01-06 01:16:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:12.300336 | orchestrator | 2026-01-06 01:16:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:12.302460 | orchestrator | 2026-01-06 01:16:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:12.302544 | orchestrator | 2026-01-06 01:16:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:15.353870 | orchestrator | 2026-01-06 01:16:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:15.356034 | orchestrator | 2026-01-06 01:16:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:15.356134 | orchestrator | 2026-01-06 01:16:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:18.406444 | orchestrator | 2026-01-06 01:16:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:18.407251 | orchestrator | 2026-01-06 01:16:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:18.407339 | orchestrator | 2026-01-06 01:16:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:21.453865 | orchestrator | 2026-01-06 01:16:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:21.456191 | orchestrator | 2026-01-06 01:16:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:21.456261 | orchestrator | 2026-01-06 01:16:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:24.511340 | orchestrator | 2026-01-06 01:16:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:24.514417 | orchestrator | 2026-01-06 01:16:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:16:24.514483 | orchestrator | 2026-01-06 01:16:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:27.564032 | orchestrator | 2026-01-06 01:16:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:27.566514 | orchestrator | 2026-01-06 01:16:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:27.566636 | orchestrator | 2026-01-06 01:16:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:30.618586 | orchestrator | 2026-01-06 01:16:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:30.620302 | orchestrator | 2026-01-06 01:16:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:30.620366 | orchestrator | 2026-01-06 01:16:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:33.665412 | orchestrator | 2026-01-06 01:16:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:33.667133 | orchestrator | 2026-01-06 01:16:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:33.667258 | orchestrator | 2026-01-06 01:16:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:36.719635 | orchestrator | 2026-01-06 01:16:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:36.721274 | orchestrator | 2026-01-06 01:16:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:36.721368 | orchestrator | 2026-01-06 01:16:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:39.761301 | orchestrator | 2026-01-06 01:16:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:39.763005 | orchestrator | 2026-01-06 01:16:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:39.763086 | orchestrator | 2026-01-06 01:16:39 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 01:16:42.810106 | orchestrator | 2026-01-06 01:16:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:42.811791 | orchestrator | 2026-01-06 01:16:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:42.812029 | orchestrator | 2026-01-06 01:16:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:45.860661 | orchestrator | 2026-01-06 01:16:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:45.862987 | orchestrator | 2026-01-06 01:16:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:45.863270 | orchestrator | 2026-01-06 01:16:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:48.908400 | orchestrator | 2026-01-06 01:16:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:48.909934 | orchestrator | 2026-01-06 01:16:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:48.910227 | orchestrator | 2026-01-06 01:16:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:51.957220 | orchestrator | 2026-01-06 01:16:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:51.959946 | orchestrator | 2026-01-06 01:16:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:51.960075 | orchestrator | 2026-01-06 01:16:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:55.009052 | orchestrator | 2026-01-06 01:16:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:55.010101 | orchestrator | 2026-01-06 01:16:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:55.010144 | orchestrator | 2026-01-06 01:16:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:16:58.064138 | orchestrator | 2026-01-06 
01:16:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:16:58.065262 | orchestrator | 2026-01-06 01:16:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:16:58.065313 | orchestrator | 2026-01-06 01:16:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:01.108216 | orchestrator | 2026-01-06 01:17:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:01.110318 | orchestrator | 2026-01-06 01:17:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:01.110483 | orchestrator | 2026-01-06 01:17:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:04.155850 | orchestrator | 2026-01-06 01:17:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:04.156761 | orchestrator | 2026-01-06 01:17:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:04.156812 | orchestrator | 2026-01-06 01:17:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:07.209567 | orchestrator | 2026-01-06 01:17:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:07.211881 | orchestrator | 2026-01-06 01:17:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:07.212237 | orchestrator | 2026-01-06 01:17:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:10.267747 | orchestrator | 2026-01-06 01:17:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:10.270540 | orchestrator | 2026-01-06 01:17:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:10.270665 | orchestrator | 2026-01-06 01:17:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:13.314387 | orchestrator | 2026-01-06 01:17:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:17:13.316592 | orchestrator | 2026-01-06 01:17:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:13.316682 | orchestrator | 2026-01-06 01:17:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:16.364072 | orchestrator | 2026-01-06 01:17:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:16.366200 | orchestrator | 2026-01-06 01:17:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:16.366628 | orchestrator | 2026-01-06 01:17:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:19.413298 | orchestrator | 2026-01-06 01:17:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:19.414654 | orchestrator | 2026-01-06 01:17:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:19.415005 | orchestrator | 2026-01-06 01:17:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:22.463046 | orchestrator | 2026-01-06 01:17:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:22.465854 | orchestrator | 2026-01-06 01:17:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:22.465923 | orchestrator | 2026-01-06 01:17:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:25.508496 | orchestrator | 2026-01-06 01:17:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:25.509639 | orchestrator | 2026-01-06 01:17:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:25.509672 | orchestrator | 2026-01-06 01:17:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:28.557764 | orchestrator | 2026-01-06 01:17:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:28.559234 | orchestrator | 2026-01-06 01:17:28 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:28.559275 | orchestrator | 2026-01-06 01:17:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:31.605380 | orchestrator | 2026-01-06 01:17:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:31.606685 | orchestrator | 2026-01-06 01:17:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:31.606769 | orchestrator | 2026-01-06 01:17:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:34.651544 | orchestrator | 2026-01-06 01:17:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:34.653550 | orchestrator | 2026-01-06 01:17:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:34.653605 | orchestrator | 2026-01-06 01:17:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:37.698967 | orchestrator | 2026-01-06 01:17:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:37.700362 | orchestrator | 2026-01-06 01:17:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:37.700384 | orchestrator | 2026-01-06 01:17:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:40.749221 | orchestrator | 2026-01-06 01:17:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:40.750910 | orchestrator | 2026-01-06 01:17:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:17:40.750958 | orchestrator | 2026-01-06 01:17:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:17:43.795208 | orchestrator | 2026-01-06 01:17:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:17:43.796899 | orchestrator | 2026-01-06 01:17:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:17:43.796933 | orchestrator | 2026-01-06 01:17:43 | INFO  | Wait 1 second(s) until the next check
2026-01-06 01:17:46.851077 | orchestrator | 2026-01-06 01:17:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 01:17:46.853049 | orchestrator | 2026-01-06 01:17:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 01:17:46.853396 | orchestrator | 2026-01-06 01:17:46 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 s from 01:17:49 through 01:23:13; tasks de94e18a-e5f1-4cb3-816d-70e021913dd3 and 7b21748d-c57a-4b51-995c-2c6f5f088b85 remained in state STARTED throughout ...]
2026-01-06 01:23:16.348515 | orchestrator | 2026-01-06 01:23:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 01:23:16.351194 | orchestrator | 2026-01-06 01:23:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 01:23:16.351455 | orchestrator | 2026-01-06 01:23:16 | INFO  | Wait 1 second(s)
until the next check 2026-01-06 01:23:19.405922 | orchestrator | 2026-01-06 01:23:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:19.407224 | orchestrator | 2026-01-06 01:23:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:19.407267 | orchestrator | 2026-01-06 01:23:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:22.457675 | orchestrator | 2026-01-06 01:23:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:22.459047 | orchestrator | 2026-01-06 01:23:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:22.459168 | orchestrator | 2026-01-06 01:23:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:25.508088 | orchestrator | 2026-01-06 01:23:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:25.509831 | orchestrator | 2026-01-06 01:23:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:25.509968 | orchestrator | 2026-01-06 01:23:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:28.561238 | orchestrator | 2026-01-06 01:23:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:28.562907 | orchestrator | 2026-01-06 01:23:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:28.563036 | orchestrator | 2026-01-06 01:23:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:31.609119 | orchestrator | 2026-01-06 01:23:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:31.609230 | orchestrator | 2026-01-06 01:23:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:31.609316 | orchestrator | 2026-01-06 01:23:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:34.659359 | orchestrator | 2026-01-06 
01:23:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:34.661407 | orchestrator | 2026-01-06 01:23:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:34.661559 | orchestrator | 2026-01-06 01:23:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:37.708716 | orchestrator | 2026-01-06 01:23:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:37.710205 | orchestrator | 2026-01-06 01:23:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:37.710248 | orchestrator | 2026-01-06 01:23:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:40.761276 | orchestrator | 2026-01-06 01:23:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:40.763111 | orchestrator | 2026-01-06 01:23:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:40.763146 | orchestrator | 2026-01-06 01:23:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:43.812073 | orchestrator | 2026-01-06 01:23:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:43.814504 | orchestrator | 2026-01-06 01:23:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:43.814574 | orchestrator | 2026-01-06 01:23:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:46.866835 | orchestrator | 2026-01-06 01:23:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:46.868506 | orchestrator | 2026-01-06 01:23:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:46.868572 | orchestrator | 2026-01-06 01:23:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:49.921447 | orchestrator | 2026-01-06 01:23:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:23:49.923168 | orchestrator | 2026-01-06 01:23:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:49.923212 | orchestrator | 2026-01-06 01:23:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:52.961213 | orchestrator | 2026-01-06 01:23:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:52.961814 | orchestrator | 2026-01-06 01:23:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:52.962093 | orchestrator | 2026-01-06 01:23:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:56.013948 | orchestrator | 2026-01-06 01:23:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:56.015193 | orchestrator | 2026-01-06 01:23:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:56.015246 | orchestrator | 2026-01-06 01:23:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:23:59.065592 | orchestrator | 2026-01-06 01:23:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:23:59.067720 | orchestrator | 2026-01-06 01:23:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:23:59.067852 | orchestrator | 2026-01-06 01:23:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:02.111827 | orchestrator | 2026-01-06 01:24:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:02.112952 | orchestrator | 2026-01-06 01:24:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:02.112999 | orchestrator | 2026-01-06 01:24:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:05.158942 | orchestrator | 2026-01-06 01:24:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:05.161564 | orchestrator | 2026-01-06 01:24:05 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:05.161797 | orchestrator | 2026-01-06 01:24:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:08.213298 | orchestrator | 2026-01-06 01:24:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:08.215235 | orchestrator | 2026-01-06 01:24:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:08.215323 | orchestrator | 2026-01-06 01:24:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:11.266782 | orchestrator | 2026-01-06 01:24:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:11.267453 | orchestrator | 2026-01-06 01:24:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:11.267491 | orchestrator | 2026-01-06 01:24:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:14.317332 | orchestrator | 2026-01-06 01:24:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:14.319187 | orchestrator | 2026-01-06 01:24:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:14.319296 | orchestrator | 2026-01-06 01:24:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:17.366466 | orchestrator | 2026-01-06 01:24:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:17.368877 | orchestrator | 2026-01-06 01:24:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:17.368957 | orchestrator | 2026-01-06 01:24:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:20.419867 | orchestrator | 2026-01-06 01:24:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:20.420770 | orchestrator | 2026-01-06 01:24:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:24:20.420970 | orchestrator | 2026-01-06 01:24:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:23.480311 | orchestrator | 2026-01-06 01:24:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:23.482084 | orchestrator | 2026-01-06 01:24:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:23.482194 | orchestrator | 2026-01-06 01:24:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:26.537982 | orchestrator | 2026-01-06 01:24:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:26.539872 | orchestrator | 2026-01-06 01:24:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:26.539949 | orchestrator | 2026-01-06 01:24:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:29.582576 | orchestrator | 2026-01-06 01:24:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:29.582851 | orchestrator | 2026-01-06 01:24:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:29.582875 | orchestrator | 2026-01-06 01:24:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:32.636128 | orchestrator | 2026-01-06 01:24:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:32.637854 | orchestrator | 2026-01-06 01:24:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:32.638097 | orchestrator | 2026-01-06 01:24:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:35.686850 | orchestrator | 2026-01-06 01:24:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:35.688337 | orchestrator | 2026-01-06 01:24:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:35.688578 | orchestrator | 2026-01-06 01:24:35 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 01:24:38.739586 | orchestrator | 2026-01-06 01:24:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:38.741018 | orchestrator | 2026-01-06 01:24:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:38.741068 | orchestrator | 2026-01-06 01:24:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:41.791231 | orchestrator | 2026-01-06 01:24:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:41.791995 | orchestrator | 2026-01-06 01:24:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:41.792297 | orchestrator | 2026-01-06 01:24:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:44.836410 | orchestrator | 2026-01-06 01:24:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:44.837991 | orchestrator | 2026-01-06 01:24:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:44.838140 | orchestrator | 2026-01-06 01:24:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:47.885487 | orchestrator | 2026-01-06 01:24:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:47.887754 | orchestrator | 2026-01-06 01:24:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:47.887816 | orchestrator | 2026-01-06 01:24:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:50.933622 | orchestrator | 2026-01-06 01:24:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:50.934725 | orchestrator | 2026-01-06 01:24:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:50.934793 | orchestrator | 2026-01-06 01:24:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:53.979251 | orchestrator | 2026-01-06 
01:24:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:53.979467 | orchestrator | 2026-01-06 01:24:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:53.979488 | orchestrator | 2026-01-06 01:24:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:24:57.030064 | orchestrator | 2026-01-06 01:24:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:24:57.030792 | orchestrator | 2026-01-06 01:24:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:24:57.031017 | orchestrator | 2026-01-06 01:24:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:00.072964 | orchestrator | 2026-01-06 01:25:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:00.075030 | orchestrator | 2026-01-06 01:25:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:00.075106 | orchestrator | 2026-01-06 01:25:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:03.119464 | orchestrator | 2026-01-06 01:25:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:03.120535 | orchestrator | 2026-01-06 01:25:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:03.120556 | orchestrator | 2026-01-06 01:25:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:06.162788 | orchestrator | 2026-01-06 01:25:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:06.164681 | orchestrator | 2026-01-06 01:25:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:06.165043 | orchestrator | 2026-01-06 01:25:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:09.220513 | orchestrator | 2026-01-06 01:25:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:25:09.223205 | orchestrator | 2026-01-06 01:25:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:09.223322 | orchestrator | 2026-01-06 01:25:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:12.283260 | orchestrator | 2026-01-06 01:25:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:12.285186 | orchestrator | 2026-01-06 01:25:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:12.285248 | orchestrator | 2026-01-06 01:25:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:15.333150 | orchestrator | 2026-01-06 01:25:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:15.336669 | orchestrator | 2026-01-06 01:25:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:15.336800 | orchestrator | 2026-01-06 01:25:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:18.392390 | orchestrator | 2026-01-06 01:25:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:18.393948 | orchestrator | 2026-01-06 01:25:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:18.393982 | orchestrator | 2026-01-06 01:25:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:21.441831 | orchestrator | 2026-01-06 01:25:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:21.444122 | orchestrator | 2026-01-06 01:25:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:21.444205 | orchestrator | 2026-01-06 01:25:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:24.494641 | orchestrator | 2026-01-06 01:25:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:24.496550 | orchestrator | 2026-01-06 01:25:24 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:24.496775 | orchestrator | 2026-01-06 01:25:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:27.549276 | orchestrator | 2026-01-06 01:25:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:27.550453 | orchestrator | 2026-01-06 01:25:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:27.550519 | orchestrator | 2026-01-06 01:25:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:30.597810 | orchestrator | 2026-01-06 01:25:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:30.599273 | orchestrator | 2026-01-06 01:25:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:30.599324 | orchestrator | 2026-01-06 01:25:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:33.649684 | orchestrator | 2026-01-06 01:25:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:33.651008 | orchestrator | 2026-01-06 01:25:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:33.651056 | orchestrator | 2026-01-06 01:25:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:36.699176 | orchestrator | 2026-01-06 01:25:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:36.700481 | orchestrator | 2026-01-06 01:25:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:36.700578 | orchestrator | 2026-01-06 01:25:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:39.751659 | orchestrator | 2026-01-06 01:25:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:39.753248 | orchestrator | 2026-01-06 01:25:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:25:39.753320 | orchestrator | 2026-01-06 01:25:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:42.808410 | orchestrator | 2026-01-06 01:25:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:42.809579 | orchestrator | 2026-01-06 01:25:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:42.809643 | orchestrator | 2026-01-06 01:25:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:45.856988 | orchestrator | 2026-01-06 01:25:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:45.859038 | orchestrator | 2026-01-06 01:25:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:45.859105 | orchestrator | 2026-01-06 01:25:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:48.909559 | orchestrator | 2026-01-06 01:25:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:48.911664 | orchestrator | 2026-01-06 01:25:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:48.911720 | orchestrator | 2026-01-06 01:25:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:51.961834 | orchestrator | 2026-01-06 01:25:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:51.963612 | orchestrator | 2026-01-06 01:25:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:51.963655 | orchestrator | 2026-01-06 01:25:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:25:55.008812 | orchestrator | 2026-01-06 01:25:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:55.009108 | orchestrator | 2026-01-06 01:25:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:55.009156 | orchestrator | 2026-01-06 01:25:55 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 01:25:58.056434 | orchestrator | 2026-01-06 01:25:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:25:58.058214 | orchestrator | 2026-01-06 01:25:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:25:58.058296 | orchestrator | 2026-01-06 01:25:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:01.100689 | orchestrator | 2026-01-06 01:26:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:01.102448 | orchestrator | 2026-01-06 01:26:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:01.102485 | orchestrator | 2026-01-06 01:26:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:04.158298 | orchestrator | 2026-01-06 01:26:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:04.159854 | orchestrator | 2026-01-06 01:26:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:04.159913 | orchestrator | 2026-01-06 01:26:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:07.212477 | orchestrator | 2026-01-06 01:26:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:07.213885 | orchestrator | 2026-01-06 01:26:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:07.214128 | orchestrator | 2026-01-06 01:26:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:10.265889 | orchestrator | 2026-01-06 01:26:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:10.268065 | orchestrator | 2026-01-06 01:26:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:10.268110 | orchestrator | 2026-01-06 01:26:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:13.314518 | orchestrator | 2026-01-06 
01:26:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:13.317240 | orchestrator | 2026-01-06 01:26:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:13.317291 | orchestrator | 2026-01-06 01:26:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:16.364414 | orchestrator | 2026-01-06 01:26:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:16.365927 | orchestrator | 2026-01-06 01:26:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:16.365959 | orchestrator | 2026-01-06 01:26:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:19.413525 | orchestrator | 2026-01-06 01:26:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:19.415607 | orchestrator | 2026-01-06 01:26:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:19.415720 | orchestrator | 2026-01-06 01:26:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:22.463426 | orchestrator | 2026-01-06 01:26:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:22.464391 | orchestrator | 2026-01-06 01:26:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:22.464465 | orchestrator | 2026-01-06 01:26:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:25.513726 | orchestrator | 2026-01-06 01:26:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:25.515051 | orchestrator | 2026-01-06 01:26:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:25.515099 | orchestrator | 2026-01-06 01:26:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:28.562586 | orchestrator | 2026-01-06 01:26:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:26:28.564464 | orchestrator | 2026-01-06 01:26:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:28.564551 | orchestrator | 2026-01-06 01:26:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:31.613548 | orchestrator | 2026-01-06 01:26:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:31.614927 | orchestrator | 2026-01-06 01:26:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:31.615008 | orchestrator | 2026-01-06 01:26:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:34.664527 | orchestrator | 2026-01-06 01:26:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:34.666366 | orchestrator | 2026-01-06 01:26:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:34.666408 | orchestrator | 2026-01-06 01:26:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:37.713365 | orchestrator | 2026-01-06 01:26:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:37.715127 | orchestrator | 2026-01-06 01:26:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:37.715171 | orchestrator | 2026-01-06 01:26:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:40.764177 | orchestrator | 2026-01-06 01:26:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:40.766659 | orchestrator | 2026-01-06 01:26:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:40.767169 | orchestrator | 2026-01-06 01:26:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:43.809045 | orchestrator | 2026-01-06 01:26:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:43.810142 | orchestrator | 2026-01-06 01:26:43 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:43.810370 | orchestrator | 2026-01-06 01:26:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:46.865663 | orchestrator | 2026-01-06 01:26:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:46.866481 | orchestrator | 2026-01-06 01:26:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:46.866532 | orchestrator | 2026-01-06 01:26:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:49.913664 | orchestrator | 2026-01-06 01:26:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:49.915850 | orchestrator | 2026-01-06 01:26:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:49.916075 | orchestrator | 2026-01-06 01:26:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:52.960551 | orchestrator | 2026-01-06 01:26:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:52.962572 | orchestrator | 2026-01-06 01:26:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:52.962701 | orchestrator | 2026-01-06 01:26:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:56.006738 | orchestrator | 2026-01-06 01:26:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:56.008203 | orchestrator | 2026-01-06 01:26:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:26:56.008747 | orchestrator | 2026-01-06 01:26:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:26:59.056759 | orchestrator | 2026-01-06 01:26:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:26:59.057531 | orchestrator | 2026-01-06 01:26:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:26:59.057575 | orchestrator | 2026-01-06 01:26:59 | INFO  | Wait 1 second(s) until the next check
2026-01-06 01:27:02.106990 | orchestrator | 2026-01-06 01:27:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 01:27:02.109701 | orchestrator | 2026-01-06 01:27:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 01:27:02.109790 | orchestrator | 2026-01-06 01:27:02 | INFO  | Wait 1 second(s) until the next check
[... identical poll cycle (both tasks in state STARTED, then "Wait 1 second(s) until the next check") repeated roughly every 3 seconds from 01:27:05 through 01:31:58 ...]
2026-01-06 01:32:01.072280 | orchestrator | 2026-01-06 01:32:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 01:32:01.073700 | orchestrator | 2026-01-06 01:32:01 | INFO 
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:01.073965 | orchestrator | 2026-01-06 01:32:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:04.119982 | orchestrator | 2026-01-06 01:32:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:04.121551 | orchestrator | 2026-01-06 01:32:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:04.121590 | orchestrator | 2026-01-06 01:32:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:07.167360 | orchestrator | 2026-01-06 01:32:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:07.169420 | orchestrator | 2026-01-06 01:32:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:07.169469 | orchestrator | 2026-01-06 01:32:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:10.213721 | orchestrator | 2026-01-06 01:32:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:10.215546 | orchestrator | 2026-01-06 01:32:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:10.215745 | orchestrator | 2026-01-06 01:32:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:13.268094 | orchestrator | 2026-01-06 01:32:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:13.270448 | orchestrator | 2026-01-06 01:32:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:13.270575 | orchestrator | 2026-01-06 01:32:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:16.317854 | orchestrator | 2026-01-06 01:32:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:16.319502 | orchestrator | 2026-01-06 01:32:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:32:16.319567 | orchestrator | 2026-01-06 01:32:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:19.366432 | orchestrator | 2026-01-06 01:32:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:19.368514 | orchestrator | 2026-01-06 01:32:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:19.369153 | orchestrator | 2026-01-06 01:32:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:22.416959 | orchestrator | 2026-01-06 01:32:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:22.419611 | orchestrator | 2026-01-06 01:32:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:22.419696 | orchestrator | 2026-01-06 01:32:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:25.460375 | orchestrator | 2026-01-06 01:32:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:25.461706 | orchestrator | 2026-01-06 01:32:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:25.461745 | orchestrator | 2026-01-06 01:32:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:28.496213 | orchestrator | 2026-01-06 01:32:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:28.497556 | orchestrator | 2026-01-06 01:32:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:28.497612 | orchestrator | 2026-01-06 01:32:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:31.547660 | orchestrator | 2026-01-06 01:32:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:31.548525 | orchestrator | 2026-01-06 01:32:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:31.548566 | orchestrator | 2026-01-06 01:32:31 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 01:32:34.591607 | orchestrator | 2026-01-06 01:32:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:34.593638 | orchestrator | 2026-01-06 01:32:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:34.593774 | orchestrator | 2026-01-06 01:32:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:37.639629 | orchestrator | 2026-01-06 01:32:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:37.641401 | orchestrator | 2026-01-06 01:32:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:37.641486 | orchestrator | 2026-01-06 01:32:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:40.692783 | orchestrator | 2026-01-06 01:32:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:40.694793 | orchestrator | 2026-01-06 01:32:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:40.695102 | orchestrator | 2026-01-06 01:32:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:43.744101 | orchestrator | 2026-01-06 01:32:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:43.745466 | orchestrator | 2026-01-06 01:32:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:43.745563 | orchestrator | 2026-01-06 01:32:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:46.792922 | orchestrator | 2026-01-06 01:32:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:46.794612 | orchestrator | 2026-01-06 01:32:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:46.794646 | orchestrator | 2026-01-06 01:32:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:49.838310 | orchestrator | 2026-01-06 
01:32:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:49.839551 | orchestrator | 2026-01-06 01:32:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:49.839589 | orchestrator | 2026-01-06 01:32:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:52.889087 | orchestrator | 2026-01-06 01:32:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:52.890333 | orchestrator | 2026-01-06 01:32:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:52.890370 | orchestrator | 2026-01-06 01:32:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:55.942412 | orchestrator | 2026-01-06 01:32:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:55.944414 | orchestrator | 2026-01-06 01:32:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:55.944476 | orchestrator | 2026-01-06 01:32:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:32:58.995811 | orchestrator | 2026-01-06 01:32:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:32:58.997407 | orchestrator | 2026-01-06 01:32:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:32:58.997468 | orchestrator | 2026-01-06 01:32:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:02.046347 | orchestrator | 2026-01-06 01:33:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:02.048517 | orchestrator | 2026-01-06 01:33:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:02.048566 | orchestrator | 2026-01-06 01:33:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:05.095619 | orchestrator | 2026-01-06 01:33:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:33:05.097049 | orchestrator | 2026-01-06 01:33:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:05.097267 | orchestrator | 2026-01-06 01:33:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:08.144189 | orchestrator | 2026-01-06 01:33:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:08.145787 | orchestrator | 2026-01-06 01:33:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:08.145856 | orchestrator | 2026-01-06 01:33:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:11.195387 | orchestrator | 2026-01-06 01:33:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:11.196947 | orchestrator | 2026-01-06 01:33:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:11.197143 | orchestrator | 2026-01-06 01:33:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:14.251616 | orchestrator | 2026-01-06 01:33:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:14.252764 | orchestrator | 2026-01-06 01:33:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:14.252799 | orchestrator | 2026-01-06 01:33:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:17.304728 | orchestrator | 2026-01-06 01:33:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:17.306251 | orchestrator | 2026-01-06 01:33:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:17.306340 | orchestrator | 2026-01-06 01:33:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:20.352110 | orchestrator | 2026-01-06 01:33:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:20.354327 | orchestrator | 2026-01-06 01:33:20 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:20.354384 | orchestrator | 2026-01-06 01:33:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:23.396439 | orchestrator | 2026-01-06 01:33:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:23.397948 | orchestrator | 2026-01-06 01:33:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:23.398113 | orchestrator | 2026-01-06 01:33:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:26.441702 | orchestrator | 2026-01-06 01:33:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:26.442711 | orchestrator | 2026-01-06 01:33:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:26.442791 | orchestrator | 2026-01-06 01:33:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:29.494744 | orchestrator | 2026-01-06 01:33:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:29.496713 | orchestrator | 2026-01-06 01:33:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:29.496800 | orchestrator | 2026-01-06 01:33:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:32.541792 | orchestrator | 2026-01-06 01:33:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:32.545050 | orchestrator | 2026-01-06 01:33:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:32.545136 | orchestrator | 2026-01-06 01:33:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:35.595261 | orchestrator | 2026-01-06 01:33:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:35.596682 | orchestrator | 2026-01-06 01:33:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:33:35.596751 | orchestrator | 2026-01-06 01:33:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:38.647857 | orchestrator | 2026-01-06 01:33:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:38.649223 | orchestrator | 2026-01-06 01:33:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:38.649279 | orchestrator | 2026-01-06 01:33:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:41.699355 | orchestrator | 2026-01-06 01:33:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:41.702127 | orchestrator | 2026-01-06 01:33:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:41.702206 | orchestrator | 2026-01-06 01:33:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:44.749688 | orchestrator | 2026-01-06 01:33:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:44.751127 | orchestrator | 2026-01-06 01:33:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:44.751161 | orchestrator | 2026-01-06 01:33:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:47.796504 | orchestrator | 2026-01-06 01:33:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:47.798186 | orchestrator | 2026-01-06 01:33:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:47.798241 | orchestrator | 2026-01-06 01:33:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:50.848611 | orchestrator | 2026-01-06 01:33:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:50.850166 | orchestrator | 2026-01-06 01:33:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:50.850256 | orchestrator | 2026-01-06 01:33:50 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 01:33:53.897336 | orchestrator | 2026-01-06 01:33:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:53.900444 | orchestrator | 2026-01-06 01:33:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:53.900577 | orchestrator | 2026-01-06 01:33:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:56.941691 | orchestrator | 2026-01-06 01:33:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:56.942727 | orchestrator | 2026-01-06 01:33:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:56.942774 | orchestrator | 2026-01-06 01:33:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:33:59.992969 | orchestrator | 2026-01-06 01:33:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:33:59.994371 | orchestrator | 2026-01-06 01:33:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:33:59.994444 | orchestrator | 2026-01-06 01:33:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:03.042756 | orchestrator | 2026-01-06 01:34:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:03.046251 | orchestrator | 2026-01-06 01:34:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:03.046424 | orchestrator | 2026-01-06 01:34:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:06.089006 | orchestrator | 2026-01-06 01:34:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:06.091000 | orchestrator | 2026-01-06 01:34:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:06.091112 | orchestrator | 2026-01-06 01:34:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:09.143792 | orchestrator | 2026-01-06 
01:34:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:09.146616 | orchestrator | 2026-01-06 01:34:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:09.146877 | orchestrator | 2026-01-06 01:34:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:12.194477 | orchestrator | 2026-01-06 01:34:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:12.196285 | orchestrator | 2026-01-06 01:34:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:12.196346 | orchestrator | 2026-01-06 01:34:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:15.240329 | orchestrator | 2026-01-06 01:34:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:15.241281 | orchestrator | 2026-01-06 01:34:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:15.241313 | orchestrator | 2026-01-06 01:34:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:18.287488 | orchestrator | 2026-01-06 01:34:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:18.290741 | orchestrator | 2026-01-06 01:34:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:18.290894 | orchestrator | 2026-01-06 01:34:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:21.338480 | orchestrator | 2026-01-06 01:34:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:21.340232 | orchestrator | 2026-01-06 01:34:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:21.340282 | orchestrator | 2026-01-06 01:34:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:24.394345 | orchestrator | 2026-01-06 01:34:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:34:24.395977 | orchestrator | 2026-01-06 01:34:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:24.396064 | orchestrator | 2026-01-06 01:34:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:27.444649 | orchestrator | 2026-01-06 01:34:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:27.445493 | orchestrator | 2026-01-06 01:34:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:27.445521 | orchestrator | 2026-01-06 01:34:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:30.489625 | orchestrator | 2026-01-06 01:34:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:30.490967 | orchestrator | 2026-01-06 01:34:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:30.491016 | orchestrator | 2026-01-06 01:34:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:33.549794 | orchestrator | 2026-01-06 01:34:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:33.552406 | orchestrator | 2026-01-06 01:34:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:33.552478 | orchestrator | 2026-01-06 01:34:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:36.599447 | orchestrator | 2026-01-06 01:34:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:36.600522 | orchestrator | 2026-01-06 01:34:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:36.600626 | orchestrator | 2026-01-06 01:34:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:39.646983 | orchestrator | 2026-01-06 01:34:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:39.649325 | orchestrator | 2026-01-06 01:34:39 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:39.649390 | orchestrator | 2026-01-06 01:34:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:42.696392 | orchestrator | 2026-01-06 01:34:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:42.697373 | orchestrator | 2026-01-06 01:34:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:42.697443 | orchestrator | 2026-01-06 01:34:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:45.745202 | orchestrator | 2026-01-06 01:34:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:45.747060 | orchestrator | 2026-01-06 01:34:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:45.747145 | orchestrator | 2026-01-06 01:34:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:48.795013 | orchestrator | 2026-01-06 01:34:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:48.796459 | orchestrator | 2026-01-06 01:34:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:48.796530 | orchestrator | 2026-01-06 01:34:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:51.844045 | orchestrator | 2026-01-06 01:34:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:51.845883 | orchestrator | 2026-01-06 01:34:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:51.846113 | orchestrator | 2026-01-06 01:34:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:54.895264 | orchestrator | 2026-01-06 01:34:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:54.896443 | orchestrator | 2026-01-06 01:34:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:34:54.896477 | orchestrator | 2026-01-06 01:34:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:34:57.946508 | orchestrator | 2026-01-06 01:34:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:34:57.948280 | orchestrator | 2026-01-06 01:34:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:34:57.948352 | orchestrator | 2026-01-06 01:34:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:01.001220 | orchestrator | 2026-01-06 01:35:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:01.002897 | orchestrator | 2026-01-06 01:35:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:01.002944 | orchestrator | 2026-01-06 01:35:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:04.047337 | orchestrator | 2026-01-06 01:35:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:04.048128 | orchestrator | 2026-01-06 01:35:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:04.048181 | orchestrator | 2026-01-06 01:35:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:07.097429 | orchestrator | 2026-01-06 01:35:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:07.099470 | orchestrator | 2026-01-06 01:35:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:07.099803 | orchestrator | 2026-01-06 01:35:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:10.143124 | orchestrator | 2026-01-06 01:35:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:10.144978 | orchestrator | 2026-01-06 01:35:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:10.145047 | orchestrator | 2026-01-06 01:35:10 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 01:35:13.190436 | orchestrator | 2026-01-06 01:35:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:13.192973 | orchestrator | 2026-01-06 01:35:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:13.193050 | orchestrator | 2026-01-06 01:35:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:16.247247 | orchestrator | 2026-01-06 01:35:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:16.249478 | orchestrator | 2026-01-06 01:35:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:16.249524 | orchestrator | 2026-01-06 01:35:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:19.302303 | orchestrator | 2026-01-06 01:35:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:19.304509 | orchestrator | 2026-01-06 01:35:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:19.304790 | orchestrator | 2026-01-06 01:35:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:22.352503 | orchestrator | 2026-01-06 01:35:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:22.354881 | orchestrator | 2026-01-06 01:35:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:22.354958 | orchestrator | 2026-01-06 01:35:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:25.406393 | orchestrator | 2026-01-06 01:35:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:25.406823 | orchestrator | 2026-01-06 01:35:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:25.406904 | orchestrator | 2026-01-06 01:35:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:28.455395 | orchestrator | 2026-01-06 
01:35:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:28.458669 | orchestrator | 2026-01-06 01:35:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:28.458812 | orchestrator | 2026-01-06 01:35:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:31.510500 | orchestrator | 2026-01-06 01:35:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:31.514369 | orchestrator | 2026-01-06 01:35:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:31.514452 | orchestrator | 2026-01-06 01:35:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:34.561317 | orchestrator | 2026-01-06 01:35:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:34.562462 | orchestrator | 2026-01-06 01:35:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:34.562541 | orchestrator | 2026-01-06 01:35:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:37.606597 | orchestrator | 2026-01-06 01:35:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:37.608239 | orchestrator | 2026-01-06 01:35:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:37.608343 | orchestrator | 2026-01-06 01:35:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:40.658297 | orchestrator | 2026-01-06 01:35:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:35:40.660689 | orchestrator | 2026-01-06 01:35:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:35:40.660819 | orchestrator | 2026-01-06 01:35:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:35:43.707637 | orchestrator | 2026-01-06 01:35:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:35:43.709398 | orchestrator | 2026-01-06 01:35:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 01:35:43.709463 | orchestrator | 2026-01-06 01:35:43 | INFO  | Wait 1 second(s) until the next check
2026-01-06 01:35:46.758450 | orchestrator | 2026-01-06 01:35:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 01:35:46.759592 | orchestrator | 2026-01-06 01:35:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 01:35:46.759721 | orchestrator | 2026-01-06 01:35:46 | INFO  | Wait 1 second(s) until the next check
[... identical polling output elided: tasks de94e18a-e5f1-4cb3-816d-70e021913dd3 and 7b21748d-c57a-4b51-995c-2c6f5f088b85 reported in state STARTED every ~3 seconds from 01:35:49 through 01:41:13 ...]
2026-01-06 01:41:16.176266 | orchestrator | 2026-01-06 01:41:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 01:41:16.180198 | orchestrator | 2026-01-06 01:41:16 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:16.180350 | orchestrator | 2026-01-06 01:41:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:19.226348 | orchestrator | 2026-01-06 01:41:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:19.228441 | orchestrator | 2026-01-06 01:41:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:19.228559 | orchestrator | 2026-01-06 01:41:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:22.285289 | orchestrator | 2026-01-06 01:41:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:22.288528 | orchestrator | 2026-01-06 01:41:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:22.288572 | orchestrator | 2026-01-06 01:41:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:25.337900 | orchestrator | 2026-01-06 01:41:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:25.340660 | orchestrator | 2026-01-06 01:41:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:25.340709 | orchestrator | 2026-01-06 01:41:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:28.389722 | orchestrator | 2026-01-06 01:41:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:28.390926 | orchestrator | 2026-01-06 01:41:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:28.390963 | orchestrator | 2026-01-06 01:41:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:31.439343 | orchestrator | 2026-01-06 01:41:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:31.440464 | orchestrator | 2026-01-06 01:41:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:41:31.440529 | orchestrator | 2026-01-06 01:41:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:34.488217 | orchestrator | 2026-01-06 01:41:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:34.489528 | orchestrator | 2026-01-06 01:41:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:34.489569 | orchestrator | 2026-01-06 01:41:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:37.534375 | orchestrator | 2026-01-06 01:41:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:37.535090 | orchestrator | 2026-01-06 01:41:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:37.535206 | orchestrator | 2026-01-06 01:41:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:40.582753 | orchestrator | 2026-01-06 01:41:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:40.584183 | orchestrator | 2026-01-06 01:41:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:40.584265 | orchestrator | 2026-01-06 01:41:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:43.628312 | orchestrator | 2026-01-06 01:41:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:43.629521 | orchestrator | 2026-01-06 01:41:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:43.629562 | orchestrator | 2026-01-06 01:41:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:46.677610 | orchestrator | 2026-01-06 01:41:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:46.679743 | orchestrator | 2026-01-06 01:41:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:46.679957 | orchestrator | 2026-01-06 01:41:46 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 01:41:49.727756 | orchestrator | 2026-01-06 01:41:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:49.728829 | orchestrator | 2026-01-06 01:41:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:49.728937 | orchestrator | 2026-01-06 01:41:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:52.779118 | orchestrator | 2026-01-06 01:41:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:52.780971 | orchestrator | 2026-01-06 01:41:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:52.781009 | orchestrator | 2026-01-06 01:41:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:55.830718 | orchestrator | 2026-01-06 01:41:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:55.832939 | orchestrator | 2026-01-06 01:41:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:55.833093 | orchestrator | 2026-01-06 01:41:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:41:58.878965 | orchestrator | 2026-01-06 01:41:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:41:58.880914 | orchestrator | 2026-01-06 01:41:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:41:58.881020 | orchestrator | 2026-01-06 01:41:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:01.927937 | orchestrator | 2026-01-06 01:42:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:01.929342 | orchestrator | 2026-01-06 01:42:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:01.929436 | orchestrator | 2026-01-06 01:42:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:04.978656 | orchestrator | 2026-01-06 
01:42:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:04.980316 | orchestrator | 2026-01-06 01:42:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:04.980406 | orchestrator | 2026-01-06 01:42:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:08.033082 | orchestrator | 2026-01-06 01:42:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:08.035564 | orchestrator | 2026-01-06 01:42:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:08.035627 | orchestrator | 2026-01-06 01:42:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:11.089384 | orchestrator | 2026-01-06 01:42:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:11.090608 | orchestrator | 2026-01-06 01:42:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:11.091331 | orchestrator | 2026-01-06 01:42:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:14.144616 | orchestrator | 2026-01-06 01:42:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:14.146620 | orchestrator | 2026-01-06 01:42:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:14.146656 | orchestrator | 2026-01-06 01:42:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:17.203085 | orchestrator | 2026-01-06 01:42:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:17.204856 | orchestrator | 2026-01-06 01:42:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:17.204996 | orchestrator | 2026-01-06 01:42:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:20.248645 | orchestrator | 2026-01-06 01:42:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:42:20.251011 | orchestrator | 2026-01-06 01:42:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:20.251343 | orchestrator | 2026-01-06 01:42:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:23.300082 | orchestrator | 2026-01-06 01:42:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:23.301423 | orchestrator | 2026-01-06 01:42:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:23.301463 | orchestrator | 2026-01-06 01:42:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:26.342428 | orchestrator | 2026-01-06 01:42:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:26.343686 | orchestrator | 2026-01-06 01:42:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:26.343717 | orchestrator | 2026-01-06 01:42:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:29.392663 | orchestrator | 2026-01-06 01:42:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:29.394731 | orchestrator | 2026-01-06 01:42:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:29.394790 | orchestrator | 2026-01-06 01:42:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:32.449446 | orchestrator | 2026-01-06 01:42:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:32.451799 | orchestrator | 2026-01-06 01:42:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:32.451856 | orchestrator | 2026-01-06 01:42:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:35.502100 | orchestrator | 2026-01-06 01:42:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:35.504185 | orchestrator | 2026-01-06 01:42:35 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:35.504569 | orchestrator | 2026-01-06 01:42:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:38.563328 | orchestrator | 2026-01-06 01:42:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:38.565768 | orchestrator | 2026-01-06 01:42:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:38.565816 | orchestrator | 2026-01-06 01:42:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:41.613354 | orchestrator | 2026-01-06 01:42:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:41.615788 | orchestrator | 2026-01-06 01:42:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:41.615847 | orchestrator | 2026-01-06 01:42:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:44.664949 | orchestrator | 2026-01-06 01:42:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:44.668328 | orchestrator | 2026-01-06 01:42:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:44.668765 | orchestrator | 2026-01-06 01:42:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:47.714975 | orchestrator | 2026-01-06 01:42:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:47.717987 | orchestrator | 2026-01-06 01:42:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:47.718272 | orchestrator | 2026-01-06 01:42:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:50.766451 | orchestrator | 2026-01-06 01:42:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:50.767822 | orchestrator | 2026-01-06 01:42:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:42:50.767870 | orchestrator | 2026-01-06 01:42:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:53.812247 | orchestrator | 2026-01-06 01:42:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:53.813536 | orchestrator | 2026-01-06 01:42:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:53.813577 | orchestrator | 2026-01-06 01:42:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:56.860764 | orchestrator | 2026-01-06 01:42:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:56.861281 | orchestrator | 2026-01-06 01:42:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:56.861320 | orchestrator | 2026-01-06 01:42:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:42:59.912261 | orchestrator | 2026-01-06 01:42:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:42:59.912881 | orchestrator | 2026-01-06 01:42:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:42:59.912907 | orchestrator | 2026-01-06 01:42:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:02.960666 | orchestrator | 2026-01-06 01:43:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:02.963648 | orchestrator | 2026-01-06 01:43:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:02.963704 | orchestrator | 2026-01-06 01:43:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:06.017915 | orchestrator | 2026-01-06 01:43:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:06.019336 | orchestrator | 2026-01-06 01:43:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:06.019390 | orchestrator | 2026-01-06 01:43:06 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 01:43:09.073454 | orchestrator | 2026-01-06 01:43:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:09.073904 | orchestrator | 2026-01-06 01:43:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:09.073929 | orchestrator | 2026-01-06 01:43:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:12.123343 | orchestrator | 2026-01-06 01:43:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:12.124866 | orchestrator | 2026-01-06 01:43:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:12.124934 | orchestrator | 2026-01-06 01:43:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:15.172764 | orchestrator | 2026-01-06 01:43:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:15.174893 | orchestrator | 2026-01-06 01:43:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:15.174946 | orchestrator | 2026-01-06 01:43:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:18.222178 | orchestrator | 2026-01-06 01:43:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:18.224793 | orchestrator | 2026-01-06 01:43:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:18.224840 | orchestrator | 2026-01-06 01:43:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:21.276494 | orchestrator | 2026-01-06 01:43:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:21.278712 | orchestrator | 2026-01-06 01:43:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:21.278794 | orchestrator | 2026-01-06 01:43:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:24.325264 | orchestrator | 2026-01-06 
01:43:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:24.326953 | orchestrator | 2026-01-06 01:43:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:24.327021 | orchestrator | 2026-01-06 01:43:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:27.373120 | orchestrator | 2026-01-06 01:43:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:27.373359 | orchestrator | 2026-01-06 01:43:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:27.373381 | orchestrator | 2026-01-06 01:43:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:30.418449 | orchestrator | 2026-01-06 01:43:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:30.419841 | orchestrator | 2026-01-06 01:43:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:30.419880 | orchestrator | 2026-01-06 01:43:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:33.469200 | orchestrator | 2026-01-06 01:43:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:33.470497 | orchestrator | 2026-01-06 01:43:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:33.470534 | orchestrator | 2026-01-06 01:43:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:36.521924 | orchestrator | 2026-01-06 01:43:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:36.524499 | orchestrator | 2026-01-06 01:43:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:36.524555 | orchestrator | 2026-01-06 01:43:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:39.573601 | orchestrator | 2026-01-06 01:43:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:43:39.574956 | orchestrator | 2026-01-06 01:43:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:39.574990 | orchestrator | 2026-01-06 01:43:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:42.618776 | orchestrator | 2026-01-06 01:43:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:42.621501 | orchestrator | 2026-01-06 01:43:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:42.621555 | orchestrator | 2026-01-06 01:43:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:45.665930 | orchestrator | 2026-01-06 01:43:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:45.668241 | orchestrator | 2026-01-06 01:43:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:45.668296 | orchestrator | 2026-01-06 01:43:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:48.710332 | orchestrator | 2026-01-06 01:43:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:48.711762 | orchestrator | 2026-01-06 01:43:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:48.711800 | orchestrator | 2026-01-06 01:43:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:51.758906 | orchestrator | 2026-01-06 01:43:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:51.761037 | orchestrator | 2026-01-06 01:43:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:51.761184 | orchestrator | 2026-01-06 01:43:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:54.801809 | orchestrator | 2026-01-06 01:43:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:54.803450 | orchestrator | 2026-01-06 01:43:54 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:54.803501 | orchestrator | 2026-01-06 01:43:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:43:57.847482 | orchestrator | 2026-01-06 01:43:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:43:57.849678 | orchestrator | 2026-01-06 01:43:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:43:57.849739 | orchestrator | 2026-01-06 01:43:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:00.897023 | orchestrator | 2026-01-06 01:44:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:00.898245 | orchestrator | 2026-01-06 01:44:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:00.898368 | orchestrator | 2026-01-06 01:44:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:03.943678 | orchestrator | 2026-01-06 01:44:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:03.944770 | orchestrator | 2026-01-06 01:44:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:03.944814 | orchestrator | 2026-01-06 01:44:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:07.000740 | orchestrator | 2026-01-06 01:44:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:07.002324 | orchestrator | 2026-01-06 01:44:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:07.002704 | orchestrator | 2026-01-06 01:44:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:10.049417 | orchestrator | 2026-01-06 01:44:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:10.051841 | orchestrator | 2026-01-06 01:44:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:44:10.051902 | orchestrator | 2026-01-06 01:44:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:13.098008 | orchestrator | 2026-01-06 01:44:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:13.098903 | orchestrator | 2026-01-06 01:44:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:13.098946 | orchestrator | 2026-01-06 01:44:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:16.157652 | orchestrator | 2026-01-06 01:44:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:16.159518 | orchestrator | 2026-01-06 01:44:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:16.159609 | orchestrator | 2026-01-06 01:44:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:19.204695 | orchestrator | 2026-01-06 01:44:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:19.206471 | orchestrator | 2026-01-06 01:44:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:19.206765 | orchestrator | 2026-01-06 01:44:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:22.249537 | orchestrator | 2026-01-06 01:44:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:22.252234 | orchestrator | 2026-01-06 01:44:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:22.252389 | orchestrator | 2026-01-06 01:44:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:25.299782 | orchestrator | 2026-01-06 01:44:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:25.302794 | orchestrator | 2026-01-06 01:44:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:25.302893 | orchestrator | 2026-01-06 01:44:25 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 01:44:28.350999 | orchestrator | 2026-01-06 01:44:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:28.353962 | orchestrator | 2026-01-06 01:44:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:28.354005 | orchestrator | 2026-01-06 01:44:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:31.408207 | orchestrator | 2026-01-06 01:44:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:31.410166 | orchestrator | 2026-01-06 01:44:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:31.410228 | orchestrator | 2026-01-06 01:44:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:34.457257 | orchestrator | 2026-01-06 01:44:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:34.458171 | orchestrator | 2026-01-06 01:44:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:34.458200 | orchestrator | 2026-01-06 01:44:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:37.507224 | orchestrator | 2026-01-06 01:44:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:37.509079 | orchestrator | 2026-01-06 01:44:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:37.509178 | orchestrator | 2026-01-06 01:44:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:40.553942 | orchestrator | 2026-01-06 01:44:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:40.554738 | orchestrator | 2026-01-06 01:44:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:40.554793 | orchestrator | 2026-01-06 01:44:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:43.600852 | orchestrator | 2026-01-06 
01:44:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:43.601926 | orchestrator | 2026-01-06 01:44:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:43.602695 | orchestrator | 2026-01-06 01:44:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:46.654510 | orchestrator | 2026-01-06 01:44:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:46.655800 | orchestrator | 2026-01-06 01:44:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:46.655919 | orchestrator | 2026-01-06 01:44:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:49.700061 | orchestrator | 2026-01-06 01:44:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:49.700921 | orchestrator | 2026-01-06 01:44:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:49.700955 | orchestrator | 2026-01-06 01:44:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:52.754649 | orchestrator | 2026-01-06 01:44:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:52.755473 | orchestrator | 2026-01-06 01:44:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:52.755568 | orchestrator | 2026-01-06 01:44:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:55.799585 | orchestrator | 2026-01-06 01:44:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:44:55.800642 | orchestrator | 2026-01-06 01:44:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:55.800670 | orchestrator | 2026-01-06 01:44:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:44:58.851097 | orchestrator | 2026-01-06 01:44:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:44:58.851924 | orchestrator | 2026-01-06 01:44:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:44:58.851971 | orchestrator | 2026-01-06 01:44:58 | INFO  | Wait 1 second(s) until the next check
[... identical status polls elided: from 01:44:58 to 01:50:15 the orchestrator checks roughly every 3 seconds; each check reports tasks de94e18a-e5f1-4cb3-816d-70e021913dd3 and 7b21748d-c57a-4b51-995c-2c6f5f088b85 still in state STARTED, followed by "Wait 1 second(s) until the next check" ...]
2026-01-06 01:50:15.904787 | orchestrator | 2026-01-06 01:50:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state
STARTED 2026-01-06 01:50:15.906200 | orchestrator | 2026-01-06 01:50:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:15.906476 | orchestrator | 2026-01-06 01:50:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:18.955662 | orchestrator | 2026-01-06 01:50:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:18.959129 | orchestrator | 2026-01-06 01:50:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:18.959213 | orchestrator | 2026-01-06 01:50:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:22.013837 | orchestrator | 2026-01-06 01:50:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:22.018374 | orchestrator | 2026-01-06 01:50:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:22.018465 | orchestrator | 2026-01-06 01:50:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:25.068433 | orchestrator | 2026-01-06 01:50:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:25.070446 | orchestrator | 2026-01-06 01:50:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:25.070478 | orchestrator | 2026-01-06 01:50:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:28.114700 | orchestrator | 2026-01-06 01:50:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:28.116142 | orchestrator | 2026-01-06 01:50:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:28.116198 | orchestrator | 2026-01-06 01:50:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:31.166087 | orchestrator | 2026-01-06 01:50:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:31.168594 | orchestrator | 2026-01-06 01:50:31 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:31.168630 | orchestrator | 2026-01-06 01:50:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:34.219505 | orchestrator | 2026-01-06 01:50:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:34.222204 | orchestrator | 2026-01-06 01:50:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:34.222304 | orchestrator | 2026-01-06 01:50:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:37.269511 | orchestrator | 2026-01-06 01:50:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:37.271200 | orchestrator | 2026-01-06 01:50:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:37.271275 | orchestrator | 2026-01-06 01:50:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:40.318183 | orchestrator | 2026-01-06 01:50:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:40.319398 | orchestrator | 2026-01-06 01:50:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:40.319475 | orchestrator | 2026-01-06 01:50:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:43.364481 | orchestrator | 2026-01-06 01:50:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:43.366432 | orchestrator | 2026-01-06 01:50:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:43.367010 | orchestrator | 2026-01-06 01:50:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:46.411909 | orchestrator | 2026-01-06 01:50:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:46.413680 | orchestrator | 2026-01-06 01:50:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:50:46.413739 | orchestrator | 2026-01-06 01:50:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:49.462929 | orchestrator | 2026-01-06 01:50:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:49.464806 | orchestrator | 2026-01-06 01:50:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:49.465065 | orchestrator | 2026-01-06 01:50:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:52.512596 | orchestrator | 2026-01-06 01:50:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:52.513623 | orchestrator | 2026-01-06 01:50:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:52.513650 | orchestrator | 2026-01-06 01:50:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:55.563675 | orchestrator | 2026-01-06 01:50:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:55.566826 | orchestrator | 2026-01-06 01:50:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:55.566882 | orchestrator | 2026-01-06 01:50:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:50:58.610354 | orchestrator | 2026-01-06 01:50:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:50:58.611573 | orchestrator | 2026-01-06 01:50:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:50:58.611612 | orchestrator | 2026-01-06 01:50:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:01.655829 | orchestrator | 2026-01-06 01:51:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:01.658332 | orchestrator | 2026-01-06 01:51:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:01.658414 | orchestrator | 2026-01-06 01:51:01 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 01:51:04.709902 | orchestrator | 2026-01-06 01:51:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:04.713015 | orchestrator | 2026-01-06 01:51:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:04.713108 | orchestrator | 2026-01-06 01:51:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:07.757912 | orchestrator | 2026-01-06 01:51:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:07.758313 | orchestrator | 2026-01-06 01:51:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:07.758339 | orchestrator | 2026-01-06 01:51:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:10.803031 | orchestrator | 2026-01-06 01:51:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:10.803854 | orchestrator | 2026-01-06 01:51:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:10.803926 | orchestrator | 2026-01-06 01:51:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:13.846487 | orchestrator | 2026-01-06 01:51:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:13.848367 | orchestrator | 2026-01-06 01:51:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:13.848419 | orchestrator | 2026-01-06 01:51:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:16.890727 | orchestrator | 2026-01-06 01:51:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:16.893116 | orchestrator | 2026-01-06 01:51:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:16.893189 | orchestrator | 2026-01-06 01:51:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:19.935933 | orchestrator | 2026-01-06 
01:51:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:19.937081 | orchestrator | 2026-01-06 01:51:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:19.937143 | orchestrator | 2026-01-06 01:51:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:22.976329 | orchestrator | 2026-01-06 01:51:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:22.977186 | orchestrator | 2026-01-06 01:51:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:22.977229 | orchestrator | 2026-01-06 01:51:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:26.026754 | orchestrator | 2026-01-06 01:51:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:26.028323 | orchestrator | 2026-01-06 01:51:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:26.028371 | orchestrator | 2026-01-06 01:51:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:29.069907 | orchestrator | 2026-01-06 01:51:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:29.071633 | orchestrator | 2026-01-06 01:51:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:29.071703 | orchestrator | 2026-01-06 01:51:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:32.120746 | orchestrator | 2026-01-06 01:51:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:32.122332 | orchestrator | 2026-01-06 01:51:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:32.122372 | orchestrator | 2026-01-06 01:51:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:35.160582 | orchestrator | 2026-01-06 01:51:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:51:35.163720 | orchestrator | 2026-01-06 01:51:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:35.163831 | orchestrator | 2026-01-06 01:51:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:38.211754 | orchestrator | 2026-01-06 01:51:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:38.216263 | orchestrator | 2026-01-06 01:51:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:38.216330 | orchestrator | 2026-01-06 01:51:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:41.264092 | orchestrator | 2026-01-06 01:51:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:41.264514 | orchestrator | 2026-01-06 01:51:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:41.264544 | orchestrator | 2026-01-06 01:51:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:44.305663 | orchestrator | 2026-01-06 01:51:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:44.308459 | orchestrator | 2026-01-06 01:51:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:44.308521 | orchestrator | 2026-01-06 01:51:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:47.357637 | orchestrator | 2026-01-06 01:51:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:47.360132 | orchestrator | 2026-01-06 01:51:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:47.397419 | orchestrator | 2026-01-06 01:51:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:50.409977 | orchestrator | 2026-01-06 01:51:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:50.412313 | orchestrator | 2026-01-06 01:51:50 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:50.412390 | orchestrator | 2026-01-06 01:51:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:53.463596 | orchestrator | 2026-01-06 01:51:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:53.465514 | orchestrator | 2026-01-06 01:51:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:53.465674 | orchestrator | 2026-01-06 01:51:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:56.505774 | orchestrator | 2026-01-06 01:51:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:56.506272 | orchestrator | 2026-01-06 01:51:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:56.506317 | orchestrator | 2026-01-06 01:51:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:51:59.551850 | orchestrator | 2026-01-06 01:51:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:51:59.552579 | orchestrator | 2026-01-06 01:51:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:51:59.552771 | orchestrator | 2026-01-06 01:51:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:02.600143 | orchestrator | 2026-01-06 01:52:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:02.601695 | orchestrator | 2026-01-06 01:52:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:02.601778 | orchestrator | 2026-01-06 01:52:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:05.652941 | orchestrator | 2026-01-06 01:52:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:05.653838 | orchestrator | 2026-01-06 01:52:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:52:05.653877 | orchestrator | 2026-01-06 01:52:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:08.700063 | orchestrator | 2026-01-06 01:52:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:08.701630 | orchestrator | 2026-01-06 01:52:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:08.701729 | orchestrator | 2026-01-06 01:52:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:11.750607 | orchestrator | 2026-01-06 01:52:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:11.752447 | orchestrator | 2026-01-06 01:52:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:11.752771 | orchestrator | 2026-01-06 01:52:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:14.820886 | orchestrator | 2026-01-06 01:52:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:14.823251 | orchestrator | 2026-01-06 01:52:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:14.823551 | orchestrator | 2026-01-06 01:52:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:17.869950 | orchestrator | 2026-01-06 01:52:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:17.872410 | orchestrator | 2026-01-06 01:52:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:17.872466 | orchestrator | 2026-01-06 01:52:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:20.926589 | orchestrator | 2026-01-06 01:52:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:20.928218 | orchestrator | 2026-01-06 01:52:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:20.928283 | orchestrator | 2026-01-06 01:52:20 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 01:52:23.977023 | orchestrator | 2026-01-06 01:52:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:23.980018 | orchestrator | 2026-01-06 01:52:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:23.980095 | orchestrator | 2026-01-06 01:52:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:27.029986 | orchestrator | 2026-01-06 01:52:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:27.031801 | orchestrator | 2026-01-06 01:52:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:27.031997 | orchestrator | 2026-01-06 01:52:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:30.068686 | orchestrator | 2026-01-06 01:52:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:30.070460 | orchestrator | 2026-01-06 01:52:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:30.070538 | orchestrator | 2026-01-06 01:52:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:33.117814 | orchestrator | 2026-01-06 01:52:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:33.119888 | orchestrator | 2026-01-06 01:52:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:33.119945 | orchestrator | 2026-01-06 01:52:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:36.171257 | orchestrator | 2026-01-06 01:52:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:36.172876 | orchestrator | 2026-01-06 01:52:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:36.172932 | orchestrator | 2026-01-06 01:52:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:39.222714 | orchestrator | 2026-01-06 
01:52:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:39.223905 | orchestrator | 2026-01-06 01:52:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:39.223959 | orchestrator | 2026-01-06 01:52:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:42.276230 | orchestrator | 2026-01-06 01:52:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:42.277617 | orchestrator | 2026-01-06 01:52:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:42.277704 | orchestrator | 2026-01-06 01:52:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:45.329154 | orchestrator | 2026-01-06 01:52:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:45.331909 | orchestrator | 2026-01-06 01:52:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:45.331943 | orchestrator | 2026-01-06 01:52:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:48.380509 | orchestrator | 2026-01-06 01:52:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:48.382092 | orchestrator | 2026-01-06 01:52:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:48.382170 | orchestrator | 2026-01-06 01:52:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:51.428608 | orchestrator | 2026-01-06 01:52:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:51.429865 | orchestrator | 2026-01-06 01:52:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:51.430374 | orchestrator | 2026-01-06 01:52:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:54.476603 | orchestrator | 2026-01-06 01:52:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:52:54.477867 | orchestrator | 2026-01-06 01:52:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:54.477900 | orchestrator | 2026-01-06 01:52:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:52:57.524790 | orchestrator | 2026-01-06 01:52:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:52:57.526519 | orchestrator | 2026-01-06 01:52:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:52:57.526572 | orchestrator | 2026-01-06 01:52:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:00.568495 | orchestrator | 2026-01-06 01:53:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:00.568637 | orchestrator | 2026-01-06 01:53:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:00.568651 | orchestrator | 2026-01-06 01:53:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:03.617929 | orchestrator | 2026-01-06 01:53:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:03.619371 | orchestrator | 2026-01-06 01:53:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:03.619435 | orchestrator | 2026-01-06 01:53:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:06.669311 | orchestrator | 2026-01-06 01:53:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:06.670953 | orchestrator | 2026-01-06 01:53:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:06.671052 | orchestrator | 2026-01-06 01:53:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:09.719509 | orchestrator | 2026-01-06 01:53:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:09.721119 | orchestrator | 2026-01-06 01:53:09 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:09.721635 | orchestrator | 2026-01-06 01:53:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:12.771697 | orchestrator | 2026-01-06 01:53:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:12.774108 | orchestrator | 2026-01-06 01:53:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:12.774199 | orchestrator | 2026-01-06 01:53:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:15.819436 | orchestrator | 2026-01-06 01:53:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:15.822004 | orchestrator | 2026-01-06 01:53:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:15.822136 | orchestrator | 2026-01-06 01:53:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:18.867761 | orchestrator | 2026-01-06 01:53:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:18.870394 | orchestrator | 2026-01-06 01:53:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:18.870511 | orchestrator | 2026-01-06 01:53:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:21.918562 | orchestrator | 2026-01-06 01:53:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:21.919813 | orchestrator | 2026-01-06 01:53:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:21.920064 | orchestrator | 2026-01-06 01:53:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:24.959316 | orchestrator | 2026-01-06 01:53:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:24.962396 | orchestrator | 2026-01-06 01:53:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
01:53:24.962602 | orchestrator | 2026-01-06 01:53:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:28.011989 | orchestrator | 2026-01-06 01:53:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:28.013017 | orchestrator | 2026-01-06 01:53:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:28.013040 | orchestrator | 2026-01-06 01:53:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:31.054523 | orchestrator | 2026-01-06 01:53:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:31.056095 | orchestrator | 2026-01-06 01:53:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:31.056193 | orchestrator | 2026-01-06 01:53:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:34.100679 | orchestrator | 2026-01-06 01:53:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:34.102757 | orchestrator | 2026-01-06 01:53:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:34.102866 | orchestrator | 2026-01-06 01:53:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:37.156792 | orchestrator | 2026-01-06 01:53:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:37.158608 | orchestrator | 2026-01-06 01:53:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:37.158676 | orchestrator | 2026-01-06 01:53:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:53:40.208043 | orchestrator | 2026-01-06 01:53:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:53:40.209715 | orchestrator | 2026-01-06 01:53:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:53:40.209787 | orchestrator | 2026-01-06 01:53:40 | INFO  | Wait 1 second(s) 
until the next check
2026-01-06 01:53:43.256445 | orchestrator | 2026-01-06 01:53:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 01:53:43.257573 | orchestrator | 2026-01-06 01:53:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 01:53:43.257716 | orchestrator | 2026-01-06 01:53:43 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated every ~3 seconds from 01:53:46 to 01:58:54 elided; both tasks remain in state STARTED throughout ...]
2026-01-06 01:58:57.371091 | orchestrator | 2026-01-06 01:58:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 01:58:57.372654 | orchestrator | 2026-01-06 01:58:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 01:58:57.373124 | orchestrator | 2026-01-06 01:58:57 | INFO  | Wait 1 second(s)
until the next check 2026-01-06 01:59:00.423061 | orchestrator | 2026-01-06 01:59:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:00.424799 | orchestrator | 2026-01-06 01:59:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:00.424849 | orchestrator | 2026-01-06 01:59:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:03.476408 | orchestrator | 2026-01-06 01:59:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:03.478325 | orchestrator | 2026-01-06 01:59:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:03.478385 | orchestrator | 2026-01-06 01:59:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:06.517373 | orchestrator | 2026-01-06 01:59:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:06.517994 | orchestrator | 2026-01-06 01:59:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:06.518067 | orchestrator | 2026-01-06 01:59:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:09.565890 | orchestrator | 2026-01-06 01:59:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:09.567547 | orchestrator | 2026-01-06 01:59:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:09.568206 | orchestrator | 2026-01-06 01:59:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:12.617074 | orchestrator | 2026-01-06 01:59:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:12.618228 | orchestrator | 2026-01-06 01:59:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:12.618359 | orchestrator | 2026-01-06 01:59:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:15.665350 | orchestrator | 2026-01-06 
01:59:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:15.667138 | orchestrator | 2026-01-06 01:59:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:15.667193 | orchestrator | 2026-01-06 01:59:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:18.709971 | orchestrator | 2026-01-06 01:59:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:18.712670 | orchestrator | 2026-01-06 01:59:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:18.712763 | orchestrator | 2026-01-06 01:59:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:21.758983 | orchestrator | 2026-01-06 01:59:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:21.761865 | orchestrator | 2026-01-06 01:59:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:21.761927 | orchestrator | 2026-01-06 01:59:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:24.802660 | orchestrator | 2026-01-06 01:59:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:24.805150 | orchestrator | 2026-01-06 01:59:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:24.805945 | orchestrator | 2026-01-06 01:59:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:27.852532 | orchestrator | 2026-01-06 01:59:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:27.853449 | orchestrator | 2026-01-06 01:59:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:27.853486 | orchestrator | 2026-01-06 01:59:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:30.899254 | orchestrator | 2026-01-06 01:59:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 01:59:30.902695 | orchestrator | 2026-01-06 01:59:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:30.902768 | orchestrator | 2026-01-06 01:59:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:33.953871 | orchestrator | 2026-01-06 01:59:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:33.955502 | orchestrator | 2026-01-06 01:59:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:33.955537 | orchestrator | 2026-01-06 01:59:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:37.007783 | orchestrator | 2026-01-06 01:59:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:37.009268 | orchestrator | 2026-01-06 01:59:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:37.009316 | orchestrator | 2026-01-06 01:59:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:40.055699 | orchestrator | 2026-01-06 01:59:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:40.057106 | orchestrator | 2026-01-06 01:59:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:40.057275 | orchestrator | 2026-01-06 01:59:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:43.108355 | orchestrator | 2026-01-06 01:59:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:43.111077 | orchestrator | 2026-01-06 01:59:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:43.111165 | orchestrator | 2026-01-06 01:59:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:46.152734 | orchestrator | 2026-01-06 01:59:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:46.154278 | orchestrator | 2026-01-06 01:59:46 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:46.155062 | orchestrator | 2026-01-06 01:59:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:49.202063 | orchestrator | 2026-01-06 01:59:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:49.203929 | orchestrator | 2026-01-06 01:59:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:49.204018 | orchestrator | 2026-01-06 01:59:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:52.254221 | orchestrator | 2026-01-06 01:59:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:52.256838 | orchestrator | 2026-01-06 01:59:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:52.257122 | orchestrator | 2026-01-06 01:59:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:55.302415 | orchestrator | 2026-01-06 01:59:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:55.303415 | orchestrator | 2026-01-06 01:59:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:55.303452 | orchestrator | 2026-01-06 01:59:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 01:59:58.353200 | orchestrator | 2026-01-06 01:59:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 01:59:58.354511 | orchestrator | 2026-01-06 01:59:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 01:59:58.354681 | orchestrator | 2026-01-06 01:59:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:01.399820 | orchestrator | 2026-01-06 02:00:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:01.402110 | orchestrator | 2026-01-06 02:00:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:00:01.402238 | orchestrator | 2026-01-06 02:00:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:04.451956 | orchestrator | 2026-01-06 02:00:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:04.453341 | orchestrator | 2026-01-06 02:00:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:04.453423 | orchestrator | 2026-01-06 02:00:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:07.497975 | orchestrator | 2026-01-06 02:00:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:07.499251 | orchestrator | 2026-01-06 02:00:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:07.499762 | orchestrator | 2026-01-06 02:00:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:10.548536 | orchestrator | 2026-01-06 02:00:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:10.550596 | orchestrator | 2026-01-06 02:00:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:10.550705 | orchestrator | 2026-01-06 02:00:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:13.591566 | orchestrator | 2026-01-06 02:00:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:13.593761 | orchestrator | 2026-01-06 02:00:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:13.594396 | orchestrator | 2026-01-06 02:00:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:16.633719 | orchestrator | 2026-01-06 02:00:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:16.636566 | orchestrator | 2026-01-06 02:00:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:16.636643 | orchestrator | 2026-01-06 02:00:16 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:00:19.687719 | orchestrator | 2026-01-06 02:00:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:19.690289 | orchestrator | 2026-01-06 02:00:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:19.690352 | orchestrator | 2026-01-06 02:00:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:22.743336 | orchestrator | 2026-01-06 02:00:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:22.745085 | orchestrator | 2026-01-06 02:00:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:22.745134 | orchestrator | 2026-01-06 02:00:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:25.794008 | orchestrator | 2026-01-06 02:00:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:25.797055 | orchestrator | 2026-01-06 02:00:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:25.797129 | orchestrator | 2026-01-06 02:00:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:28.852206 | orchestrator | 2026-01-06 02:00:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:28.855679 | orchestrator | 2026-01-06 02:00:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:28.855745 | orchestrator | 2026-01-06 02:00:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:31.907543 | orchestrator | 2026-01-06 02:00:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:31.916057 | orchestrator | 2026-01-06 02:00:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:31.916157 | orchestrator | 2026-01-06 02:00:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:34.966258 | orchestrator | 2026-01-06 
02:00:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:34.969084 | orchestrator | 2026-01-06 02:00:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:34.969224 | orchestrator | 2026-01-06 02:00:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:38.014316 | orchestrator | 2026-01-06 02:00:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:38.016532 | orchestrator | 2026-01-06 02:00:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:38.016611 | orchestrator | 2026-01-06 02:00:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:41.061020 | orchestrator | 2026-01-06 02:00:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:41.062535 | orchestrator | 2026-01-06 02:00:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:41.062801 | orchestrator | 2026-01-06 02:00:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:44.109544 | orchestrator | 2026-01-06 02:00:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:44.112389 | orchestrator | 2026-01-06 02:00:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:44.112466 | orchestrator | 2026-01-06 02:00:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:47.154231 | orchestrator | 2026-01-06 02:00:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:47.156139 | orchestrator | 2026-01-06 02:00:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:47.156200 | orchestrator | 2026-01-06 02:00:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:50.200656 | orchestrator | 2026-01-06 02:00:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:00:50.202200 | orchestrator | 2026-01-06 02:00:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:50.202254 | orchestrator | 2026-01-06 02:00:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:53.250011 | orchestrator | 2026-01-06 02:00:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:53.251501 | orchestrator | 2026-01-06 02:00:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:53.251577 | orchestrator | 2026-01-06 02:00:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:56.297677 | orchestrator | 2026-01-06 02:00:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:56.298733 | orchestrator | 2026-01-06 02:00:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:56.298787 | orchestrator | 2026-01-06 02:00:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:00:59.344853 | orchestrator | 2026-01-06 02:00:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:00:59.346458 | orchestrator | 2026-01-06 02:00:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:00:59.346496 | orchestrator | 2026-01-06 02:00:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:02.391660 | orchestrator | 2026-01-06 02:01:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:02.392647 | orchestrator | 2026-01-06 02:01:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:02.392756 | orchestrator | 2026-01-06 02:01:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:05.440499 | orchestrator | 2026-01-06 02:01:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:05.442185 | orchestrator | 2026-01-06 02:01:05 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:05.442249 | orchestrator | 2026-01-06 02:01:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:08.491312 | orchestrator | 2026-01-06 02:01:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:08.494180 | orchestrator | 2026-01-06 02:01:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:08.494287 | orchestrator | 2026-01-06 02:01:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:11.534773 | orchestrator | 2026-01-06 02:01:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:11.536183 | orchestrator | 2026-01-06 02:01:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:11.536239 | orchestrator | 2026-01-06 02:01:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:14.581561 | orchestrator | 2026-01-06 02:01:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:14.583164 | orchestrator | 2026-01-06 02:01:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:14.583404 | orchestrator | 2026-01-06 02:01:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:17.640486 | orchestrator | 2026-01-06 02:01:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:17.643181 | orchestrator | 2026-01-06 02:01:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:17.643416 | orchestrator | 2026-01-06 02:01:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:20.693374 | orchestrator | 2026-01-06 02:01:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:20.695060 | orchestrator | 2026-01-06 02:01:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:01:20.695096 | orchestrator | 2026-01-06 02:01:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:23.751268 | orchestrator | 2026-01-06 02:01:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:23.753256 | orchestrator | 2026-01-06 02:01:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:23.753324 | orchestrator | 2026-01-06 02:01:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:26.803033 | orchestrator | 2026-01-06 02:01:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:26.805344 | orchestrator | 2026-01-06 02:01:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:26.805402 | orchestrator | 2026-01-06 02:01:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:29.849321 | orchestrator | 2026-01-06 02:01:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:29.850659 | orchestrator | 2026-01-06 02:01:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:29.850782 | orchestrator | 2026-01-06 02:01:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:32.896805 | orchestrator | 2026-01-06 02:01:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:32.899121 | orchestrator | 2026-01-06 02:01:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:32.899183 | orchestrator | 2026-01-06 02:01:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:35.951202 | orchestrator | 2026-01-06 02:01:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:35.953472 | orchestrator | 2026-01-06 02:01:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:35.953520 | orchestrator | 2026-01-06 02:01:35 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:01:39.002488 | orchestrator | 2026-01-06 02:01:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:39.012385 | orchestrator | 2026-01-06 02:01:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:39.012511 | orchestrator | 2026-01-06 02:01:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:42.060240 | orchestrator | 2026-01-06 02:01:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:42.061853 | orchestrator | 2026-01-06 02:01:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:42.061905 | orchestrator | 2026-01-06 02:01:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:45.106147 | orchestrator | 2026-01-06 02:01:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:45.107987 | orchestrator | 2026-01-06 02:01:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:45.108039 | orchestrator | 2026-01-06 02:01:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:48.159624 | orchestrator | 2026-01-06 02:01:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:48.161135 | orchestrator | 2026-01-06 02:01:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:48.161167 | orchestrator | 2026-01-06 02:01:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:51.210356 | orchestrator | 2026-01-06 02:01:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:51.212045 | orchestrator | 2026-01-06 02:01:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:51.212106 | orchestrator | 2026-01-06 02:01:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:54.258519 | orchestrator | 2026-01-06 
02:01:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:54.259025 | orchestrator | 2026-01-06 02:01:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:54.259063 | orchestrator | 2026-01-06 02:01:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:01:57.303463 | orchestrator | 2026-01-06 02:01:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:01:57.305093 | orchestrator | 2026-01-06 02:01:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:01:57.305116 | orchestrator | 2026-01-06 02:01:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:00.352479 | orchestrator | 2026-01-06 02:02:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:00.354854 | orchestrator | 2026-01-06 02:02:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:00.354912 | orchestrator | 2026-01-06 02:02:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:03.409253 | orchestrator | 2026-01-06 02:02:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:03.412055 | orchestrator | 2026-01-06 02:02:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:03.412181 | orchestrator | 2026-01-06 02:02:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:06.449419 | orchestrator | 2026-01-06 02:02:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:06.451132 | orchestrator | 2026-01-06 02:02:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:06.451253 | orchestrator | 2026-01-06 02:02:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:09.497977 | orchestrator | 2026-01-06 02:02:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:02:09.501129 | orchestrator | 2026-01-06 02:02:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:09.501177 | orchestrator | 2026-01-06 02:02:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:12.545290 | orchestrator | 2026-01-06 02:02:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:12.549112 | orchestrator | 2026-01-06 02:02:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:12.549200 | orchestrator | 2026-01-06 02:02:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:15.591442 | orchestrator | 2026-01-06 02:02:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:15.593279 | orchestrator | 2026-01-06 02:02:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:15.593317 | orchestrator | 2026-01-06 02:02:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:18.642250 | orchestrator | 2026-01-06 02:02:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:18.644094 | orchestrator | 2026-01-06 02:02:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:18.644178 | orchestrator | 2026-01-06 02:02:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:21.690540 | orchestrator | 2026-01-06 02:02:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:21.692036 | orchestrator | 2026-01-06 02:02:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:21.692244 | orchestrator | 2026-01-06 02:02:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:24.746193 | orchestrator | 2026-01-06 02:02:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:24.746960 | orchestrator | 2026-01-06 02:02:24 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:24.746994 | orchestrator | 2026-01-06 02:02:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:27.797522 | orchestrator | 2026-01-06 02:02:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:27.799719 | orchestrator | 2026-01-06 02:02:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:27.799885 | orchestrator | 2026-01-06 02:02:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:30.844561 | orchestrator | 2026-01-06 02:02:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:30.847380 | orchestrator | 2026-01-06 02:02:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:30.847444 | orchestrator | 2026-01-06 02:02:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:33.895973 | orchestrator | 2026-01-06 02:02:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:33.898391 | orchestrator | 2026-01-06 02:02:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:33.898458 | orchestrator | 2026-01-06 02:02:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:36.946580 | orchestrator | 2026-01-06 02:02:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:36.948739 | orchestrator | 2026-01-06 02:02:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:02:36.948868 | orchestrator | 2026-01-06 02:02:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:02:39.993901 | orchestrator | 2026-01-06 02:02:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:02:39.995261 | orchestrator | 2026-01-06 02:02:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:02:39.995321 | orchestrator | 2026-01-06 02:02:39 | INFO  | Wait 1 second(s) until the next check
2026-01-06 02:02:43.047530 | orchestrator | 2026-01-06 02:02:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 02:02:43.048661 | orchestrator | 2026-01-06 02:02:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 02:02:43.048707 | orchestrator | 2026-01-06 02:02:43 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated every ~3 seconds from 02:02:46 through 02:08:09; both tasks remained in state STARTED throughout ...]
2026-01-06 02:08:12.391346 | orchestrator | 2026-01-06 02:08:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 02:08:12.392844 | orchestrator | 2026-01-06 02:08:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 02:08:12.392892 | orchestrator | 2026-01-06 02:08:12 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:08:15.439608 | orchestrator | 2026-01-06 02:08:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:15.442929 | orchestrator | 2026-01-06 02:08:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:15.443043 | orchestrator | 2026-01-06 02:08:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:18.496792 | orchestrator | 2026-01-06 02:08:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:18.498690 | orchestrator | 2026-01-06 02:08:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:18.498826 | orchestrator | 2026-01-06 02:08:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:21.546666 | orchestrator | 2026-01-06 02:08:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:21.548602 | orchestrator | 2026-01-06 02:08:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:21.548654 | orchestrator | 2026-01-06 02:08:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:24.595868 | orchestrator | 2026-01-06 02:08:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:24.596251 | orchestrator | 2026-01-06 02:08:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:24.596278 | orchestrator | 2026-01-06 02:08:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:27.649107 | orchestrator | 2026-01-06 02:08:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:27.650821 | orchestrator | 2026-01-06 02:08:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:27.650907 | orchestrator | 2026-01-06 02:08:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:30.691231 | orchestrator | 2026-01-06 
02:08:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:30.692235 | orchestrator | 2026-01-06 02:08:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:30.692251 | orchestrator | 2026-01-06 02:08:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:33.735569 | orchestrator | 2026-01-06 02:08:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:33.738355 | orchestrator | 2026-01-06 02:08:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:33.738440 | orchestrator | 2026-01-06 02:08:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:36.777922 | orchestrator | 2026-01-06 02:08:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:36.780086 | orchestrator | 2026-01-06 02:08:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:36.780182 | orchestrator | 2026-01-06 02:08:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:39.826788 | orchestrator | 2026-01-06 02:08:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:39.828135 | orchestrator | 2026-01-06 02:08:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:39.828214 | orchestrator | 2026-01-06 02:08:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:42.876780 | orchestrator | 2026-01-06 02:08:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:42.877726 | orchestrator | 2026-01-06 02:08:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:42.877784 | orchestrator | 2026-01-06 02:08:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:45.920264 | orchestrator | 2026-01-06 02:08:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:08:45.922112 | orchestrator | 2026-01-06 02:08:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:45.922290 | orchestrator | 2026-01-06 02:08:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:48.972774 | orchestrator | 2026-01-06 02:08:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:48.974787 | orchestrator | 2026-01-06 02:08:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:48.974841 | orchestrator | 2026-01-06 02:08:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:52.027364 | orchestrator | 2026-01-06 02:08:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:52.032326 | orchestrator | 2026-01-06 02:08:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:52.032426 | orchestrator | 2026-01-06 02:08:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:55.079129 | orchestrator | 2026-01-06 02:08:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:55.081124 | orchestrator | 2026-01-06 02:08:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:55.081163 | orchestrator | 2026-01-06 02:08:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:08:58.124802 | orchestrator | 2026-01-06 02:08:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:08:58.126744 | orchestrator | 2026-01-06 02:08:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:08:58.126817 | orchestrator | 2026-01-06 02:08:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:01.173846 | orchestrator | 2026-01-06 02:09:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:01.174351 | orchestrator | 2026-01-06 02:09:01 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:01.174403 | orchestrator | 2026-01-06 02:09:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:04.222442 | orchestrator | 2026-01-06 02:09:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:04.224226 | orchestrator | 2026-01-06 02:09:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:04.224256 | orchestrator | 2026-01-06 02:09:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:07.272434 | orchestrator | 2026-01-06 02:09:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:07.273519 | orchestrator | 2026-01-06 02:09:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:07.273896 | orchestrator | 2026-01-06 02:09:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:10.318330 | orchestrator | 2026-01-06 02:09:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:10.319822 | orchestrator | 2026-01-06 02:09:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:10.319868 | orchestrator | 2026-01-06 02:09:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:13.369483 | orchestrator | 2026-01-06 02:09:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:13.371604 | orchestrator | 2026-01-06 02:09:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:13.371683 | orchestrator | 2026-01-06 02:09:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:16.421241 | orchestrator | 2026-01-06 02:09:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:16.422424 | orchestrator | 2026-01-06 02:09:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:09:16.422821 | orchestrator | 2026-01-06 02:09:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:19.472311 | orchestrator | 2026-01-06 02:09:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:19.474454 | orchestrator | 2026-01-06 02:09:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:19.474560 | orchestrator | 2026-01-06 02:09:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:22.521168 | orchestrator | 2026-01-06 02:09:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:22.522400 | orchestrator | 2026-01-06 02:09:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:22.522505 | orchestrator | 2026-01-06 02:09:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:25.568568 | orchestrator | 2026-01-06 02:09:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:25.570278 | orchestrator | 2026-01-06 02:09:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:25.570346 | orchestrator | 2026-01-06 02:09:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:28.620377 | orchestrator | 2026-01-06 02:09:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:28.622510 | orchestrator | 2026-01-06 02:09:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:28.622577 | orchestrator | 2026-01-06 02:09:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:31.669663 | orchestrator | 2026-01-06 02:09:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:31.671221 | orchestrator | 2026-01-06 02:09:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:31.671274 | orchestrator | 2026-01-06 02:09:31 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:09:34.714607 | orchestrator | 2026-01-06 02:09:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:34.716073 | orchestrator | 2026-01-06 02:09:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:34.716158 | orchestrator | 2026-01-06 02:09:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:37.761539 | orchestrator | 2026-01-06 02:09:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:37.763732 | orchestrator | 2026-01-06 02:09:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:37.763795 | orchestrator | 2026-01-06 02:09:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:40.810467 | orchestrator | 2026-01-06 02:09:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:40.812835 | orchestrator | 2026-01-06 02:09:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:40.813249 | orchestrator | 2026-01-06 02:09:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:43.861539 | orchestrator | 2026-01-06 02:09:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:43.863864 | orchestrator | 2026-01-06 02:09:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:43.863906 | orchestrator | 2026-01-06 02:09:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:46.914260 | orchestrator | 2026-01-06 02:09:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:46.915417 | orchestrator | 2026-01-06 02:09:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:46.915485 | orchestrator | 2026-01-06 02:09:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:49.960413 | orchestrator | 2026-01-06 
02:09:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:49.962929 | orchestrator | 2026-01-06 02:09:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:49.963039 | orchestrator | 2026-01-06 02:09:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:53.010198 | orchestrator | 2026-01-06 02:09:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:53.011653 | orchestrator | 2026-01-06 02:09:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:53.011857 | orchestrator | 2026-01-06 02:09:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:56.065417 | orchestrator | 2026-01-06 02:09:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:56.066638 | orchestrator | 2026-01-06 02:09:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:56.066673 | orchestrator | 2026-01-06 02:09:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:09:59.109147 | orchestrator | 2026-01-06 02:09:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:09:59.110678 | orchestrator | 2026-01-06 02:09:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:09:59.110734 | orchestrator | 2026-01-06 02:09:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:02.158516 | orchestrator | 2026-01-06 02:10:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:02.160290 | orchestrator | 2026-01-06 02:10:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:02.160342 | orchestrator | 2026-01-06 02:10:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:05.199860 | orchestrator | 2026-01-06 02:10:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:10:05.201272 | orchestrator | 2026-01-06 02:10:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:05.201328 | orchestrator | 2026-01-06 02:10:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:08.248835 | orchestrator | 2026-01-06 02:10:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:08.250450 | orchestrator | 2026-01-06 02:10:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:08.250557 | orchestrator | 2026-01-06 02:10:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:11.298616 | orchestrator | 2026-01-06 02:10:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:11.300932 | orchestrator | 2026-01-06 02:10:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:11.301033 | orchestrator | 2026-01-06 02:10:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:14.346850 | orchestrator | 2026-01-06 02:10:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:14.349263 | orchestrator | 2026-01-06 02:10:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:14.349363 | orchestrator | 2026-01-06 02:10:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:17.396039 | orchestrator | 2026-01-06 02:10:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:17.397355 | orchestrator | 2026-01-06 02:10:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:17.397417 | orchestrator | 2026-01-06 02:10:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:20.439375 | orchestrator | 2026-01-06 02:10:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:20.441618 | orchestrator | 2026-01-06 02:10:20 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:20.441720 | orchestrator | 2026-01-06 02:10:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:23.487706 | orchestrator | 2026-01-06 02:10:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:23.490294 | orchestrator | 2026-01-06 02:10:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:23.490450 | orchestrator | 2026-01-06 02:10:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:26.539086 | orchestrator | 2026-01-06 02:10:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:26.540672 | orchestrator | 2026-01-06 02:10:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:26.540787 | orchestrator | 2026-01-06 02:10:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:29.593115 | orchestrator | 2026-01-06 02:10:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:29.594521 | orchestrator | 2026-01-06 02:10:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:29.594635 | orchestrator | 2026-01-06 02:10:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:32.644924 | orchestrator | 2026-01-06 02:10:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:32.647103 | orchestrator | 2026-01-06 02:10:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:32.647136 | orchestrator | 2026-01-06 02:10:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:35.699841 | orchestrator | 2026-01-06 02:10:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:35.701373 | orchestrator | 2026-01-06 02:10:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:10:35.701425 | orchestrator | 2026-01-06 02:10:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:38.752328 | orchestrator | 2026-01-06 02:10:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:38.758079 | orchestrator | 2026-01-06 02:10:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:38.758167 | orchestrator | 2026-01-06 02:10:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:41.809095 | orchestrator | 2026-01-06 02:10:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:41.810572 | orchestrator | 2026-01-06 02:10:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:41.810680 | orchestrator | 2026-01-06 02:10:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:44.863113 | orchestrator | 2026-01-06 02:10:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:44.865650 | orchestrator | 2026-01-06 02:10:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:44.865726 | orchestrator | 2026-01-06 02:10:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:47.915480 | orchestrator | 2026-01-06 02:10:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:47.917183 | orchestrator | 2026-01-06 02:10:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:47.917256 | orchestrator | 2026-01-06 02:10:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:50.964145 | orchestrator | 2026-01-06 02:10:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:50.966529 | orchestrator | 2026-01-06 02:10:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:50.966616 | orchestrator | 2026-01-06 02:10:50 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:10:54.023099 | orchestrator | 2026-01-06 02:10:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:54.023242 | orchestrator | 2026-01-06 02:10:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:54.023253 | orchestrator | 2026-01-06 02:10:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:10:57.066380 | orchestrator | 2026-01-06 02:10:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:10:57.068092 | orchestrator | 2026-01-06 02:10:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:10:57.068140 | orchestrator | 2026-01-06 02:10:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:00.110297 | orchestrator | 2026-01-06 02:11:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:00.111199 | orchestrator | 2026-01-06 02:11:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:00.111223 | orchestrator | 2026-01-06 02:11:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:03.154949 | orchestrator | 2026-01-06 02:11:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:03.156511 | orchestrator | 2026-01-06 02:11:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:03.156562 | orchestrator | 2026-01-06 02:11:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:06.190471 | orchestrator | 2026-01-06 02:11:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:06.192330 | orchestrator | 2026-01-06 02:11:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:06.192404 | orchestrator | 2026-01-06 02:11:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:09.238225 | orchestrator | 2026-01-06 
02:11:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:09.239610 | orchestrator | 2026-01-06 02:11:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:09.239959 | orchestrator | 2026-01-06 02:11:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:12.291542 | orchestrator | 2026-01-06 02:11:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:12.294115 | orchestrator | 2026-01-06 02:11:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:12.294172 | orchestrator | 2026-01-06 02:11:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:15.334464 | orchestrator | 2026-01-06 02:11:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:15.336245 | orchestrator | 2026-01-06 02:11:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:15.336475 | orchestrator | 2026-01-06 02:11:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:18.381828 | orchestrator | 2026-01-06 02:11:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:18.384448 | orchestrator | 2026-01-06 02:11:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:18.384491 | orchestrator | 2026-01-06 02:11:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:21.431588 | orchestrator | 2026-01-06 02:11:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:21.433830 | orchestrator | 2026-01-06 02:11:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:21.434385 | orchestrator | 2026-01-06 02:11:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:24.483386 | orchestrator | 2026-01-06 02:11:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:11:24.485686 | orchestrator | 2026-01-06 02:11:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:24.485884 | orchestrator | 2026-01-06 02:11:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:27.542611 | orchestrator | 2026-01-06 02:11:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:27.545382 | orchestrator | 2026-01-06 02:11:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:27.545464 | orchestrator | 2026-01-06 02:11:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:30.597620 | orchestrator | 2026-01-06 02:11:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:30.600683 | orchestrator | 2026-01-06 02:11:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:30.600753 | orchestrator | 2026-01-06 02:11:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:33.652118 | orchestrator | 2026-01-06 02:11:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:33.653870 | orchestrator | 2026-01-06 02:11:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:33.653935 | orchestrator | 2026-01-06 02:11:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:36.698341 | orchestrator | 2026-01-06 02:11:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:36.700099 | orchestrator | 2026-01-06 02:11:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:36.700139 | orchestrator | 2026-01-06 02:11:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:39.749055 | orchestrator | 2026-01-06 02:11:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:39.751222 | orchestrator | 2026-01-06 02:11:39 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:39.751299 | orchestrator | 2026-01-06 02:11:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:42.800602 | orchestrator | 2026-01-06 02:11:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:42.803266 | orchestrator | 2026-01-06 02:11:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:42.803331 | orchestrator | 2026-01-06 02:11:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:45.845281 | orchestrator | 2026-01-06 02:11:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:45.847668 | orchestrator | 2026-01-06 02:11:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:45.847744 | orchestrator | 2026-01-06 02:11:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:48.896612 | orchestrator | 2026-01-06 02:11:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:48.898761 | orchestrator | 2026-01-06 02:11:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:48.898799 | orchestrator | 2026-01-06 02:11:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:51.942612 | orchestrator | 2026-01-06 02:11:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:51.945044 | orchestrator | 2026-01-06 02:11:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:11:51.945091 | orchestrator | 2026-01-06 02:11:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:11:54.989141 | orchestrator | 2026-01-06 02:11:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:11:54.991260 | orchestrator | 2026-01-06 02:11:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:11:54.991312 | orchestrator | 2026-01-06 02:11:54 | INFO  | Wait 1 second(s) until the next check
2026-01-06 02:11:58.043499 | orchestrator | 2026-01-06 02:11:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 02:11:58.044736 | orchestrator | 2026-01-06 02:11:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 02:11:58.044938 | orchestrator | 2026-01-06 02:11:58 | INFO  | Wait 1 second(s) until the next check
[... identical poll cycle repeated roughly every 3 seconds from 02:12:01 through 02:16:53; tasks de94e18a-e5f1-4cb3-816d-70e021913dd3 and 7b21748d-c57a-4b51-995c-2c6f5f088b85 remain in state STARTED throughout ...]
2026-01-06 02:16:53.854961 | orchestrator | 2026-01-06 02:16:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 02:16:53.856695 | orchestrator | 2026-01-06 02:16:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 02:16:53.856733 | orchestrator | 2026-01-06 02:16:53 | INFO  | Wait 1 second(s) until the next check
2026-01-06 02:16:56.908647 | orchestrator | 2026-01-06 02:16:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 02:16:56.911210 | orchestrator | 2026-01-06 02:16:56 | INFO 
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:16:56.911261 | orchestrator | 2026-01-06 02:16:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:16:59.956663 | orchestrator | 2026-01-06 02:16:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:16:59.957879 | orchestrator | 2026-01-06 02:16:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:16:59.958155 | orchestrator | 2026-01-06 02:16:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:03.009967 | orchestrator | 2026-01-06 02:17:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:03.012277 | orchestrator | 2026-01-06 02:17:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:03.012410 | orchestrator | 2026-01-06 02:17:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:06.063525 | orchestrator | 2026-01-06 02:17:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:06.064784 | orchestrator | 2026-01-06 02:17:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:06.065005 | orchestrator | 2026-01-06 02:17:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:09.116075 | orchestrator | 2026-01-06 02:17:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:09.117851 | orchestrator | 2026-01-06 02:17:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:09.117913 | orchestrator | 2026-01-06 02:17:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:12.168646 | orchestrator | 2026-01-06 02:17:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:12.171442 | orchestrator | 2026-01-06 02:17:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:17:12.171542 | orchestrator | 2026-01-06 02:17:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:15.220790 | orchestrator | 2026-01-06 02:17:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:15.221679 | orchestrator | 2026-01-06 02:17:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:15.221789 | orchestrator | 2026-01-06 02:17:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:18.270692 | orchestrator | 2026-01-06 02:17:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:18.275836 | orchestrator | 2026-01-06 02:17:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:18.275913 | orchestrator | 2026-01-06 02:17:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:21.317487 | orchestrator | 2026-01-06 02:17:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:21.319805 | orchestrator | 2026-01-06 02:17:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:21.319873 | orchestrator | 2026-01-06 02:17:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:24.367591 | orchestrator | 2026-01-06 02:17:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:24.370385 | orchestrator | 2026-01-06 02:17:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:24.370462 | orchestrator | 2026-01-06 02:17:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:27.410196 | orchestrator | 2026-01-06 02:17:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:27.411579 | orchestrator | 2026-01-06 02:17:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:27.411668 | orchestrator | 2026-01-06 02:17:27 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:17:30.455920 | orchestrator | 2026-01-06 02:17:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:30.459761 | orchestrator | 2026-01-06 02:17:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:30.459879 | orchestrator | 2026-01-06 02:17:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:33.515864 | orchestrator | 2026-01-06 02:17:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:33.517431 | orchestrator | 2026-01-06 02:17:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:33.517477 | orchestrator | 2026-01-06 02:17:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:36.567528 | orchestrator | 2026-01-06 02:17:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:36.569313 | orchestrator | 2026-01-06 02:17:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:36.569380 | orchestrator | 2026-01-06 02:17:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:39.616352 | orchestrator | 2026-01-06 02:17:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:39.617743 | orchestrator | 2026-01-06 02:17:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:39.617786 | orchestrator | 2026-01-06 02:17:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:42.665684 | orchestrator | 2026-01-06 02:17:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:42.667500 | orchestrator | 2026-01-06 02:17:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:42.667571 | orchestrator | 2026-01-06 02:17:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:45.714979 | orchestrator | 2026-01-06 
02:17:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:45.717953 | orchestrator | 2026-01-06 02:17:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:45.718181 | orchestrator | 2026-01-06 02:17:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:48.764954 | orchestrator | 2026-01-06 02:17:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:48.766276 | orchestrator | 2026-01-06 02:17:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:48.766312 | orchestrator | 2026-01-06 02:17:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:51.810707 | orchestrator | 2026-01-06 02:17:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:51.812360 | orchestrator | 2026-01-06 02:17:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:51.812393 | orchestrator | 2026-01-06 02:17:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:54.857932 | orchestrator | 2026-01-06 02:17:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:54.860443 | orchestrator | 2026-01-06 02:17:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:54.860513 | orchestrator | 2026-01-06 02:17:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:17:57.902608 | orchestrator | 2026-01-06 02:17:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:17:57.904943 | orchestrator | 2026-01-06 02:17:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:17:57.905068 | orchestrator | 2026-01-06 02:17:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:00.956907 | orchestrator | 2026-01-06 02:18:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:18:00.958498 | orchestrator | 2026-01-06 02:18:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:00.958570 | orchestrator | 2026-01-06 02:18:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:04.019428 | orchestrator | 2026-01-06 02:18:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:04.025526 | orchestrator | 2026-01-06 02:18:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:04.025631 | orchestrator | 2026-01-06 02:18:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:07.077428 | orchestrator | 2026-01-06 02:18:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:07.078825 | orchestrator | 2026-01-06 02:18:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:07.078888 | orchestrator | 2026-01-06 02:18:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:10.131457 | orchestrator | 2026-01-06 02:18:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:10.134190 | orchestrator | 2026-01-06 02:18:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:10.134276 | orchestrator | 2026-01-06 02:18:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:13.178283 | orchestrator | 2026-01-06 02:18:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:13.179570 | orchestrator | 2026-01-06 02:18:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:13.179871 | orchestrator | 2026-01-06 02:18:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:16.233201 | orchestrator | 2026-01-06 02:18:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:16.236430 | orchestrator | 2026-01-06 02:18:16 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:16.236522 | orchestrator | 2026-01-06 02:18:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:19.293418 | orchestrator | 2026-01-06 02:18:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:19.294799 | orchestrator | 2026-01-06 02:18:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:19.295072 | orchestrator | 2026-01-06 02:18:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:22.340005 | orchestrator | 2026-01-06 02:18:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:22.340676 | orchestrator | 2026-01-06 02:18:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:22.340718 | orchestrator | 2026-01-06 02:18:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:25.387541 | orchestrator | 2026-01-06 02:18:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:25.391383 | orchestrator | 2026-01-06 02:18:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:25.391733 | orchestrator | 2026-01-06 02:18:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:28.437966 | orchestrator | 2026-01-06 02:18:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:28.440879 | orchestrator | 2026-01-06 02:18:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:28.440945 | orchestrator | 2026-01-06 02:18:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:31.488904 | orchestrator | 2026-01-06 02:18:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:31.490318 | orchestrator | 2026-01-06 02:18:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:18:31.490540 | orchestrator | 2026-01-06 02:18:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:34.534250 | orchestrator | 2026-01-06 02:18:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:34.537194 | orchestrator | 2026-01-06 02:18:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:34.537249 | orchestrator | 2026-01-06 02:18:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:37.589743 | orchestrator | 2026-01-06 02:18:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:37.591259 | orchestrator | 2026-01-06 02:18:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:37.591320 | orchestrator | 2026-01-06 02:18:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:40.642628 | orchestrator | 2026-01-06 02:18:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:40.644852 | orchestrator | 2026-01-06 02:18:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:40.644906 | orchestrator | 2026-01-06 02:18:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:43.693804 | orchestrator | 2026-01-06 02:18:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:43.694802 | orchestrator | 2026-01-06 02:18:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:43.694858 | orchestrator | 2026-01-06 02:18:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:46.743863 | orchestrator | 2026-01-06 02:18:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:46.745917 | orchestrator | 2026-01-06 02:18:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:46.745986 | orchestrator | 2026-01-06 02:18:46 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:18:49.798158 | orchestrator | 2026-01-06 02:18:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:49.799521 | orchestrator | 2026-01-06 02:18:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:49.799575 | orchestrator | 2026-01-06 02:18:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:52.847428 | orchestrator | 2026-01-06 02:18:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:52.849258 | orchestrator | 2026-01-06 02:18:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:52.849326 | orchestrator | 2026-01-06 02:18:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:55.894814 | orchestrator | 2026-01-06 02:18:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:55.896366 | orchestrator | 2026-01-06 02:18:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:55.896417 | orchestrator | 2026-01-06 02:18:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:18:58.943761 | orchestrator | 2026-01-06 02:18:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:18:58.945348 | orchestrator | 2026-01-06 02:18:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:18:58.945427 | orchestrator | 2026-01-06 02:18:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:01.993732 | orchestrator | 2026-01-06 02:19:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:01.996089 | orchestrator | 2026-01-06 02:19:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:01.996160 | orchestrator | 2026-01-06 02:19:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:05.058715 | orchestrator | 2026-01-06 
02:19:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:05.061621 | orchestrator | 2026-01-06 02:19:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:05.061712 | orchestrator | 2026-01-06 02:19:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:08.108731 | orchestrator | 2026-01-06 02:19:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:08.112079 | orchestrator | 2026-01-06 02:19:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:08.112184 | orchestrator | 2026-01-06 02:19:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:11.158332 | orchestrator | 2026-01-06 02:19:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:11.159413 | orchestrator | 2026-01-06 02:19:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:11.159460 | orchestrator | 2026-01-06 02:19:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:14.204667 | orchestrator | 2026-01-06 02:19:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:14.204915 | orchestrator | 2026-01-06 02:19:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:14.205085 | orchestrator | 2026-01-06 02:19:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:17.253874 | orchestrator | 2026-01-06 02:19:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:17.255377 | orchestrator | 2026-01-06 02:19:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:17.255447 | orchestrator | 2026-01-06 02:19:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:20.302965 | orchestrator | 2026-01-06 02:19:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:19:20.305772 | orchestrator | 2026-01-06 02:19:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:20.306368 | orchestrator | 2026-01-06 02:19:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:23.351793 | orchestrator | 2026-01-06 02:19:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:23.354184 | orchestrator | 2026-01-06 02:19:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:23.354262 | orchestrator | 2026-01-06 02:19:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:26.399433 | orchestrator | 2026-01-06 02:19:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:26.401947 | orchestrator | 2026-01-06 02:19:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:26.402004 | orchestrator | 2026-01-06 02:19:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:29.447871 | orchestrator | 2026-01-06 02:19:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:29.449996 | orchestrator | 2026-01-06 02:19:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:29.450147 | orchestrator | 2026-01-06 02:19:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:32.495094 | orchestrator | 2026-01-06 02:19:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:32.497158 | orchestrator | 2026-01-06 02:19:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:32.497213 | orchestrator | 2026-01-06 02:19:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:35.542823 | orchestrator | 2026-01-06 02:19:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:35.544142 | orchestrator | 2026-01-06 02:19:35 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:35.544229 | orchestrator | 2026-01-06 02:19:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:38.596988 | orchestrator | 2026-01-06 02:19:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:38.600589 | orchestrator | 2026-01-06 02:19:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:38.600671 | orchestrator | 2026-01-06 02:19:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:41.646963 | orchestrator | 2026-01-06 02:19:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:41.648350 | orchestrator | 2026-01-06 02:19:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:41.648390 | orchestrator | 2026-01-06 02:19:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:44.691081 | orchestrator | 2026-01-06 02:19:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:44.691819 | orchestrator | 2026-01-06 02:19:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:44.692002 | orchestrator | 2026-01-06 02:19:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:47.737812 | orchestrator | 2026-01-06 02:19:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:47.739427 | orchestrator | 2026-01-06 02:19:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:47.739505 | orchestrator | 2026-01-06 02:19:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:50.786590 | orchestrator | 2026-01-06 02:19:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:50.788370 | orchestrator | 2026-01-06 02:19:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:19:50.788467 | orchestrator | 2026-01-06 02:19:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:53.836625 | orchestrator | 2026-01-06 02:19:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:53.838546 | orchestrator | 2026-01-06 02:19:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:53.838599 | orchestrator | 2026-01-06 02:19:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:56.887237 | orchestrator | 2026-01-06 02:19:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:56.889636 | orchestrator | 2026-01-06 02:19:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:56.889716 | orchestrator | 2026-01-06 02:19:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:19:59.936250 | orchestrator | 2026-01-06 02:19:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:19:59.939160 | orchestrator | 2026-01-06 02:19:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:19:59.939246 | orchestrator | 2026-01-06 02:19:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:20:02.992013 | orchestrator | 2026-01-06 02:20:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:20:02.993225 | orchestrator | 2026-01-06 02:20:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:20:02.993298 | orchestrator | 2026-01-06 02:20:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:20:06.050425 | orchestrator | 2026-01-06 02:20:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:20:06.052089 | orchestrator | 2026-01-06 02:20:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:20:06.052156 | orchestrator | 2026-01-06 02:20:06 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:20:09.100488 | orchestrator | 2026-01-06 02:20:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:20:09.103359 | orchestrator | 2026-01-06 02:20:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:20:09.103604 | orchestrator | 2026-01-06 02:20:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:20:12.153877 | orchestrator | 2026-01-06 02:20:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:20:12.155524 | orchestrator | 2026-01-06 02:20:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:20:12.155643 | orchestrator | 2026-01-06 02:20:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:20:15.200497 | orchestrator | 2026-01-06 02:20:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:20:15.202468 | orchestrator | 2026-01-06 02:20:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:20:15.202535 | orchestrator | 2026-01-06 02:20:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:20:18.248362 | orchestrator | 2026-01-06 02:20:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:20:18.249834 | orchestrator | 2026-01-06 02:20:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:20:18.250077 | orchestrator | 2026-01-06 02:20:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:20:21.295282 | orchestrator | 2026-01-06 02:20:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:20:21.298856 | orchestrator | 2026-01-06 02:20:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:20:21.298930 | orchestrator | 2026-01-06 02:20:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:20:24.342285 | orchestrator | 2026-01-06 
02:20:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:20:24.344888 | orchestrator | 2026-01-06 02:20:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:20:24.344947 | orchestrator | 2026-01-06 02:20:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:20:27.387039 | orchestrator | 2026-01-06 02:20:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:20:27.388957 | orchestrator | 2026-01-06 02:20:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:20:27.389000 | orchestrator | 2026-01-06 02:20:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:20:30.443154 | orchestrator | 2026-01-06 02:20:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:20:30.445115 | orchestrator | 2026-01-06 02:20:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:20:30.445182 | orchestrator | 2026-01-06 02:20:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:20:33.492818 | orchestrator | 2026-01-06 02:20:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:20:33.493050 | orchestrator | 2026-01-06 02:20:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:20:33.493072 | orchestrator | 2026-01-06 02:20:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:20:36.540209 | orchestrator | 2026-01-06 02:20:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:20:36.541932 | orchestrator | 2026-01-06 02:20:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:20:36.542207 | orchestrator | 2026-01-06 02:20:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:20:39.596895 | orchestrator | 2026-01-06 02:20:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED
2026-01-06 02:20:39.599389 | orchestrator | 2026-01-06 02:20:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 02:20:39.599833 | orchestrator | 2026-01-06 02:20:39 | INFO  | Wait 1 second(s) until the next check
2026-01-06 02:20:42.644375 | orchestrator | 2026-01-06 02:20:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 02:20:42.646668 | orchestrator | 2026-01-06 02:20:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 02:20:42.646718 | orchestrator | 2026-01-06 02:20:42 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeats every ~3 seconds from 02:20:45 to 02:26:09; both tasks remain in state STARTED ...]
2026-01-06 02:26:12.065276 | orchestrator | 2026-01-06 02:26:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 02:26:12.068966 | orchestrator | 2026-01-06 02:26:12 | INFO 
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:12.069070 | orchestrator | 2026-01-06 02:26:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:15.109065 | orchestrator | 2026-01-06 02:26:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:15.111914 | orchestrator | 2026-01-06 02:26:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:15.112035 | orchestrator | 2026-01-06 02:26:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:18.154066 | orchestrator | 2026-01-06 02:26:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:18.154354 | orchestrator | 2026-01-06 02:26:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:18.154377 | orchestrator | 2026-01-06 02:26:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:21.201384 | orchestrator | 2026-01-06 02:26:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:21.203352 | orchestrator | 2026-01-06 02:26:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:21.203409 | orchestrator | 2026-01-06 02:26:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:24.251438 | orchestrator | 2026-01-06 02:26:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:24.253389 | orchestrator | 2026-01-06 02:26:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:24.253918 | orchestrator | 2026-01-06 02:26:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:27.301104 | orchestrator | 2026-01-06 02:26:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:27.303568 | orchestrator | 2026-01-06 02:26:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:26:27.303710 | orchestrator | 2026-01-06 02:26:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:30.350696 | orchestrator | 2026-01-06 02:26:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:30.352498 | orchestrator | 2026-01-06 02:26:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:30.352595 | orchestrator | 2026-01-06 02:26:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:33.401028 | orchestrator | 2026-01-06 02:26:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:33.402189 | orchestrator | 2026-01-06 02:26:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:33.402257 | orchestrator | 2026-01-06 02:26:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:36.445564 | orchestrator | 2026-01-06 02:26:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:36.448062 | orchestrator | 2026-01-06 02:26:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:36.448208 | orchestrator | 2026-01-06 02:26:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:39.494509 | orchestrator | 2026-01-06 02:26:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:39.497101 | orchestrator | 2026-01-06 02:26:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:39.497417 | orchestrator | 2026-01-06 02:26:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:42.542912 | orchestrator | 2026-01-06 02:26:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:42.544849 | orchestrator | 2026-01-06 02:26:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:42.544889 | orchestrator | 2026-01-06 02:26:42 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:26:45.590328 | orchestrator | 2026-01-06 02:26:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:45.592013 | orchestrator | 2026-01-06 02:26:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:45.592112 | orchestrator | 2026-01-06 02:26:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:48.638469 | orchestrator | 2026-01-06 02:26:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:48.640209 | orchestrator | 2026-01-06 02:26:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:48.640271 | orchestrator | 2026-01-06 02:26:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:51.692475 | orchestrator | 2026-01-06 02:26:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:51.694524 | orchestrator | 2026-01-06 02:26:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:51.694571 | orchestrator | 2026-01-06 02:26:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:54.742231 | orchestrator | 2026-01-06 02:26:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:54.745639 | orchestrator | 2026-01-06 02:26:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:54.745728 | orchestrator | 2026-01-06 02:26:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:26:57.793046 | orchestrator | 2026-01-06 02:26:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:26:57.794607 | orchestrator | 2026-01-06 02:26:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:26:57.794635 | orchestrator | 2026-01-06 02:26:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:00.847094 | orchestrator | 2026-01-06 
02:27:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:00.848986 | orchestrator | 2026-01-06 02:27:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:00.849060 | orchestrator | 2026-01-06 02:27:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:03.898533 | orchestrator | 2026-01-06 02:27:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:03.900217 | orchestrator | 2026-01-06 02:27:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:03.900252 | orchestrator | 2026-01-06 02:27:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:06.946294 | orchestrator | 2026-01-06 02:27:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:06.948078 | orchestrator | 2026-01-06 02:27:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:06.948302 | orchestrator | 2026-01-06 02:27:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:09.991443 | orchestrator | 2026-01-06 02:27:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:09.992994 | orchestrator | 2026-01-06 02:27:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:09.993080 | orchestrator | 2026-01-06 02:27:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:13.043019 | orchestrator | 2026-01-06 02:27:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:13.044347 | orchestrator | 2026-01-06 02:27:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:13.044404 | orchestrator | 2026-01-06 02:27:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:16.084397 | orchestrator | 2026-01-06 02:27:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:27:16.086194 | orchestrator | 2026-01-06 02:27:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:16.086263 | orchestrator | 2026-01-06 02:27:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:19.128103 | orchestrator | 2026-01-06 02:27:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:19.130741 | orchestrator | 2026-01-06 02:27:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:19.130813 | orchestrator | 2026-01-06 02:27:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:22.186873 | orchestrator | 2026-01-06 02:27:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:22.188939 | orchestrator | 2026-01-06 02:27:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:22.189107 | orchestrator | 2026-01-06 02:27:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:25.232978 | orchestrator | 2026-01-06 02:27:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:25.233991 | orchestrator | 2026-01-06 02:27:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:25.234091 | orchestrator | 2026-01-06 02:27:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:28.279374 | orchestrator | 2026-01-06 02:27:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:28.281045 | orchestrator | 2026-01-06 02:27:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:28.281085 | orchestrator | 2026-01-06 02:27:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:31.326010 | orchestrator | 2026-01-06 02:27:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:31.328027 | orchestrator | 2026-01-06 02:27:31 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:31.328073 | orchestrator | 2026-01-06 02:27:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:34.371149 | orchestrator | 2026-01-06 02:27:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:34.373447 | orchestrator | 2026-01-06 02:27:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:34.373538 | orchestrator | 2026-01-06 02:27:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:37.422578 | orchestrator | 2026-01-06 02:27:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:37.423413 | orchestrator | 2026-01-06 02:27:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:37.423577 | orchestrator | 2026-01-06 02:27:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:40.471633 | orchestrator | 2026-01-06 02:27:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:40.473395 | orchestrator | 2026-01-06 02:27:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:40.473546 | orchestrator | 2026-01-06 02:27:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:43.522144 | orchestrator | 2026-01-06 02:27:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:43.525810 | orchestrator | 2026-01-06 02:27:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:43.525927 | orchestrator | 2026-01-06 02:27:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:46.571418 | orchestrator | 2026-01-06 02:27:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:46.573233 | orchestrator | 2026-01-06 02:27:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:27:46.573323 | orchestrator | 2026-01-06 02:27:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:49.626745 | orchestrator | 2026-01-06 02:27:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:49.627709 | orchestrator | 2026-01-06 02:27:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:49.627751 | orchestrator | 2026-01-06 02:27:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:52.679055 | orchestrator | 2026-01-06 02:27:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:52.680357 | orchestrator | 2026-01-06 02:27:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:52.680406 | orchestrator | 2026-01-06 02:27:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:55.730546 | orchestrator | 2026-01-06 02:27:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:55.732100 | orchestrator | 2026-01-06 02:27:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:55.732151 | orchestrator | 2026-01-06 02:27:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:27:58.787182 | orchestrator | 2026-01-06 02:27:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:27:58.789415 | orchestrator | 2026-01-06 02:27:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:27:58.789531 | orchestrator | 2026-01-06 02:27:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:01.837254 | orchestrator | 2026-01-06 02:28:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:01.839072 | orchestrator | 2026-01-06 02:28:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:01.839137 | orchestrator | 2026-01-06 02:28:01 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:28:04.890678 | orchestrator | 2026-01-06 02:28:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:04.891992 | orchestrator | 2026-01-06 02:28:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:04.892102 | orchestrator | 2026-01-06 02:28:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:07.938740 | orchestrator | 2026-01-06 02:28:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:07.940644 | orchestrator | 2026-01-06 02:28:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:07.940724 | orchestrator | 2026-01-06 02:28:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:10.991211 | orchestrator | 2026-01-06 02:28:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:10.992600 | orchestrator | 2026-01-06 02:28:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:10.992636 | orchestrator | 2026-01-06 02:28:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:14.037060 | orchestrator | 2026-01-06 02:28:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:14.038937 | orchestrator | 2026-01-06 02:28:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:14.039020 | orchestrator | 2026-01-06 02:28:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:17.085605 | orchestrator | 2026-01-06 02:28:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:17.087042 | orchestrator | 2026-01-06 02:28:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:17.087093 | orchestrator | 2026-01-06 02:28:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:20.134516 | orchestrator | 2026-01-06 
02:28:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:20.136588 | orchestrator | 2026-01-06 02:28:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:20.136697 | orchestrator | 2026-01-06 02:28:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:23.189319 | orchestrator | 2026-01-06 02:28:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:23.189905 | orchestrator | 2026-01-06 02:28:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:23.189933 | orchestrator | 2026-01-06 02:28:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:26.239812 | orchestrator | 2026-01-06 02:28:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:26.242262 | orchestrator | 2026-01-06 02:28:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:26.242447 | orchestrator | 2026-01-06 02:28:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:29.291842 | orchestrator | 2026-01-06 02:28:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:29.293064 | orchestrator | 2026-01-06 02:28:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:29.293150 | orchestrator | 2026-01-06 02:28:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:32.343022 | orchestrator | 2026-01-06 02:28:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:32.343941 | orchestrator | 2026-01-06 02:28:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:32.343978 | orchestrator | 2026-01-06 02:28:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:35.388990 | orchestrator | 2026-01-06 02:28:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:28:35.390666 | orchestrator | 2026-01-06 02:28:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:35.390711 | orchestrator | 2026-01-06 02:28:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:38.433282 | orchestrator | 2026-01-06 02:28:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:38.435987 | orchestrator | 2026-01-06 02:28:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:38.436027 | orchestrator | 2026-01-06 02:28:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:41.481696 | orchestrator | 2026-01-06 02:28:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:41.482938 | orchestrator | 2026-01-06 02:28:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:41.482968 | orchestrator | 2026-01-06 02:28:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:44.536030 | orchestrator | 2026-01-06 02:28:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:44.537614 | orchestrator | 2026-01-06 02:28:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:44.537766 | orchestrator | 2026-01-06 02:28:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:47.590626 | orchestrator | 2026-01-06 02:28:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:47.591930 | orchestrator | 2026-01-06 02:28:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:47.591973 | orchestrator | 2026-01-06 02:28:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:50.644241 | orchestrator | 2026-01-06 02:28:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:50.647063 | orchestrator | 2026-01-06 02:28:50 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:50.647125 | orchestrator | 2026-01-06 02:28:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:53.696235 | orchestrator | 2026-01-06 02:28:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:53.697681 | orchestrator | 2026-01-06 02:28:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:53.698357 | orchestrator | 2026-01-06 02:28:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:56.744513 | orchestrator | 2026-01-06 02:28:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:56.747063 | orchestrator | 2026-01-06 02:28:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:56.747177 | orchestrator | 2026-01-06 02:28:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:28:59.799243 | orchestrator | 2026-01-06 02:28:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:28:59.801641 | orchestrator | 2026-01-06 02:28:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:28:59.801678 | orchestrator | 2026-01-06 02:28:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:02.849739 | orchestrator | 2026-01-06 02:29:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:02.851869 | orchestrator | 2026-01-06 02:29:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:02.851934 | orchestrator | 2026-01-06 02:29:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:05.899435 | orchestrator | 2026-01-06 02:29:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:05.901737 | orchestrator | 2026-01-06 02:29:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:29:05.901809 | orchestrator | 2026-01-06 02:29:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:08.949069 | orchestrator | 2026-01-06 02:29:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:08.950569 | orchestrator | 2026-01-06 02:29:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:08.950610 | orchestrator | 2026-01-06 02:29:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:11.996444 | orchestrator | 2026-01-06 02:29:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:11.998413 | orchestrator | 2026-01-06 02:29:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:11.998474 | orchestrator | 2026-01-06 02:29:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:15.056360 | orchestrator | 2026-01-06 02:29:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:15.059104 | orchestrator | 2026-01-06 02:29:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:15.059293 | orchestrator | 2026-01-06 02:29:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:18.105953 | orchestrator | 2026-01-06 02:29:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:18.107891 | orchestrator | 2026-01-06 02:29:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:18.107974 | orchestrator | 2026-01-06 02:29:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:21.149576 | orchestrator | 2026-01-06 02:29:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:21.151267 | orchestrator | 2026-01-06 02:29:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:21.151296 | orchestrator | 2026-01-06 02:29:21 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:29:24.201019 | orchestrator | 2026-01-06 02:29:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:24.203760 | orchestrator | 2026-01-06 02:29:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:24.203888 | orchestrator | 2026-01-06 02:29:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:27.250568 | orchestrator | 2026-01-06 02:29:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:27.252018 | orchestrator | 2026-01-06 02:29:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:27.252070 | orchestrator | 2026-01-06 02:29:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:30.303767 | orchestrator | 2026-01-06 02:29:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:30.306291 | orchestrator | 2026-01-06 02:29:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:30.306347 | orchestrator | 2026-01-06 02:29:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:33.346631 | orchestrator | 2026-01-06 02:29:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:33.348373 | orchestrator | 2026-01-06 02:29:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:33.348534 | orchestrator | 2026-01-06 02:29:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:36.396334 | orchestrator | 2026-01-06 02:29:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:36.398806 | orchestrator | 2026-01-06 02:29:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:36.398878 | orchestrator | 2026-01-06 02:29:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:39.455609 | orchestrator | 2026-01-06 
02:29:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:39.457225 | orchestrator | 2026-01-06 02:29:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:39.457311 | orchestrator | 2026-01-06 02:29:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:42.501219 | orchestrator | 2026-01-06 02:29:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:42.504059 | orchestrator | 2026-01-06 02:29:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:42.504181 | orchestrator | 2026-01-06 02:29:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:45.557117 | orchestrator | 2026-01-06 02:29:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:45.559646 | orchestrator | 2026-01-06 02:29:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:45.559729 | orchestrator | 2026-01-06 02:29:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:48.606813 | orchestrator | 2026-01-06 02:29:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:48.610371 | orchestrator | 2026-01-06 02:29:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:48.610439 | orchestrator | 2026-01-06 02:29:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:51.660269 | orchestrator | 2026-01-06 02:29:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:29:51.663219 | orchestrator | 2026-01-06 02:29:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:29:51.663274 | orchestrator | 2026-01-06 02:29:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:29:54.709057 | orchestrator | 2026-01-06 02:29:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED
2026-01-06 02:29:54.710994 | orchestrator | 2026-01-06 02:29:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 02:29:54.711231 | orchestrator | 2026-01-06 02:29:54 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated every ~3 seconds from 02:29:57 to 02:35:08: tasks de94e18a-e5f1-4cb3-816d-70e021913dd3 and 7b21748d-c57a-4b51-995c-2c6f5f088b85 remained in state STARTED, followed each time by "Wait 1 second(s) until the next check" ...]
2026-01-06 02:35:11.828833 | orchestrator | 2026-01-06 02:35:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state
STARTED 2026-01-06 02:35:11.830420 | orchestrator | 2026-01-06 02:35:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:11.830500 | orchestrator | 2026-01-06 02:35:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:14.878823 | orchestrator | 2026-01-06 02:35:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:14.881215 | orchestrator | 2026-01-06 02:35:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:14.881420 | orchestrator | 2026-01-06 02:35:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:17.932000 | orchestrator | 2026-01-06 02:35:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:17.933737 | orchestrator | 2026-01-06 02:35:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:17.933784 | orchestrator | 2026-01-06 02:35:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:20.976816 | orchestrator | 2026-01-06 02:35:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:20.980040 | orchestrator | 2026-01-06 02:35:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:20.981036 | orchestrator | 2026-01-06 02:35:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:24.031580 | orchestrator | 2026-01-06 02:35:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:24.032211 | orchestrator | 2026-01-06 02:35:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:24.032265 | orchestrator | 2026-01-06 02:35:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:27.080563 | orchestrator | 2026-01-06 02:35:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:27.082263 | orchestrator | 2026-01-06 02:35:27 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:27.082500 | orchestrator | 2026-01-06 02:35:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:30.125326 | orchestrator | 2026-01-06 02:35:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:30.126764 | orchestrator | 2026-01-06 02:35:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:30.126796 | orchestrator | 2026-01-06 02:35:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:33.178143 | orchestrator | 2026-01-06 02:35:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:33.179810 | orchestrator | 2026-01-06 02:35:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:33.179878 | orchestrator | 2026-01-06 02:35:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:36.222601 | orchestrator | 2026-01-06 02:35:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:36.223832 | orchestrator | 2026-01-06 02:35:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:36.224027 | orchestrator | 2026-01-06 02:35:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:39.270872 | orchestrator | 2026-01-06 02:35:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:39.272331 | orchestrator | 2026-01-06 02:35:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:39.272441 | orchestrator | 2026-01-06 02:35:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:42.320045 | orchestrator | 2026-01-06 02:35:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:42.322160 | orchestrator | 2026-01-06 02:35:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:35:42.322241 | orchestrator | 2026-01-06 02:35:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:45.364398 | orchestrator | 2026-01-06 02:35:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:45.365970 | orchestrator | 2026-01-06 02:35:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:45.366012 | orchestrator | 2026-01-06 02:35:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:48.411493 | orchestrator | 2026-01-06 02:35:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:48.414308 | orchestrator | 2026-01-06 02:35:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:48.414402 | orchestrator | 2026-01-06 02:35:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:51.461038 | orchestrator | 2026-01-06 02:35:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:51.463014 | orchestrator | 2026-01-06 02:35:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:51.463215 | orchestrator | 2026-01-06 02:35:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:54.515023 | orchestrator | 2026-01-06 02:35:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:54.517009 | orchestrator | 2026-01-06 02:35:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:54.517094 | orchestrator | 2026-01-06 02:35:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:35:57.567401 | orchestrator | 2026-01-06 02:35:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:35:57.570210 | orchestrator | 2026-01-06 02:35:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:35:57.570317 | orchestrator | 2026-01-06 02:35:57 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:36:00.616181 | orchestrator | 2026-01-06 02:36:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:00.617242 | orchestrator | 2026-01-06 02:36:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:00.617725 | orchestrator | 2026-01-06 02:36:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:03.659194 | orchestrator | 2026-01-06 02:36:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:03.659989 | orchestrator | 2026-01-06 02:36:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:03.660185 | orchestrator | 2026-01-06 02:36:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:06.705470 | orchestrator | 2026-01-06 02:36:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:06.707124 | orchestrator | 2026-01-06 02:36:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:06.707173 | orchestrator | 2026-01-06 02:36:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:09.755268 | orchestrator | 2026-01-06 02:36:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:09.756777 | orchestrator | 2026-01-06 02:36:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:09.756832 | orchestrator | 2026-01-06 02:36:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:12.803032 | orchestrator | 2026-01-06 02:36:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:12.804777 | orchestrator | 2026-01-06 02:36:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:12.804819 | orchestrator | 2026-01-06 02:36:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:15.851162 | orchestrator | 2026-01-06 
02:36:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:15.853186 | orchestrator | 2026-01-06 02:36:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:15.853322 | orchestrator | 2026-01-06 02:36:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:18.903805 | orchestrator | 2026-01-06 02:36:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:18.905460 | orchestrator | 2026-01-06 02:36:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:18.905546 | orchestrator | 2026-01-06 02:36:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:21.953354 | orchestrator | 2026-01-06 02:36:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:21.955048 | orchestrator | 2026-01-06 02:36:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:21.955179 | orchestrator | 2026-01-06 02:36:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:25.002689 | orchestrator | 2026-01-06 02:36:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:25.004108 | orchestrator | 2026-01-06 02:36:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:25.004190 | orchestrator | 2026-01-06 02:36:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:28.047713 | orchestrator | 2026-01-06 02:36:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:28.049308 | orchestrator | 2026-01-06 02:36:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:28.049388 | orchestrator | 2026-01-06 02:36:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:31.093683 | orchestrator | 2026-01-06 02:36:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:36:31.095473 | orchestrator | 2026-01-06 02:36:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:31.095520 | orchestrator | 2026-01-06 02:36:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:34.141077 | orchestrator | 2026-01-06 02:36:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:34.142093 | orchestrator | 2026-01-06 02:36:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:34.142197 | orchestrator | 2026-01-06 02:36:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:37.190145 | orchestrator | 2026-01-06 02:36:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:37.191666 | orchestrator | 2026-01-06 02:36:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:37.191833 | orchestrator | 2026-01-06 02:36:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:40.238391 | orchestrator | 2026-01-06 02:36:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:40.241461 | orchestrator | 2026-01-06 02:36:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:40.241607 | orchestrator | 2026-01-06 02:36:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:43.289444 | orchestrator | 2026-01-06 02:36:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:43.290982 | orchestrator | 2026-01-06 02:36:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:43.291017 | orchestrator | 2026-01-06 02:36:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:46.338896 | orchestrator | 2026-01-06 02:36:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:46.340370 | orchestrator | 2026-01-06 02:36:46 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:46.340404 | orchestrator | 2026-01-06 02:36:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:49.385837 | orchestrator | 2026-01-06 02:36:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:49.387748 | orchestrator | 2026-01-06 02:36:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:49.388044 | orchestrator | 2026-01-06 02:36:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:52.433054 | orchestrator | 2026-01-06 02:36:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:52.434535 | orchestrator | 2026-01-06 02:36:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:52.434592 | orchestrator | 2026-01-06 02:36:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:55.481613 | orchestrator | 2026-01-06 02:36:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:55.484088 | orchestrator | 2026-01-06 02:36:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:55.484142 | orchestrator | 2026-01-06 02:36:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:36:58.533792 | orchestrator | 2026-01-06 02:36:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:36:58.536857 | orchestrator | 2026-01-06 02:36:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:36:58.536921 | orchestrator | 2026-01-06 02:36:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:01.580978 | orchestrator | 2026-01-06 02:37:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:01.583361 | orchestrator | 2026-01-06 02:37:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:37:01.583512 | orchestrator | 2026-01-06 02:37:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:04.631318 | orchestrator | 2026-01-06 02:37:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:04.633380 | orchestrator | 2026-01-06 02:37:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:04.633416 | orchestrator | 2026-01-06 02:37:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:07.679270 | orchestrator | 2026-01-06 02:37:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:07.680624 | orchestrator | 2026-01-06 02:37:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:07.680668 | orchestrator | 2026-01-06 02:37:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:10.737823 | orchestrator | 2026-01-06 02:37:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:10.740770 | orchestrator | 2026-01-06 02:37:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:10.740854 | orchestrator | 2026-01-06 02:37:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:13.786479 | orchestrator | 2026-01-06 02:37:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:13.788364 | orchestrator | 2026-01-06 02:37:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:13.788442 | orchestrator | 2026-01-06 02:37:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:16.835899 | orchestrator | 2026-01-06 02:37:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:16.836749 | orchestrator | 2026-01-06 02:37:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:16.836786 | orchestrator | 2026-01-06 02:37:16 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:37:19.887004 | orchestrator | 2026-01-06 02:37:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:19.888531 | orchestrator | 2026-01-06 02:37:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:19.888990 | orchestrator | 2026-01-06 02:37:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:22.936934 | orchestrator | 2026-01-06 02:37:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:22.937281 | orchestrator | 2026-01-06 02:37:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:22.937326 | orchestrator | 2026-01-06 02:37:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:25.989087 | orchestrator | 2026-01-06 02:37:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:25.991004 | orchestrator | 2026-01-06 02:37:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:25.991081 | orchestrator | 2026-01-06 02:37:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:29.038792 | orchestrator | 2026-01-06 02:37:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:29.040156 | orchestrator | 2026-01-06 02:37:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:29.040214 | orchestrator | 2026-01-06 02:37:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:32.087008 | orchestrator | 2026-01-06 02:37:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:32.087233 | orchestrator | 2026-01-06 02:37:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:32.087250 | orchestrator | 2026-01-06 02:37:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:35.130866 | orchestrator | 2026-01-06 
02:37:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:35.130986 | orchestrator | 2026-01-06 02:37:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:35.130995 | orchestrator | 2026-01-06 02:37:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:38.175950 | orchestrator | 2026-01-06 02:37:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:38.177001 | orchestrator | 2026-01-06 02:37:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:38.177291 | orchestrator | 2026-01-06 02:37:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:41.222535 | orchestrator | 2026-01-06 02:37:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:41.223583 | orchestrator | 2026-01-06 02:37:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:41.223645 | orchestrator | 2026-01-06 02:37:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:44.273913 | orchestrator | 2026-01-06 02:37:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:44.275693 | orchestrator | 2026-01-06 02:37:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:44.275766 | orchestrator | 2026-01-06 02:37:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:47.327358 | orchestrator | 2026-01-06 02:37:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:47.329295 | orchestrator | 2026-01-06 02:37:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:47.329327 | orchestrator | 2026-01-06 02:37:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:50.376533 | orchestrator | 2026-01-06 02:37:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:37:50.378909 | orchestrator | 2026-01-06 02:37:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:50.379324 | orchestrator | 2026-01-06 02:37:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:53.431933 | orchestrator | 2026-01-06 02:37:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:53.434565 | orchestrator | 2026-01-06 02:37:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:53.435235 | orchestrator | 2026-01-06 02:37:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:56.482560 | orchestrator | 2026-01-06 02:37:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:56.484301 | orchestrator | 2026-01-06 02:37:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:56.484328 | orchestrator | 2026-01-06 02:37:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:37:59.534930 | orchestrator | 2026-01-06 02:37:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:37:59.536286 | orchestrator | 2026-01-06 02:37:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:37:59.536350 | orchestrator | 2026-01-06 02:37:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:38:02.586203 | orchestrator | 2026-01-06 02:38:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:38:02.588546 | orchestrator | 2026-01-06 02:38:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:38:02.588656 | orchestrator | 2026-01-06 02:38:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:38:05.640347 | orchestrator | 2026-01-06 02:38:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:38:05.640724 | orchestrator | 2026-01-06 02:38:05 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:38:05.640809 | orchestrator | 2026-01-06 02:38:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:38:08.686444 | orchestrator | 2026-01-06 02:38:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:38:08.687605 | orchestrator | 2026-01-06 02:38:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:38:08.687758 | orchestrator | 2026-01-06 02:38:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:38:11.726686 | orchestrator | 2026-01-06 02:38:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:38:11.728196 | orchestrator | 2026-01-06 02:38:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:38:11.728272 | orchestrator | 2026-01-06 02:38:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:38:14.773746 | orchestrator | 2026-01-06 02:38:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:38:14.775646 | orchestrator | 2026-01-06 02:38:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:38:14.775704 | orchestrator | 2026-01-06 02:38:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:38:17.815831 | orchestrator | 2026-01-06 02:38:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:38:17.817337 | orchestrator | 2026-01-06 02:38:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:38:17.817573 | orchestrator | 2026-01-06 02:38:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:38:20.859836 | orchestrator | 2026-01-06 02:38:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:38:20.862130 | orchestrator | 2026-01-06 02:38:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:38:20.862246 | orchestrator | 2026-01-06 02:38:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:38:23.913349 | orchestrator | 2026-01-06 02:38:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:38:23.915012 | orchestrator | 2026-01-06 02:38:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:38:23.915222 | orchestrator | 2026-01-06 02:38:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:38:26.961712 | orchestrator | 2026-01-06 02:38:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:38:26.963922 | orchestrator | 2026-01-06 02:38:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:38:26.963969 | orchestrator | 2026-01-06 02:38:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:38:30.017453 | orchestrator | 2026-01-06 02:38:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:38:30.019617 | orchestrator | 2026-01-06 02:38:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:38:30.019686 | orchestrator | 2026-01-06 02:38:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:38:33.064792 | orchestrator | 2026-01-06 02:38:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:38:33.066591 | orchestrator | 2026-01-06 02:38:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:38:33.066682 | orchestrator | 2026-01-06 02:38:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:38:36.105313 | orchestrator | 2026-01-06 02:38:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:38:36.106713 | orchestrator | 2026-01-06 02:38:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:38:36.106775 | orchestrator | 2026-01-06 02:38:36 | INFO  | Wait 1 second(s) 
until the next check
2026-01-06 02:38:39.152109 | orchestrator | 2026-01-06 02:38:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 02:38:39.153352 | orchestrator | 2026-01-06 02:38:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 02:38:39.153424 | orchestrator | 2026-01-06 02:38:39 | INFO  | Wait 1 second(s) until the next check
[... identical status checks for both tasks repeated every ~3 seconds from 02:38:42 until 02:43:50 ...]
2026-01-06 02:43:53.328189 | orchestrator | 2026-01-06 02:43:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 02:43:53.329785 | orchestrator | 2026-01-06 02:43:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 02:43:53.329837 | orchestrator | 2026-01-06 02:43:53 | INFO  | Wait 1 second(s)
until the next check 2026-01-06 02:43:56.370262 | orchestrator | 2026-01-06 02:43:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:43:56.370444 | orchestrator | 2026-01-06 02:43:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:43:56.370503 | orchestrator | 2026-01-06 02:43:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:43:59.411529 | orchestrator | 2026-01-06 02:43:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:43:59.412956 | orchestrator | 2026-01-06 02:43:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:43:59.413010 | orchestrator | 2026-01-06 02:43:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:02.455051 | orchestrator | 2026-01-06 02:44:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:02.456385 | orchestrator | 2026-01-06 02:44:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:02.456657 | orchestrator | 2026-01-06 02:44:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:05.502290 | orchestrator | 2026-01-06 02:44:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:05.504802 | orchestrator | 2026-01-06 02:44:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:05.504921 | orchestrator | 2026-01-06 02:44:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:08.551828 | orchestrator | 2026-01-06 02:44:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:08.553166 | orchestrator | 2026-01-06 02:44:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:08.553188 | orchestrator | 2026-01-06 02:44:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:11.599425 | orchestrator | 2026-01-06 
02:44:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:11.600317 | orchestrator | 2026-01-06 02:44:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:11.600356 | orchestrator | 2026-01-06 02:44:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:14.645756 | orchestrator | 2026-01-06 02:44:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:14.648529 | orchestrator | 2026-01-06 02:44:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:14.648592 | orchestrator | 2026-01-06 02:44:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:17.692332 | orchestrator | 2026-01-06 02:44:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:17.695088 | orchestrator | 2026-01-06 02:44:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:17.695147 | orchestrator | 2026-01-06 02:44:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:20.746237 | orchestrator | 2026-01-06 02:44:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:20.747309 | orchestrator | 2026-01-06 02:44:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:20.747363 | orchestrator | 2026-01-06 02:44:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:23.801621 | orchestrator | 2026-01-06 02:44:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:23.803507 | orchestrator | 2026-01-06 02:44:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:23.803569 | orchestrator | 2026-01-06 02:44:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:26.853780 | orchestrator | 2026-01-06 02:44:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:44:26.855072 | orchestrator | 2026-01-06 02:44:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:26.855120 | orchestrator | 2026-01-06 02:44:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:29.904539 | orchestrator | 2026-01-06 02:44:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:29.906578 | orchestrator | 2026-01-06 02:44:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:29.906680 | orchestrator | 2026-01-06 02:44:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:32.959292 | orchestrator | 2026-01-06 02:44:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:32.961252 | orchestrator | 2026-01-06 02:44:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:32.961344 | orchestrator | 2026-01-06 02:44:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:36.005867 | orchestrator | 2026-01-06 02:44:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:36.006924 | orchestrator | 2026-01-06 02:44:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:36.006966 | orchestrator | 2026-01-06 02:44:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:39.056931 | orchestrator | 2026-01-06 02:44:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:39.058274 | orchestrator | 2026-01-06 02:44:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:39.058337 | orchestrator | 2026-01-06 02:44:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:42.106319 | orchestrator | 2026-01-06 02:44:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:42.108785 | orchestrator | 2026-01-06 02:44:42 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:42.108834 | orchestrator | 2026-01-06 02:44:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:45.152286 | orchestrator | 2026-01-06 02:44:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:45.153810 | orchestrator | 2026-01-06 02:44:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:45.153863 | orchestrator | 2026-01-06 02:44:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:48.201276 | orchestrator | 2026-01-06 02:44:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:48.204047 | orchestrator | 2026-01-06 02:44:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:48.204160 | orchestrator | 2026-01-06 02:44:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:51.250351 | orchestrator | 2026-01-06 02:44:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:51.252109 | orchestrator | 2026-01-06 02:44:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:51.252156 | orchestrator | 2026-01-06 02:44:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:54.300362 | orchestrator | 2026-01-06 02:44:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:54.302764 | orchestrator | 2026-01-06 02:44:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:44:54.302818 | orchestrator | 2026-01-06 02:44:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:44:57.351599 | orchestrator | 2026-01-06 02:44:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:44:57.353215 | orchestrator | 2026-01-06 02:44:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:44:57.353305 | orchestrator | 2026-01-06 02:44:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:00.399425 | orchestrator | 2026-01-06 02:45:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:00.401742 | orchestrator | 2026-01-06 02:45:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:00.402167 | orchestrator | 2026-01-06 02:45:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:03.452908 | orchestrator | 2026-01-06 02:45:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:03.454211 | orchestrator | 2026-01-06 02:45:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:03.454275 | orchestrator | 2026-01-06 02:45:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:06.504868 | orchestrator | 2026-01-06 02:45:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:06.507268 | orchestrator | 2026-01-06 02:45:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:06.507567 | orchestrator | 2026-01-06 02:45:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:09.555704 | orchestrator | 2026-01-06 02:45:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:09.557656 | orchestrator | 2026-01-06 02:45:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:09.557759 | orchestrator | 2026-01-06 02:45:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:12.601100 | orchestrator | 2026-01-06 02:45:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:12.602357 | orchestrator | 2026-01-06 02:45:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:12.602488 | orchestrator | 2026-01-06 02:45:12 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:45:15.641981 | orchestrator | 2026-01-06 02:45:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:15.643879 | orchestrator | 2026-01-06 02:45:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:15.643917 | orchestrator | 2026-01-06 02:45:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:18.686743 | orchestrator | 2026-01-06 02:45:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:18.688121 | orchestrator | 2026-01-06 02:45:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:18.688296 | orchestrator | 2026-01-06 02:45:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:21.731252 | orchestrator | 2026-01-06 02:45:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:21.735393 | orchestrator | 2026-01-06 02:45:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:21.735489 | orchestrator | 2026-01-06 02:45:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:24.784513 | orchestrator | 2026-01-06 02:45:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:24.786280 | orchestrator | 2026-01-06 02:45:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:24.786340 | orchestrator | 2026-01-06 02:45:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:27.832931 | orchestrator | 2026-01-06 02:45:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:27.834484 | orchestrator | 2026-01-06 02:45:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:27.834531 | orchestrator | 2026-01-06 02:45:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:30.886244 | orchestrator | 2026-01-06 
02:45:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:30.887182 | orchestrator | 2026-01-06 02:45:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:30.887202 | orchestrator | 2026-01-06 02:45:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:33.935451 | orchestrator | 2026-01-06 02:45:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:33.936820 | orchestrator | 2026-01-06 02:45:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:33.936960 | orchestrator | 2026-01-06 02:45:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:36.990718 | orchestrator | 2026-01-06 02:45:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:36.991989 | orchestrator | 2026-01-06 02:45:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:36.992079 | orchestrator | 2026-01-06 02:45:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:40.045087 | orchestrator | 2026-01-06 02:45:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:40.046597 | orchestrator | 2026-01-06 02:45:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:40.046666 | orchestrator | 2026-01-06 02:45:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:43.096022 | orchestrator | 2026-01-06 02:45:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:43.098185 | orchestrator | 2026-01-06 02:45:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:43.098549 | orchestrator | 2026-01-06 02:45:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:46.148372 | orchestrator | 2026-01-06 02:45:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:45:46.150835 | orchestrator | 2026-01-06 02:45:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:46.150909 | orchestrator | 2026-01-06 02:45:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:49.195584 | orchestrator | 2026-01-06 02:45:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:49.196973 | orchestrator | 2026-01-06 02:45:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:49.197024 | orchestrator | 2026-01-06 02:45:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:52.247889 | orchestrator | 2026-01-06 02:45:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:52.249731 | orchestrator | 2026-01-06 02:45:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:52.249798 | orchestrator | 2026-01-06 02:45:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:55.303187 | orchestrator | 2026-01-06 02:45:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:55.305087 | orchestrator | 2026-01-06 02:45:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:55.305165 | orchestrator | 2026-01-06 02:45:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:45:58.354910 | orchestrator | 2026-01-06 02:45:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:45:58.356702 | orchestrator | 2026-01-06 02:45:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:45:58.356756 | orchestrator | 2026-01-06 02:45:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:01.408652 | orchestrator | 2026-01-06 02:46:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:01.409844 | orchestrator | 2026-01-06 02:46:01 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:01.409948 | orchestrator | 2026-01-06 02:46:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:04.458604 | orchestrator | 2026-01-06 02:46:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:04.460399 | orchestrator | 2026-01-06 02:46:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:04.460452 | orchestrator | 2026-01-06 02:46:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:07.517090 | orchestrator | 2026-01-06 02:46:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:07.518282 | orchestrator | 2026-01-06 02:46:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:07.518374 | orchestrator | 2026-01-06 02:46:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:10.563274 | orchestrator | 2026-01-06 02:46:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:10.565369 | orchestrator | 2026-01-06 02:46:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:10.565445 | orchestrator | 2026-01-06 02:46:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:13.611816 | orchestrator | 2026-01-06 02:46:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:13.613493 | orchestrator | 2026-01-06 02:46:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:13.613522 | orchestrator | 2026-01-06 02:46:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:16.669859 | orchestrator | 2026-01-06 02:46:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:16.671529 | orchestrator | 2026-01-06 02:46:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:46:16.671563 | orchestrator | 2026-01-06 02:46:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:19.721400 | orchestrator | 2026-01-06 02:46:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:19.722708 | orchestrator | 2026-01-06 02:46:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:19.722826 | orchestrator | 2026-01-06 02:46:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:22.770699 | orchestrator | 2026-01-06 02:46:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:22.772349 | orchestrator | 2026-01-06 02:46:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:22.772391 | orchestrator | 2026-01-06 02:46:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:25.822835 | orchestrator | 2026-01-06 02:46:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:25.823928 | orchestrator | 2026-01-06 02:46:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:25.823970 | orchestrator | 2026-01-06 02:46:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:28.868409 | orchestrator | 2026-01-06 02:46:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:28.869074 | orchestrator | 2026-01-06 02:46:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:28.869387 | orchestrator | 2026-01-06 02:46:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:31.908777 | orchestrator | 2026-01-06 02:46:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:31.911045 | orchestrator | 2026-01-06 02:46:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:31.911102 | orchestrator | 2026-01-06 02:46:31 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:46:34.950221 | orchestrator | 2026-01-06 02:46:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:34.951595 | orchestrator | 2026-01-06 02:46:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:34.951657 | orchestrator | 2026-01-06 02:46:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:37.991965 | orchestrator | 2026-01-06 02:46:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:37.995452 | orchestrator | 2026-01-06 02:46:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:37.995519 | orchestrator | 2026-01-06 02:46:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:41.043819 | orchestrator | 2026-01-06 02:46:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:41.044674 | orchestrator | 2026-01-06 02:46:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:41.044762 | orchestrator | 2026-01-06 02:46:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:44.088100 | orchestrator | 2026-01-06 02:46:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:44.088315 | orchestrator | 2026-01-06 02:46:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:44.088337 | orchestrator | 2026-01-06 02:46:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:47.131622 | orchestrator | 2026-01-06 02:46:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:47.133875 | orchestrator | 2026-01-06 02:46:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:47.134004 | orchestrator | 2026-01-06 02:46:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:50.179377 | orchestrator | 2026-01-06 
02:46:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:50.180864 | orchestrator | 2026-01-06 02:46:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:50.181115 | orchestrator | 2026-01-06 02:46:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:53.227359 | orchestrator | 2026-01-06 02:46:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:53.227439 | orchestrator | 2026-01-06 02:46:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:53.227451 | orchestrator | 2026-01-06 02:46:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:56.274435 | orchestrator | 2026-01-06 02:46:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:56.276229 | orchestrator | 2026-01-06 02:46:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:56.276416 | orchestrator | 2026-01-06 02:46:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:46:59.326921 | orchestrator | 2026-01-06 02:46:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:46:59.328329 | orchestrator | 2026-01-06 02:46:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:46:59.328721 | orchestrator | 2026-01-06 02:46:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:47:02.376014 | orchestrator | 2026-01-06 02:47:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:47:02.379124 | orchestrator | 2026-01-06 02:47:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:47:02.379205 | orchestrator | 2026-01-06 02:47:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:47:05.420171 | orchestrator | 2026-01-06 02:47:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:47:05.420680 | orchestrator | 2026-01-06 02:47:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:47:05.421185 | orchestrator | 2026-01-06 02:47:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:47:08.463965 | orchestrator | 2026-01-06 02:47:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:47:08.465680 | orchestrator | 2026-01-06 02:47:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:47:08.465737 | orchestrator | 2026-01-06 02:47:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:47:11.512859 | orchestrator | 2026-01-06 02:47:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:47:11.514995 | orchestrator | 2026-01-06 02:47:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:47:11.515039 | orchestrator | 2026-01-06 02:47:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:47:14.554127 | orchestrator | 2026-01-06 02:47:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:47:14.556367 | orchestrator | 2026-01-06 02:47:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:47:14.556487 | orchestrator | 2026-01-06 02:47:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:47:17.595204 | orchestrator | 2026-01-06 02:47:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:47:17.597843 | orchestrator | 2026-01-06 02:47:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:47:17.597952 | orchestrator | 2026-01-06 02:47:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:47:20.641744 | orchestrator | 2026-01-06 02:47:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:47:20.644093 | orchestrator | 2026-01-06 02:47:20 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:47:20.644176 | orchestrator | 2026-01-06 02:47:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:47:23.689177 | orchestrator | 2026-01-06 02:47:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:47:23.691051 | orchestrator | 2026-01-06 02:47:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:47:23.691141 | orchestrator | 2026-01-06 02:47:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:47:26.746352 | orchestrator | 2026-01-06 02:47:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:47:26.747449 | orchestrator | 2026-01-06 02:47:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:47:26.747482 | orchestrator | 2026-01-06 02:47:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:47:29.794658 | orchestrator | 2026-01-06 02:47:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:47:29.796356 | orchestrator | 2026-01-06 02:47:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:47:29.796401 | orchestrator | 2026-01-06 02:47:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:47:32.838328 | orchestrator | 2026-01-06 02:47:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:47:32.839874 | orchestrator | 2026-01-06 02:47:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:47:32.839937 | orchestrator | 2026-01-06 02:47:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:47:35.888528 | orchestrator | 2026-01-06 02:47:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:47:35.890718 | orchestrator | 2026-01-06 02:47:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:47:35.890836 | orchestrator | 2026-01-06 02:47:35 | INFO  | Wait 1 second(s) until the next check
2026-01-06 02:47:38.941312 | orchestrator | 2026-01-06 02:47:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 02:47:38.943645 | orchestrator | 2026-01-06 02:47:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 02:47:38.943715 | orchestrator | 2026-01-06 02:47:38 | INFO  | Wait 1 second(s) until the next check
[... identical poll cycle repeated roughly every 3 seconds from 02:47:41 through 02:53:05; both tasks remain in state STARTED throughout ...]
2026-01-06 02:53:08.217717 | orchestrator | 2026-01-06 02:53:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 02:53:08.219392 | orchestrator | 2026-01-06 02:53:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 02:53:08.219520 | orchestrator | 2026-01-06 02:53:08 | INFO  | Wait 1 second(s)
until the next check 2026-01-06 02:53:11.269249 | orchestrator | 2026-01-06 02:53:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:11.270767 | orchestrator | 2026-01-06 02:53:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:11.270994 | orchestrator | 2026-01-06 02:53:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:14.316576 | orchestrator | 2026-01-06 02:53:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:14.318447 | orchestrator | 2026-01-06 02:53:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:14.318515 | orchestrator | 2026-01-06 02:53:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:17.366414 | orchestrator | 2026-01-06 02:53:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:17.368524 | orchestrator | 2026-01-06 02:53:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:17.368587 | orchestrator | 2026-01-06 02:53:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:20.421347 | orchestrator | 2026-01-06 02:53:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:20.422584 | orchestrator | 2026-01-06 02:53:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:20.423055 | orchestrator | 2026-01-06 02:53:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:23.470709 | orchestrator | 2026-01-06 02:53:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:23.472401 | orchestrator | 2026-01-06 02:53:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:23.472571 | orchestrator | 2026-01-06 02:53:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:26.525191 | orchestrator | 2026-01-06 
02:53:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:26.526253 | orchestrator | 2026-01-06 02:53:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:26.526390 | orchestrator | 2026-01-06 02:53:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:29.571620 | orchestrator | 2026-01-06 02:53:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:29.573384 | orchestrator | 2026-01-06 02:53:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:29.573557 | orchestrator | 2026-01-06 02:53:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:32.618390 | orchestrator | 2026-01-06 02:53:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:32.620241 | orchestrator | 2026-01-06 02:53:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:32.620337 | orchestrator | 2026-01-06 02:53:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:35.667367 | orchestrator | 2026-01-06 02:53:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:35.669278 | orchestrator | 2026-01-06 02:53:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:35.669733 | orchestrator | 2026-01-06 02:53:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:38.717858 | orchestrator | 2026-01-06 02:53:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:38.719017 | orchestrator | 2026-01-06 02:53:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:38.719040 | orchestrator | 2026-01-06 02:53:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:41.765840 | orchestrator | 2026-01-06 02:53:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:53:41.767450 | orchestrator | 2026-01-06 02:53:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:41.767683 | orchestrator | 2026-01-06 02:53:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:44.812889 | orchestrator | 2026-01-06 02:53:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:44.813632 | orchestrator | 2026-01-06 02:53:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:44.813881 | orchestrator | 2026-01-06 02:53:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:47.865128 | orchestrator | 2026-01-06 02:53:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:47.866596 | orchestrator | 2026-01-06 02:53:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:47.866648 | orchestrator | 2026-01-06 02:53:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:50.917207 | orchestrator | 2026-01-06 02:53:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:50.920221 | orchestrator | 2026-01-06 02:53:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:50.920304 | orchestrator | 2026-01-06 02:53:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:53.968935 | orchestrator | 2026-01-06 02:53:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:53.969614 | orchestrator | 2026-01-06 02:53:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:53.969687 | orchestrator | 2026-01-06 02:53:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:53:57.016395 | orchestrator | 2026-01-06 02:53:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:53:57.019180 | orchestrator | 2026-01-06 02:53:57 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:53:57.019394 | orchestrator | 2026-01-06 02:53:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:00.064049 | orchestrator | 2026-01-06 02:54:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:00.065547 | orchestrator | 2026-01-06 02:54:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:00.065601 | orchestrator | 2026-01-06 02:54:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:03.109273 | orchestrator | 2026-01-06 02:54:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:03.110302 | orchestrator | 2026-01-06 02:54:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:03.110349 | orchestrator | 2026-01-06 02:54:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:06.150908 | orchestrator | 2026-01-06 02:54:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:06.152026 | orchestrator | 2026-01-06 02:54:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:06.152095 | orchestrator | 2026-01-06 02:54:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:09.205473 | orchestrator | 2026-01-06 02:54:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:09.207747 | orchestrator | 2026-01-06 02:54:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:09.207799 | orchestrator | 2026-01-06 02:54:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:12.259804 | orchestrator | 2026-01-06 02:54:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:12.262455 | orchestrator | 2026-01-06 02:54:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:54:12.262498 | orchestrator | 2026-01-06 02:54:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:15.308857 | orchestrator | 2026-01-06 02:54:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:15.310546 | orchestrator | 2026-01-06 02:54:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:15.310585 | orchestrator | 2026-01-06 02:54:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:18.359872 | orchestrator | 2026-01-06 02:54:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:18.361680 | orchestrator | 2026-01-06 02:54:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:18.361719 | orchestrator | 2026-01-06 02:54:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:21.415446 | orchestrator | 2026-01-06 02:54:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:21.417769 | orchestrator | 2026-01-06 02:54:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:21.417828 | orchestrator | 2026-01-06 02:54:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:24.458843 | orchestrator | 2026-01-06 02:54:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:24.460879 | orchestrator | 2026-01-06 02:54:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:24.461162 | orchestrator | 2026-01-06 02:54:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:27.510266 | orchestrator | 2026-01-06 02:54:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:27.511763 | orchestrator | 2026-01-06 02:54:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:27.511822 | orchestrator | 2026-01-06 02:54:27 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:54:30.563069 | orchestrator | 2026-01-06 02:54:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:30.565082 | orchestrator | 2026-01-06 02:54:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:30.565129 | orchestrator | 2026-01-06 02:54:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:33.615534 | orchestrator | 2026-01-06 02:54:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:33.618210 | orchestrator | 2026-01-06 02:54:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:33.618257 | orchestrator | 2026-01-06 02:54:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:36.662636 | orchestrator | 2026-01-06 02:54:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:36.666260 | orchestrator | 2026-01-06 02:54:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:36.666383 | orchestrator | 2026-01-06 02:54:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:39.711393 | orchestrator | 2026-01-06 02:54:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:39.713278 | orchestrator | 2026-01-06 02:54:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:39.713364 | orchestrator | 2026-01-06 02:54:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:42.765851 | orchestrator | 2026-01-06 02:54:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:42.767430 | orchestrator | 2026-01-06 02:54:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:42.767485 | orchestrator | 2026-01-06 02:54:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:45.816269 | orchestrator | 2026-01-06 
02:54:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:45.817948 | orchestrator | 2026-01-06 02:54:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:45.818102 | orchestrator | 2026-01-06 02:54:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:48.865878 | orchestrator | 2026-01-06 02:54:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:48.868802 | orchestrator | 2026-01-06 02:54:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:48.868912 | orchestrator | 2026-01-06 02:54:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:51.909329 | orchestrator | 2026-01-06 02:54:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:51.910665 | orchestrator | 2026-01-06 02:54:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:51.910756 | orchestrator | 2026-01-06 02:54:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:54.960246 | orchestrator | 2026-01-06 02:54:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:54.961150 | orchestrator | 2026-01-06 02:54:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:54.961192 | orchestrator | 2026-01-06 02:54:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:54:58.011628 | orchestrator | 2026-01-06 02:54:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:54:58.013829 | orchestrator | 2026-01-06 02:54:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:54:58.013965 | orchestrator | 2026-01-06 02:54:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:01.059808 | orchestrator | 2026-01-06 02:55:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:55:01.061650 | orchestrator | 2026-01-06 02:55:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:01.061693 | orchestrator | 2026-01-06 02:55:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:04.109836 | orchestrator | 2026-01-06 02:55:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:04.110538 | orchestrator | 2026-01-06 02:55:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:04.110681 | orchestrator | 2026-01-06 02:55:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:07.155529 | orchestrator | 2026-01-06 02:55:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:07.157788 | orchestrator | 2026-01-06 02:55:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:07.157885 | orchestrator | 2026-01-06 02:55:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:10.202242 | orchestrator | 2026-01-06 02:55:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:10.204287 | orchestrator | 2026-01-06 02:55:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:10.204364 | orchestrator | 2026-01-06 02:55:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:13.254845 | orchestrator | 2026-01-06 02:55:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:13.256055 | orchestrator | 2026-01-06 02:55:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:13.256631 | orchestrator | 2026-01-06 02:55:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:16.305786 | orchestrator | 2026-01-06 02:55:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:16.306984 | orchestrator | 2026-01-06 02:55:16 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:16.307030 | orchestrator | 2026-01-06 02:55:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:19.345147 | orchestrator | 2026-01-06 02:55:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:19.347151 | orchestrator | 2026-01-06 02:55:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:19.347234 | orchestrator | 2026-01-06 02:55:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:22.390881 | orchestrator | 2026-01-06 02:55:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:22.394309 | orchestrator | 2026-01-06 02:55:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:22.394389 | orchestrator | 2026-01-06 02:55:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:25.438467 | orchestrator | 2026-01-06 02:55:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:25.440000 | orchestrator | 2026-01-06 02:55:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:25.440033 | orchestrator | 2026-01-06 02:55:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:28.487588 | orchestrator | 2026-01-06 02:55:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:28.489071 | orchestrator | 2026-01-06 02:55:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:28.489368 | orchestrator | 2026-01-06 02:55:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:31.539687 | orchestrator | 2026-01-06 02:55:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:31.541777 | orchestrator | 2026-01-06 02:55:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:55:31.541846 | orchestrator | 2026-01-06 02:55:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:34.587245 | orchestrator | 2026-01-06 02:55:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:34.589256 | orchestrator | 2026-01-06 02:55:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:34.589301 | orchestrator | 2026-01-06 02:55:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:37.636666 | orchestrator | 2026-01-06 02:55:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:37.640998 | orchestrator | 2026-01-06 02:55:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:37.641041 | orchestrator | 2026-01-06 02:55:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:40.686848 | orchestrator | 2026-01-06 02:55:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:40.689002 | orchestrator | 2026-01-06 02:55:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:40.689044 | orchestrator | 2026-01-06 02:55:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:43.740818 | orchestrator | 2026-01-06 02:55:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:43.743097 | orchestrator | 2026-01-06 02:55:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:43.743157 | orchestrator | 2026-01-06 02:55:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:46.792962 | orchestrator | 2026-01-06 02:55:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:46.793537 | orchestrator | 2026-01-06 02:55:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:46.793739 | orchestrator | 2026-01-06 02:55:46 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 02:55:49.844119 | orchestrator | 2026-01-06 02:55:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:49.846358 | orchestrator | 2026-01-06 02:55:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:49.846411 | orchestrator | 2026-01-06 02:55:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:52.895264 | orchestrator | 2026-01-06 02:55:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:52.896753 | orchestrator | 2026-01-06 02:55:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:52.896839 | orchestrator | 2026-01-06 02:55:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:55.939056 | orchestrator | 2026-01-06 02:55:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:55.940722 | orchestrator | 2026-01-06 02:55:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:55.941186 | orchestrator | 2026-01-06 02:55:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:55:58.990565 | orchestrator | 2026-01-06 02:55:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:55:58.992531 | orchestrator | 2026-01-06 02:55:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:55:58.992610 | orchestrator | 2026-01-06 02:55:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:02.041566 | orchestrator | 2026-01-06 02:56:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:02.043107 | orchestrator | 2026-01-06 02:56:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:02.043145 | orchestrator | 2026-01-06 02:56:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:05.082905 | orchestrator | 2026-01-06 
02:56:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:05.084178 | orchestrator | 2026-01-06 02:56:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:05.084223 | orchestrator | 2026-01-06 02:56:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:08.131995 | orchestrator | 2026-01-06 02:56:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:08.132809 | orchestrator | 2026-01-06 02:56:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:08.132942 | orchestrator | 2026-01-06 02:56:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:11.181800 | orchestrator | 2026-01-06 02:56:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:11.182789 | orchestrator | 2026-01-06 02:56:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:11.182825 | orchestrator | 2026-01-06 02:56:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:14.225546 | orchestrator | 2026-01-06 02:56:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:14.226388 | orchestrator | 2026-01-06 02:56:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:14.226512 | orchestrator | 2026-01-06 02:56:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:17.275212 | orchestrator | 2026-01-06 02:56:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:17.277159 | orchestrator | 2026-01-06 02:56:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:17.277177 | orchestrator | 2026-01-06 02:56:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:20.323804 | orchestrator | 2026-01-06 02:56:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 02:56:20.325562 | orchestrator | 2026-01-06 02:56:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:20.325674 | orchestrator | 2026-01-06 02:56:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:23.373897 | orchestrator | 2026-01-06 02:56:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:23.376624 | orchestrator | 2026-01-06 02:56:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:23.376852 | orchestrator | 2026-01-06 02:56:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:26.425116 | orchestrator | 2026-01-06 02:56:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:26.426468 | orchestrator | 2026-01-06 02:56:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:26.426555 | orchestrator | 2026-01-06 02:56:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:29.478092 | orchestrator | 2026-01-06 02:56:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:29.479598 | orchestrator | 2026-01-06 02:56:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:29.479648 | orchestrator | 2026-01-06 02:56:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:32.528972 | orchestrator | 2026-01-06 02:56:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:32.530943 | orchestrator | 2026-01-06 02:56:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:32.531002 | orchestrator | 2026-01-06 02:56:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:35.579006 | orchestrator | 2026-01-06 02:56:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:35.580733 | orchestrator | 2026-01-06 02:56:35 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:35.580772 | orchestrator | 2026-01-06 02:56:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:38.625054 | orchestrator | 2026-01-06 02:56:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:38.626828 | orchestrator | 2026-01-06 02:56:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:38.626884 | orchestrator | 2026-01-06 02:56:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:41.672100 | orchestrator | 2026-01-06 02:56:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:41.674773 | orchestrator | 2026-01-06 02:56:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:41.674825 | orchestrator | 2026-01-06 02:56:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:44.714120 | orchestrator | 2026-01-06 02:56:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:44.715945 | orchestrator | 2026-01-06 02:56:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:44.716001 | orchestrator | 2026-01-06 02:56:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:47.765614 | orchestrator | 2026-01-06 02:56:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:47.766857 | orchestrator | 2026-01-06 02:56:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 02:56:47.766908 | orchestrator | 2026-01-06 02:56:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 02:56:50.818966 | orchestrator | 2026-01-06 02:56:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 02:56:50.822248 | orchestrator | 2026-01-06 02:56:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
02:56:50.822286 | orchestrator | 2026-01-06 02:56:50 | INFO  | Wait 1 second(s) until the next check
2026-01-06 02:56:53.873924 | orchestrator | 2026-01-06 02:56:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 02:56:53.877266 | orchestrator | 2026-01-06 02:56:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 02:56:53.877329 | orchestrator | 2026-01-06 02:56:53 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated every ~3 seconds from 02:56:56 through 03:01:49; tasks de94e18a-e5f1-4cb3-816d-70e021913dd3 and 7b21748d-c57a-4b51-995c-2c6f5f088b85 remained in state STARTED throughout ...]
2026-01-06 03:01:52.741551 | orchestrator | 2026-01-06 03:01:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 03:01:52.743243 | orchestrator | 2026-01-06 03:01:52 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:01:52.743278 | orchestrator | 2026-01-06 03:01:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:01:55.786216 | orchestrator | 2026-01-06 03:01:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:01:55.787694 | orchestrator | 2026-01-06 03:01:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:01:55.787983 | orchestrator | 2026-01-06 03:01:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:01:58.843237 | orchestrator | 2026-01-06 03:01:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:01:58.844285 | orchestrator | 2026-01-06 03:01:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:01:58.844344 | orchestrator | 2026-01-06 03:01:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:01.894616 | orchestrator | 2026-01-06 03:02:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:01.897242 | orchestrator | 2026-01-06 03:02:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:01.897324 | orchestrator | 2026-01-06 03:02:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:04.949983 | orchestrator | 2026-01-06 03:02:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:04.951285 | orchestrator | 2026-01-06 03:02:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:04.951844 | orchestrator | 2026-01-06 03:02:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:07.999006 | orchestrator | 2026-01-06 03:02:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:08.002296 | orchestrator | 2026-01-06 03:02:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:02:08.002356 | orchestrator | 2026-01-06 03:02:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:11.055623 | orchestrator | 2026-01-06 03:02:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:11.057450 | orchestrator | 2026-01-06 03:02:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:11.057504 | orchestrator | 2026-01-06 03:02:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:14.109793 | orchestrator | 2026-01-06 03:02:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:14.111137 | orchestrator | 2026-01-06 03:02:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:14.111263 | orchestrator | 2026-01-06 03:02:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:17.167347 | orchestrator | 2026-01-06 03:02:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:17.168859 | orchestrator | 2026-01-06 03:02:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:17.168902 | orchestrator | 2026-01-06 03:02:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:20.221395 | orchestrator | 2026-01-06 03:02:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:20.223485 | orchestrator | 2026-01-06 03:02:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:20.223551 | orchestrator | 2026-01-06 03:02:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:23.271639 | orchestrator | 2026-01-06 03:02:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:23.275829 | orchestrator | 2026-01-06 03:02:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:23.275928 | orchestrator | 2026-01-06 03:02:23 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:02:26.324896 | orchestrator | 2026-01-06 03:02:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:26.326560 | orchestrator | 2026-01-06 03:02:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:26.326603 | orchestrator | 2026-01-06 03:02:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:29.374554 | orchestrator | 2026-01-06 03:02:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:29.377694 | orchestrator | 2026-01-06 03:02:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:29.377843 | orchestrator | 2026-01-06 03:02:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:32.428535 | orchestrator | 2026-01-06 03:02:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:32.430411 | orchestrator | 2026-01-06 03:02:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:32.430447 | orchestrator | 2026-01-06 03:02:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:35.486531 | orchestrator | 2026-01-06 03:02:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:35.487177 | orchestrator | 2026-01-06 03:02:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:35.487633 | orchestrator | 2026-01-06 03:02:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:38.527973 | orchestrator | 2026-01-06 03:02:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:38.529547 | orchestrator | 2026-01-06 03:02:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:38.529617 | orchestrator | 2026-01-06 03:02:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:41.578579 | orchestrator | 2026-01-06 
03:02:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:41.579246 | orchestrator | 2026-01-06 03:02:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:41.579310 | orchestrator | 2026-01-06 03:02:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:44.631626 | orchestrator | 2026-01-06 03:02:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:44.633371 | orchestrator | 2026-01-06 03:02:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:44.633740 | orchestrator | 2026-01-06 03:02:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:47.685166 | orchestrator | 2026-01-06 03:02:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:47.685956 | orchestrator | 2026-01-06 03:02:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:47.686006 | orchestrator | 2026-01-06 03:02:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:50.735359 | orchestrator | 2026-01-06 03:02:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:50.736524 | orchestrator | 2026-01-06 03:02:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:50.736627 | orchestrator | 2026-01-06 03:02:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:53.785252 | orchestrator | 2026-01-06 03:02:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:53.787305 | orchestrator | 2026-01-06 03:02:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:53.787376 | orchestrator | 2026-01-06 03:02:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:56.842883 | orchestrator | 2026-01-06 03:02:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:02:56.843895 | orchestrator | 2026-01-06 03:02:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:56.843961 | orchestrator | 2026-01-06 03:02:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:02:59.886720 | orchestrator | 2026-01-06 03:02:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:02:59.888310 | orchestrator | 2026-01-06 03:02:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:02:59.888355 | orchestrator | 2026-01-06 03:02:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:02.935712 | orchestrator | 2026-01-06 03:03:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:02.938381 | orchestrator | 2026-01-06 03:03:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:02.938448 | orchestrator | 2026-01-06 03:03:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:05.989627 | orchestrator | 2026-01-06 03:03:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:05.991177 | orchestrator | 2026-01-06 03:03:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:05.991236 | orchestrator | 2026-01-06 03:03:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:09.051038 | orchestrator | 2026-01-06 03:03:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:09.051168 | orchestrator | 2026-01-06 03:03:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:09.051272 | orchestrator | 2026-01-06 03:03:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:12.099405 | orchestrator | 2026-01-06 03:03:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:12.101439 | orchestrator | 2026-01-06 03:03:12 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:12.101536 | orchestrator | 2026-01-06 03:03:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:15.148881 | orchestrator | 2026-01-06 03:03:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:15.149865 | orchestrator | 2026-01-06 03:03:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:15.149904 | orchestrator | 2026-01-06 03:03:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:18.200536 | orchestrator | 2026-01-06 03:03:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:18.202506 | orchestrator | 2026-01-06 03:03:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:18.202550 | orchestrator | 2026-01-06 03:03:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:21.255040 | orchestrator | 2026-01-06 03:03:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:21.257037 | orchestrator | 2026-01-06 03:03:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:21.257154 | orchestrator | 2026-01-06 03:03:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:24.311022 | orchestrator | 2026-01-06 03:03:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:24.311767 | orchestrator | 2026-01-06 03:03:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:24.311897 | orchestrator | 2026-01-06 03:03:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:27.369869 | orchestrator | 2026-01-06 03:03:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:27.371532 | orchestrator | 2026-01-06 03:03:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:03:27.371875 | orchestrator | 2026-01-06 03:03:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:30.417326 | orchestrator | 2026-01-06 03:03:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:30.418480 | orchestrator | 2026-01-06 03:03:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:30.418534 | orchestrator | 2026-01-06 03:03:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:33.467672 | orchestrator | 2026-01-06 03:03:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:33.469252 | orchestrator | 2026-01-06 03:03:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:33.469305 | orchestrator | 2026-01-06 03:03:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:36.520219 | orchestrator | 2026-01-06 03:03:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:36.521429 | orchestrator | 2026-01-06 03:03:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:36.521611 | orchestrator | 2026-01-06 03:03:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:39.565522 | orchestrator | 2026-01-06 03:03:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:39.566607 | orchestrator | 2026-01-06 03:03:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:39.566663 | orchestrator | 2026-01-06 03:03:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:42.608559 | orchestrator | 2026-01-06 03:03:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:42.608742 | orchestrator | 2026-01-06 03:03:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:42.608764 | orchestrator | 2026-01-06 03:03:42 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:03:45.654638 | orchestrator | 2026-01-06 03:03:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:45.656106 | orchestrator | 2026-01-06 03:03:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:45.656227 | orchestrator | 2026-01-06 03:03:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:48.701752 | orchestrator | 2026-01-06 03:03:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:48.703356 | orchestrator | 2026-01-06 03:03:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:48.703394 | orchestrator | 2026-01-06 03:03:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:51.752798 | orchestrator | 2026-01-06 03:03:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:51.753988 | orchestrator | 2026-01-06 03:03:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:51.754078 | orchestrator | 2026-01-06 03:03:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:54.803475 | orchestrator | 2026-01-06 03:03:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:54.804632 | orchestrator | 2026-01-06 03:03:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:54.804711 | orchestrator | 2026-01-06 03:03:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:03:57.851247 | orchestrator | 2026-01-06 03:03:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:03:57.852535 | orchestrator | 2026-01-06 03:03:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:03:57.852596 | orchestrator | 2026-01-06 03:03:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:00.899089 | orchestrator | 2026-01-06 
03:04:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:00.900664 | orchestrator | 2026-01-06 03:04:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:00.901195 | orchestrator | 2026-01-06 03:04:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:03.946752 | orchestrator | 2026-01-06 03:04:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:03.948262 | orchestrator | 2026-01-06 03:04:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:03.948279 | orchestrator | 2026-01-06 03:04:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:06.994066 | orchestrator | 2026-01-06 03:04:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:06.994845 | orchestrator | 2026-01-06 03:04:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:06.994970 | orchestrator | 2026-01-06 03:04:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:10.040526 | orchestrator | 2026-01-06 03:04:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:10.042152 | orchestrator | 2026-01-06 03:04:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:10.042277 | orchestrator | 2026-01-06 03:04:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:13.100428 | orchestrator | 2026-01-06 03:04:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:13.100531 | orchestrator | 2026-01-06 03:04:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:13.100616 | orchestrator | 2026-01-06 03:04:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:16.132899 | orchestrator | 2026-01-06 03:04:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:04:16.134153 | orchestrator | 2026-01-06 03:04:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:16.134214 | orchestrator | 2026-01-06 03:04:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:19.174248 | orchestrator | 2026-01-06 03:04:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:19.175417 | orchestrator | 2026-01-06 03:04:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:19.175480 | orchestrator | 2026-01-06 03:04:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:22.220698 | orchestrator | 2026-01-06 03:04:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:22.222257 | orchestrator | 2026-01-06 03:04:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:22.222312 | orchestrator | 2026-01-06 03:04:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:25.263705 | orchestrator | 2026-01-06 03:04:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:25.271413 | orchestrator | 2026-01-06 03:04:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:25.271524 | orchestrator | 2026-01-06 03:04:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:28.315514 | orchestrator | 2026-01-06 03:04:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:28.317934 | orchestrator | 2026-01-06 03:04:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:28.318144 | orchestrator | 2026-01-06 03:04:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:31.369474 | orchestrator | 2026-01-06 03:04:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:31.370421 | orchestrator | 2026-01-06 03:04:31 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:31.370454 | orchestrator | 2026-01-06 03:04:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:34.411690 | orchestrator | 2026-01-06 03:04:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:34.413601 | orchestrator | 2026-01-06 03:04:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:34.413646 | orchestrator | 2026-01-06 03:04:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:37.455354 | orchestrator | 2026-01-06 03:04:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:37.456256 | orchestrator | 2026-01-06 03:04:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:37.456389 | orchestrator | 2026-01-06 03:04:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:40.502760 | orchestrator | 2026-01-06 03:04:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:40.504532 | orchestrator | 2026-01-06 03:04:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:40.504588 | orchestrator | 2026-01-06 03:04:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:43.544847 | orchestrator | 2026-01-06 03:04:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:43.547025 | orchestrator | 2026-01-06 03:04:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:43.547087 | orchestrator | 2026-01-06 03:04:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:46.599697 | orchestrator | 2026-01-06 03:04:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:46.599936 | orchestrator | 2026-01-06 03:04:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:04:46.599959 | orchestrator | 2026-01-06 03:04:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:49.652553 | orchestrator | 2026-01-06 03:04:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:49.653241 | orchestrator | 2026-01-06 03:04:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:49.653279 | orchestrator | 2026-01-06 03:04:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:52.703482 | orchestrator | 2026-01-06 03:04:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:52.704666 | orchestrator | 2026-01-06 03:04:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:52.705116 | orchestrator | 2026-01-06 03:04:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:55.747885 | orchestrator | 2026-01-06 03:04:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:55.748722 | orchestrator | 2026-01-06 03:04:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:55.748788 | orchestrator | 2026-01-06 03:04:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:04:58.792478 | orchestrator | 2026-01-06 03:04:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:04:58.794269 | orchestrator | 2026-01-06 03:04:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:04:58.794539 | orchestrator | 2026-01-06 03:04:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:05:01.845300 | orchestrator | 2026-01-06 03:05:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:05:01.847095 | orchestrator | 2026-01-06 03:05:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:05:01.847453 | orchestrator | 2026-01-06 03:05:01 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:05:04.896560 | orchestrator | 2026-01-06 03:05:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:05:04.896954 | orchestrator | 2026-01-06 03:05:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:05:04.897621 | orchestrator | 2026-01-06 03:05:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:05:07.946610 | orchestrator | 2026-01-06 03:05:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:05:07.948048 | orchestrator | 2026-01-06 03:05:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:05:07.948374 | orchestrator | 2026-01-06 03:05:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:05:10.993486 | orchestrator | 2026-01-06 03:05:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:05:10.994741 | orchestrator | 2026-01-06 03:05:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:05:10.994785 | orchestrator | 2026-01-06 03:05:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:05:14.032001 | orchestrator | 2026-01-06 03:05:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:05:14.032547 | orchestrator | 2026-01-06 03:05:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:05:14.035541 | orchestrator | 2026-01-06 03:05:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:05:17.075288 | orchestrator | 2026-01-06 03:05:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:05:17.077455 | orchestrator | 2026-01-06 03:05:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:05:17.078255 | orchestrator | 2026-01-06 03:05:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:05:20.124697 | orchestrator | 2026-01-06 
03:05:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:05:20.126182 | orchestrator | 2026-01-06 03:05:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:05:20.126276 | orchestrator | 2026-01-06 03:05:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:05:23.191773 | orchestrator | 2026-01-06 03:05:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:05:23.194980 | orchestrator | 2026-01-06 03:05:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:05:23.195089 | orchestrator | 2026-01-06 03:05:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:05:26.247884 | orchestrator | 2026-01-06 03:05:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:05:26.249506 | orchestrator | 2026-01-06 03:05:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:05:26.249567 | orchestrator | 2026-01-06 03:05:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:05:29.291613 | orchestrator | 2026-01-06 03:05:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:05:29.294504 | orchestrator | 2026-01-06 03:05:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:05:29.294649 | orchestrator | 2026-01-06 03:05:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:05:32.339922 | orchestrator | 2026-01-06 03:05:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:05:32.340265 | orchestrator | 2026-01-06 03:05:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:05:32.340289 | orchestrator | 2026-01-06 03:05:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:05:35.385884 | orchestrator | 2026-01-06 03:05:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:05:35.387499 | orchestrator | 2026-01-06 03:05:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 03:05:35.387634 | orchestrator | 2026-01-06 03:05:35 | INFO  | Wait 1 second(s) until the next check
2026-01-06 03:11:07.912640 | orchestrator | 2026-01-06 03:11:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 03:11:07.913192 | orchestrator | 2026-01-06 03:11:07 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:07.913258 | orchestrator | 2026-01-06 03:11:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:10.961855 | orchestrator | 2026-01-06 03:11:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:10.963997 | orchestrator | 2026-01-06 03:11:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:10.964380 | orchestrator | 2026-01-06 03:11:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:14.021662 | orchestrator | 2026-01-06 03:11:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:14.021763 | orchestrator | 2026-01-06 03:11:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:14.021779 | orchestrator | 2026-01-06 03:11:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:17.058474 | orchestrator | 2026-01-06 03:11:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:17.059645 | orchestrator | 2026-01-06 03:11:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:17.059699 | orchestrator | 2026-01-06 03:11:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:20.100214 | orchestrator | 2026-01-06 03:11:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:20.100949 | orchestrator | 2026-01-06 03:11:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:20.101038 | orchestrator | 2026-01-06 03:11:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:23.151475 | orchestrator | 2026-01-06 03:11:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:23.152801 | orchestrator | 2026-01-06 03:11:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:11:23.153033 | orchestrator | 2026-01-06 03:11:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:26.198903 | orchestrator | 2026-01-06 03:11:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:26.200991 | orchestrator | 2026-01-06 03:11:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:26.201234 | orchestrator | 2026-01-06 03:11:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:29.252232 | orchestrator | 2026-01-06 03:11:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:29.255209 | orchestrator | 2026-01-06 03:11:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:29.255286 | orchestrator | 2026-01-06 03:11:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:32.309717 | orchestrator | 2026-01-06 03:11:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:32.310401 | orchestrator | 2026-01-06 03:11:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:32.310428 | orchestrator | 2026-01-06 03:11:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:35.361743 | orchestrator | 2026-01-06 03:11:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:35.363721 | orchestrator | 2026-01-06 03:11:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:35.363823 | orchestrator | 2026-01-06 03:11:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:38.412196 | orchestrator | 2026-01-06 03:11:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:38.414420 | orchestrator | 2026-01-06 03:11:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:38.414499 | orchestrator | 2026-01-06 03:11:38 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:11:41.463836 | orchestrator | 2026-01-06 03:11:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:41.466233 | orchestrator | 2026-01-06 03:11:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:41.466404 | orchestrator | 2026-01-06 03:11:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:44.512278 | orchestrator | 2026-01-06 03:11:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:44.513393 | orchestrator | 2026-01-06 03:11:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:44.513439 | orchestrator | 2026-01-06 03:11:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:47.563184 | orchestrator | 2026-01-06 03:11:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:47.564253 | orchestrator | 2026-01-06 03:11:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:47.564599 | orchestrator | 2026-01-06 03:11:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:50.615772 | orchestrator | 2026-01-06 03:11:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:50.616508 | orchestrator | 2026-01-06 03:11:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:50.616938 | orchestrator | 2026-01-06 03:11:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:53.668368 | orchestrator | 2026-01-06 03:11:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:53.668631 | orchestrator | 2026-01-06 03:11:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:53.668759 | orchestrator | 2026-01-06 03:11:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:56.720716 | orchestrator | 2026-01-06 
03:11:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:56.721948 | orchestrator | 2026-01-06 03:11:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:56.722283 | orchestrator | 2026-01-06 03:11:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:11:59.772876 | orchestrator | 2026-01-06 03:11:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:11:59.773611 | orchestrator | 2026-01-06 03:11:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:11:59.773682 | orchestrator | 2026-01-06 03:11:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:02.819869 | orchestrator | 2026-01-06 03:12:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:02.820588 | orchestrator | 2026-01-06 03:12:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:02.820637 | orchestrator | 2026-01-06 03:12:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:05.865100 | orchestrator | 2026-01-06 03:12:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:05.866671 | orchestrator | 2026-01-06 03:12:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:05.866789 | orchestrator | 2026-01-06 03:12:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:08.911601 | orchestrator | 2026-01-06 03:12:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:08.912855 | orchestrator | 2026-01-06 03:12:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:08.913105 | orchestrator | 2026-01-06 03:12:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:11.962328 | orchestrator | 2026-01-06 03:12:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:12:11.964332 | orchestrator | 2026-01-06 03:12:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:11.964395 | orchestrator | 2026-01-06 03:12:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:15.022724 | orchestrator | 2026-01-06 03:12:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:15.025338 | orchestrator | 2026-01-06 03:12:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:15.025404 | orchestrator | 2026-01-06 03:12:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:18.073246 | orchestrator | 2026-01-06 03:12:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:18.075364 | orchestrator | 2026-01-06 03:12:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:18.076003 | orchestrator | 2026-01-06 03:12:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:21.121033 | orchestrator | 2026-01-06 03:12:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:21.121108 | orchestrator | 2026-01-06 03:12:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:21.121115 | orchestrator | 2026-01-06 03:12:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:24.171608 | orchestrator | 2026-01-06 03:12:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:24.171985 | orchestrator | 2026-01-06 03:12:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:24.172009 | orchestrator | 2026-01-06 03:12:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:27.218855 | orchestrator | 2026-01-06 03:12:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:27.220476 | orchestrator | 2026-01-06 03:12:27 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:27.220565 | orchestrator | 2026-01-06 03:12:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:30.264985 | orchestrator | 2026-01-06 03:12:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:30.266415 | orchestrator | 2026-01-06 03:12:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:30.266474 | orchestrator | 2026-01-06 03:12:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:33.312097 | orchestrator | 2026-01-06 03:12:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:33.314638 | orchestrator | 2026-01-06 03:12:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:33.314706 | orchestrator | 2026-01-06 03:12:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:36.366290 | orchestrator | 2026-01-06 03:12:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:36.366903 | orchestrator | 2026-01-06 03:12:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:36.366940 | orchestrator | 2026-01-06 03:12:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:39.413162 | orchestrator | 2026-01-06 03:12:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:39.414473 | orchestrator | 2026-01-06 03:12:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:39.414559 | orchestrator | 2026-01-06 03:12:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:42.457050 | orchestrator | 2026-01-06 03:12:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:42.457130 | orchestrator | 2026-01-06 03:12:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:12:42.457170 | orchestrator | 2026-01-06 03:12:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:45.504843 | orchestrator | 2026-01-06 03:12:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:45.504975 | orchestrator | 2026-01-06 03:12:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:45.504999 | orchestrator | 2026-01-06 03:12:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:48.556412 | orchestrator | 2026-01-06 03:12:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:48.557252 | orchestrator | 2026-01-06 03:12:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:48.557401 | orchestrator | 2026-01-06 03:12:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:51.606611 | orchestrator | 2026-01-06 03:12:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:51.608400 | orchestrator | 2026-01-06 03:12:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:51.608557 | orchestrator | 2026-01-06 03:12:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:54.662234 | orchestrator | 2026-01-06 03:12:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:54.664652 | orchestrator | 2026-01-06 03:12:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:54.664705 | orchestrator | 2026-01-06 03:12:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:12:57.718844 | orchestrator | 2026-01-06 03:12:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:12:57.719287 | orchestrator | 2026-01-06 03:12:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:12:57.719425 | orchestrator | 2026-01-06 03:12:57 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:13:00.773570 | orchestrator | 2026-01-06 03:13:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:00.775292 | orchestrator | 2026-01-06 03:13:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:00.775355 | orchestrator | 2026-01-06 03:13:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:03.828465 | orchestrator | 2026-01-06 03:13:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:03.829447 | orchestrator | 2026-01-06 03:13:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:03.829491 | orchestrator | 2026-01-06 03:13:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:06.870800 | orchestrator | 2026-01-06 03:13:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:06.870901 | orchestrator | 2026-01-06 03:13:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:06.870981 | orchestrator | 2026-01-06 03:13:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:09.917278 | orchestrator | 2026-01-06 03:13:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:09.917468 | orchestrator | 2026-01-06 03:13:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:09.917488 | orchestrator | 2026-01-06 03:13:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:12.968474 | orchestrator | 2026-01-06 03:13:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:12.969915 | orchestrator | 2026-01-06 03:13:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:12.970186 | orchestrator | 2026-01-06 03:13:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:16.022222 | orchestrator | 2026-01-06 
03:13:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:16.022765 | orchestrator | 2026-01-06 03:13:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:16.022817 | orchestrator | 2026-01-06 03:13:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:19.066381 | orchestrator | 2026-01-06 03:13:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:19.067121 | orchestrator | 2026-01-06 03:13:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:19.067157 | orchestrator | 2026-01-06 03:13:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:22.116288 | orchestrator | 2026-01-06 03:13:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:22.117729 | orchestrator | 2026-01-06 03:13:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:22.117796 | orchestrator | 2026-01-06 03:13:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:25.157307 | orchestrator | 2026-01-06 03:13:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:25.159252 | orchestrator | 2026-01-06 03:13:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:25.159318 | orchestrator | 2026-01-06 03:13:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:28.202137 | orchestrator | 2026-01-06 03:13:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:28.203256 | orchestrator | 2026-01-06 03:13:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:28.203288 | orchestrator | 2026-01-06 03:13:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:31.260391 | orchestrator | 2026-01-06 03:13:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:13:31.261333 | orchestrator | 2026-01-06 03:13:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:31.261394 | orchestrator | 2026-01-06 03:13:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:34.304657 | orchestrator | 2026-01-06 03:13:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:34.305112 | orchestrator | 2026-01-06 03:13:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:34.305137 | orchestrator | 2026-01-06 03:13:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:37.355460 | orchestrator | 2026-01-06 03:13:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:37.357150 | orchestrator | 2026-01-06 03:13:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:37.357272 | orchestrator | 2026-01-06 03:13:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:40.404429 | orchestrator | 2026-01-06 03:13:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:40.408904 | orchestrator | 2026-01-06 03:13:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:40.408993 | orchestrator | 2026-01-06 03:13:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:43.456135 | orchestrator | 2026-01-06 03:13:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:43.456869 | orchestrator | 2026-01-06 03:13:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:43.456902 | orchestrator | 2026-01-06 03:13:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:46.501004 | orchestrator | 2026-01-06 03:13:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:46.501555 | orchestrator | 2026-01-06 03:13:46 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:46.501588 | orchestrator | 2026-01-06 03:13:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:49.549321 | orchestrator | 2026-01-06 03:13:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:49.550348 | orchestrator | 2026-01-06 03:13:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:49.550402 | orchestrator | 2026-01-06 03:13:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:52.601034 | orchestrator | 2026-01-06 03:13:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:52.601581 | orchestrator | 2026-01-06 03:13:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:52.601641 | orchestrator | 2026-01-06 03:13:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:55.650617 | orchestrator | 2026-01-06 03:13:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:55.652325 | orchestrator | 2026-01-06 03:13:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:55.652370 | orchestrator | 2026-01-06 03:13:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:13:58.700101 | orchestrator | 2026-01-06 03:13:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:13:58.702925 | orchestrator | 2026-01-06 03:13:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:13:58.703004 | orchestrator | 2026-01-06 03:13:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:01.753955 | orchestrator | 2026-01-06 03:14:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:01.754356 | orchestrator | 2026-01-06 03:14:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:14:01.754391 | orchestrator | 2026-01-06 03:14:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:04.801641 | orchestrator | 2026-01-06 03:14:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:04.802687 | orchestrator | 2026-01-06 03:14:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:04.802792 | orchestrator | 2026-01-06 03:14:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:07.854502 | orchestrator | 2026-01-06 03:14:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:07.854877 | orchestrator | 2026-01-06 03:14:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:07.855236 | orchestrator | 2026-01-06 03:14:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:10.904992 | orchestrator | 2026-01-06 03:14:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:10.905713 | orchestrator | 2026-01-06 03:14:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:10.905788 | orchestrator | 2026-01-06 03:14:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:13.942748 | orchestrator | 2026-01-06 03:14:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:13.945107 | orchestrator | 2026-01-06 03:14:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:13.945181 | orchestrator | 2026-01-06 03:14:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:16.991966 | orchestrator | 2026-01-06 03:14:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:16.993339 | orchestrator | 2026-01-06 03:14:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:16.993453 | orchestrator | 2026-01-06 03:14:16 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:14:20.041450 | orchestrator | 2026-01-06 03:14:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:20.042657 | orchestrator | 2026-01-06 03:14:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:20.042709 | orchestrator | 2026-01-06 03:14:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:23.092924 | orchestrator | 2026-01-06 03:14:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:23.093969 | orchestrator | 2026-01-06 03:14:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:23.094104 | orchestrator | 2026-01-06 03:14:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:26.140856 | orchestrator | 2026-01-06 03:14:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:26.142349 | orchestrator | 2026-01-06 03:14:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:26.142398 | orchestrator | 2026-01-06 03:14:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:29.190160 | orchestrator | 2026-01-06 03:14:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:29.191529 | orchestrator | 2026-01-06 03:14:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:29.191763 | orchestrator | 2026-01-06 03:14:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:32.237060 | orchestrator | 2026-01-06 03:14:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:32.238332 | orchestrator | 2026-01-06 03:14:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:32.238391 | orchestrator | 2026-01-06 03:14:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:35.278150 | orchestrator | 2026-01-06 
03:14:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:35.279156 | orchestrator | 2026-01-06 03:14:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:35.279219 | orchestrator | 2026-01-06 03:14:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:38.319762 | orchestrator | 2026-01-06 03:14:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:38.321080 | orchestrator | 2026-01-06 03:14:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:38.321218 | orchestrator | 2026-01-06 03:14:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:41.368247 | orchestrator | 2026-01-06 03:14:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:41.369729 | orchestrator | 2026-01-06 03:14:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:41.369784 | orchestrator | 2026-01-06 03:14:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:44.417303 | orchestrator | 2026-01-06 03:14:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:44.418386 | orchestrator | 2026-01-06 03:14:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:44.418801 | orchestrator | 2026-01-06 03:14:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:47.463641 | orchestrator | 2026-01-06 03:14:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:14:47.465101 | orchestrator | 2026-01-06 03:14:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:14:47.465146 | orchestrator | 2026-01-06 03:14:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:14:50.516072 | orchestrator | 2026-01-06 03:14:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:14:50.517381 | orchestrator | 2026-01-06 03:14:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 03:14:50.517452 | orchestrator | 2026-01-06 03:14:50 | INFO  | Wait 1 second(s) until the next check
2026-01-06 03:14:53.566415 | orchestrator | 2026-01-06 03:14:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 03:14:53.567699 | orchestrator | 2026-01-06 03:14:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 03:14:53.567746 | orchestrator | 2026-01-06 03:14:53 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 s from 03:14:56 through 03:20:01: tasks de94e18a-e5f1-4cb3-816d-70e021913dd3 and 7b21748d-c57a-4b51-995c-2c6f5f088b85 both remain in state STARTED throughout ...]
2026-01-06 03:20:04.672357 | orchestrator | 2026-01-06 03:20:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 03:20:04.673980 | orchestrator | 2026-01-06 03:20:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 03:20:04.674418 | orchestrator | 2026-01-06 03:20:04 | INFO  | Wait 1 second(s) until the next check
2026-01-06 03:20:07.726261 | orchestrator | 2026-01-06 03:20:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state
STARTED 2026-01-06 03:20:07.727706 | orchestrator | 2026-01-06 03:20:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:07.727748 | orchestrator | 2026-01-06 03:20:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:10.773645 | orchestrator | 2026-01-06 03:20:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:10.775054 | orchestrator | 2026-01-06 03:20:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:10.775099 | orchestrator | 2026-01-06 03:20:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:13.822168 | orchestrator | 2026-01-06 03:20:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:13.822377 | orchestrator | 2026-01-06 03:20:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:13.822582 | orchestrator | 2026-01-06 03:20:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:16.871261 | orchestrator | 2026-01-06 03:20:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:16.871937 | orchestrator | 2026-01-06 03:20:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:16.871971 | orchestrator | 2026-01-06 03:20:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:19.924930 | orchestrator | 2026-01-06 03:20:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:19.925973 | orchestrator | 2026-01-06 03:20:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:19.926009 | orchestrator | 2026-01-06 03:20:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:22.970098 | orchestrator | 2026-01-06 03:20:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:22.973050 | orchestrator | 2026-01-06 03:20:22 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:22.973128 | orchestrator | 2026-01-06 03:20:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:26.022623 | orchestrator | 2026-01-06 03:20:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:26.023912 | orchestrator | 2026-01-06 03:20:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:26.024191 | orchestrator | 2026-01-06 03:20:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:29.078114 | orchestrator | 2026-01-06 03:20:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:29.078661 | orchestrator | 2026-01-06 03:20:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:29.078733 | orchestrator | 2026-01-06 03:20:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:32.127805 | orchestrator | 2026-01-06 03:20:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:32.129968 | orchestrator | 2026-01-06 03:20:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:32.130363 | orchestrator | 2026-01-06 03:20:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:35.184851 | orchestrator | 2026-01-06 03:20:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:35.187124 | orchestrator | 2026-01-06 03:20:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:35.187236 | orchestrator | 2026-01-06 03:20:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:38.242231 | orchestrator | 2026-01-06 03:20:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:38.243042 | orchestrator | 2026-01-06 03:20:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:20:38.243083 | orchestrator | 2026-01-06 03:20:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:41.294650 | orchestrator | 2026-01-06 03:20:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:41.296373 | orchestrator | 2026-01-06 03:20:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:41.296422 | orchestrator | 2026-01-06 03:20:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:44.346237 | orchestrator | 2026-01-06 03:20:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:44.350489 | orchestrator | 2026-01-06 03:20:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:44.350689 | orchestrator | 2026-01-06 03:20:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:47.402284 | orchestrator | 2026-01-06 03:20:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:47.402860 | orchestrator | 2026-01-06 03:20:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:47.402973 | orchestrator | 2026-01-06 03:20:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:50.459007 | orchestrator | 2026-01-06 03:20:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:50.460281 | orchestrator | 2026-01-06 03:20:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:50.460563 | orchestrator | 2026-01-06 03:20:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:53.507757 | orchestrator | 2026-01-06 03:20:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:53.510600 | orchestrator | 2026-01-06 03:20:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:53.510768 | orchestrator | 2026-01-06 03:20:53 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:20:56.556893 | orchestrator | 2026-01-06 03:20:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:56.557828 | orchestrator | 2026-01-06 03:20:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:56.557860 | orchestrator | 2026-01-06 03:20:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:20:59.608432 | orchestrator | 2026-01-06 03:20:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:20:59.609347 | orchestrator | 2026-01-06 03:20:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:20:59.609389 | orchestrator | 2026-01-06 03:20:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:02.658969 | orchestrator | 2026-01-06 03:21:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:02.659380 | orchestrator | 2026-01-06 03:21:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:02.660060 | orchestrator | 2026-01-06 03:21:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:05.714796 | orchestrator | 2026-01-06 03:21:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:05.715418 | orchestrator | 2026-01-06 03:21:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:05.715697 | orchestrator | 2026-01-06 03:21:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:08.762593 | orchestrator | 2026-01-06 03:21:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:08.763892 | orchestrator | 2026-01-06 03:21:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:08.763942 | orchestrator | 2026-01-06 03:21:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:11.812441 | orchestrator | 2026-01-06 
03:21:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:11.814395 | orchestrator | 2026-01-06 03:21:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:11.814548 | orchestrator | 2026-01-06 03:21:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:14.860685 | orchestrator | 2026-01-06 03:21:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:14.862143 | orchestrator | 2026-01-06 03:21:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:14.862187 | orchestrator | 2026-01-06 03:21:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:17.914187 | orchestrator | 2026-01-06 03:21:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:17.914317 | orchestrator | 2026-01-06 03:21:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:17.914345 | orchestrator | 2026-01-06 03:21:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:20.964163 | orchestrator | 2026-01-06 03:21:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:20.966985 | orchestrator | 2026-01-06 03:21:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:20.967061 | orchestrator | 2026-01-06 03:21:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:24.018458 | orchestrator | 2026-01-06 03:21:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:24.024153 | orchestrator | 2026-01-06 03:21:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:24.024287 | orchestrator | 2026-01-06 03:21:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:27.081637 | orchestrator | 2026-01-06 03:21:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:21:27.082890 | orchestrator | 2026-01-06 03:21:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:27.082929 | orchestrator | 2026-01-06 03:21:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:30.128316 | orchestrator | 2026-01-06 03:21:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:30.130094 | orchestrator | 2026-01-06 03:21:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:30.130135 | orchestrator | 2026-01-06 03:21:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:33.187399 | orchestrator | 2026-01-06 03:21:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:33.189688 | orchestrator | 2026-01-06 03:21:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:33.189754 | orchestrator | 2026-01-06 03:21:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:36.232961 | orchestrator | 2026-01-06 03:21:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:36.234878 | orchestrator | 2026-01-06 03:21:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:36.235021 | orchestrator | 2026-01-06 03:21:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:39.289598 | orchestrator | 2026-01-06 03:21:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:39.291743 | orchestrator | 2026-01-06 03:21:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:39.291851 | orchestrator | 2026-01-06 03:21:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:42.334352 | orchestrator | 2026-01-06 03:21:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:42.335623 | orchestrator | 2026-01-06 03:21:42 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:42.335684 | orchestrator | 2026-01-06 03:21:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:45.380585 | orchestrator | 2026-01-06 03:21:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:45.381901 | orchestrator | 2026-01-06 03:21:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:45.381929 | orchestrator | 2026-01-06 03:21:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:48.431181 | orchestrator | 2026-01-06 03:21:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:48.433780 | orchestrator | 2026-01-06 03:21:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:48.435456 | orchestrator | 2026-01-06 03:21:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:51.484467 | orchestrator | 2026-01-06 03:21:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:51.486074 | orchestrator | 2026-01-06 03:21:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:51.486104 | orchestrator | 2026-01-06 03:21:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:54.543202 | orchestrator | 2026-01-06 03:21:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:54.544887 | orchestrator | 2026-01-06 03:21:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:21:54.544978 | orchestrator | 2026-01-06 03:21:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:21:57.594325 | orchestrator | 2026-01-06 03:21:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:21:57.595559 | orchestrator | 2026-01-06 03:21:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:21:57.595613 | orchestrator | 2026-01-06 03:21:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:00.650259 | orchestrator | 2026-01-06 03:22:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:00.652820 | orchestrator | 2026-01-06 03:22:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:00.652865 | orchestrator | 2026-01-06 03:22:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:03.701983 | orchestrator | 2026-01-06 03:22:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:03.705739 | orchestrator | 2026-01-06 03:22:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:03.705927 | orchestrator | 2026-01-06 03:22:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:06.761032 | orchestrator | 2026-01-06 03:22:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:06.762656 | orchestrator | 2026-01-06 03:22:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:06.762767 | orchestrator | 2026-01-06 03:22:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:09.814306 | orchestrator | 2026-01-06 03:22:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:09.817300 | orchestrator | 2026-01-06 03:22:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:09.817361 | orchestrator | 2026-01-06 03:22:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:12.866920 | orchestrator | 2026-01-06 03:22:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:12.869380 | orchestrator | 2026-01-06 03:22:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:12.869499 | orchestrator | 2026-01-06 03:22:12 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:22:15.918709 | orchestrator | 2026-01-06 03:22:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:15.919040 | orchestrator | 2026-01-06 03:22:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:15.919066 | orchestrator | 2026-01-06 03:22:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:18.970012 | orchestrator | 2026-01-06 03:22:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:18.970678 | orchestrator | 2026-01-06 03:22:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:18.970717 | orchestrator | 2026-01-06 03:22:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:22.022411 | orchestrator | 2026-01-06 03:22:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:22.022535 | orchestrator | 2026-01-06 03:22:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:22.022548 | orchestrator | 2026-01-06 03:22:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:25.070466 | orchestrator | 2026-01-06 03:22:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:25.072669 | orchestrator | 2026-01-06 03:22:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:25.072726 | orchestrator | 2026-01-06 03:22:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:28.116181 | orchestrator | 2026-01-06 03:22:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:28.117998 | orchestrator | 2026-01-06 03:22:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:28.118133 | orchestrator | 2026-01-06 03:22:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:31.170787 | orchestrator | 2026-01-06 
03:22:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:31.176739 | orchestrator | 2026-01-06 03:22:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:31.176829 | orchestrator | 2026-01-06 03:22:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:34.219940 | orchestrator | 2026-01-06 03:22:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:34.221078 | orchestrator | 2026-01-06 03:22:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:34.221525 | orchestrator | 2026-01-06 03:22:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:37.274646 | orchestrator | 2026-01-06 03:22:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:37.275822 | orchestrator | 2026-01-06 03:22:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:37.275865 | orchestrator | 2026-01-06 03:22:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:40.323841 | orchestrator | 2026-01-06 03:22:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:40.324355 | orchestrator | 2026-01-06 03:22:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:40.324385 | orchestrator | 2026-01-06 03:22:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:43.383772 | orchestrator | 2026-01-06 03:22:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:43.384933 | orchestrator | 2026-01-06 03:22:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:43.385452 | orchestrator | 2026-01-06 03:22:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:46.431359 | orchestrator | 2026-01-06 03:22:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:22:46.432149 | orchestrator | 2026-01-06 03:22:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:46.432179 | orchestrator | 2026-01-06 03:22:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:49.480742 | orchestrator | 2026-01-06 03:22:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:49.483971 | orchestrator | 2026-01-06 03:22:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:49.484085 | orchestrator | 2026-01-06 03:22:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:52.534390 | orchestrator | 2026-01-06 03:22:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:52.536958 | orchestrator | 2026-01-06 03:22:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:52.537022 | orchestrator | 2026-01-06 03:22:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:55.581228 | orchestrator | 2026-01-06 03:22:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:55.584071 | orchestrator | 2026-01-06 03:22:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:55.584189 | orchestrator | 2026-01-06 03:22:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:22:58.628258 | orchestrator | 2026-01-06 03:22:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:22:58.629136 | orchestrator | 2026-01-06 03:22:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:22:58.629211 | orchestrator | 2026-01-06 03:22:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:23:01.678623 | orchestrator | 2026-01-06 03:23:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:23:01.681129 | orchestrator | 2026-01-06 03:23:01 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:23:01.681207 | orchestrator | 2026-01-06 03:23:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:23:04.733717 | orchestrator | 2026-01-06 03:23:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:23:04.735336 | orchestrator | 2026-01-06 03:23:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:23:04.735888 | orchestrator | 2026-01-06 03:23:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:23:07.788353 | orchestrator | 2026-01-06 03:23:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:23:07.790800 | orchestrator | 2026-01-06 03:23:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:23:07.790875 | orchestrator | 2026-01-06 03:23:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:23:10.842651 | orchestrator | 2026-01-06 03:23:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:23:10.844126 | orchestrator | 2026-01-06 03:23:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:23:10.844419 | orchestrator | 2026-01-06 03:23:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:23:13.901082 | orchestrator | 2026-01-06 03:23:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:23:13.902664 | orchestrator | 2026-01-06 03:23:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:23:13.902717 | orchestrator | 2026-01-06 03:23:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:23:16.953973 | orchestrator | 2026-01-06 03:23:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:23:16.955377 | orchestrator | 2026-01-06 03:23:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:23:16.955437 | orchestrator | 2026-01-06 03:23:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:23:20.015620 | orchestrator | 2026-01-06 03:23:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:23:20.018636 | orchestrator | 2026-01-06 03:23:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:23:20.018715 | orchestrator | 2026-01-06 03:23:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:23:23.069258 | orchestrator | 2026-01-06 03:23:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:23:23.070772 | orchestrator | 2026-01-06 03:23:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:23:23.070901 | orchestrator | 2026-01-06 03:23:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:23:26.119330 | orchestrator | 2026-01-06 03:23:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:23:26.121852 | orchestrator | 2026-01-06 03:23:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:23:26.121918 | orchestrator | 2026-01-06 03:23:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:23:29.176813 | orchestrator | 2026-01-06 03:23:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:23:29.178162 | orchestrator | 2026-01-06 03:23:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:23:29.178217 | orchestrator | 2026-01-06 03:23:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:23:32.229187 | orchestrator | 2026-01-06 03:23:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:23:32.231711 | orchestrator | 2026-01-06 03:23:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:23:32.231790 | orchestrator | 2026-01-06 03:23:32 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:23:35.281546 | orchestrator | 2026-01-06 03:23:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:23:35.283111 | orchestrator | 2026-01-06 03:23:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:23:35.283168 | orchestrator | 2026-01-06 03:23:35 | INFO  | Wait 1 second(s) until the next check
[identical polling output repeated every ~3 s from 03:23:38 to 03:28:46: both tasks (de94e18a-e5f1-4cb3-816d-70e021913dd3 and 7b21748d-c57a-4b51-995c-2c6f5f088b85) remained in state STARTED throughout, each cycle followed by "Wait 1 second(s) until the next check"]
2026-01-06 03:28:49.545792 | orchestrator | 2026-01-06 03:28:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:28:49.547974 | orchestrator | 2026-01-06 03:28:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:28:49.548019 | orchestrator | 2026-01-06 03:28:49 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:28:52.597974 | orchestrator | 2026-01-06 03:28:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:28:52.599393 | orchestrator | 2026-01-06 03:28:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:28:52.599457 | orchestrator | 2026-01-06 03:28:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:28:55.646405 | orchestrator | 2026-01-06 03:28:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:28:55.648465 | orchestrator | 2026-01-06 03:28:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:28:55.648618 | orchestrator | 2026-01-06 03:28:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:28:58.688748 | orchestrator | 2026-01-06 03:28:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:28:58.690152 | orchestrator | 2026-01-06 03:28:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:28:58.690842 | orchestrator | 2026-01-06 03:28:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:01.739137 | orchestrator | 2026-01-06 03:29:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:01.740869 | orchestrator | 2026-01-06 03:29:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:01.740921 | orchestrator | 2026-01-06 03:29:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:04.788057 | orchestrator | 2026-01-06 03:29:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:04.789324 | orchestrator | 2026-01-06 03:29:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:04.789501 | orchestrator | 2026-01-06 03:29:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:07.834790 | orchestrator | 2026-01-06 
03:29:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:07.836001 | orchestrator | 2026-01-06 03:29:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:07.836039 | orchestrator | 2026-01-06 03:29:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:10.881735 | orchestrator | 2026-01-06 03:29:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:10.883406 | orchestrator | 2026-01-06 03:29:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:10.883558 | orchestrator | 2026-01-06 03:29:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:13.925149 | orchestrator | 2026-01-06 03:29:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:13.927016 | orchestrator | 2026-01-06 03:29:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:13.927052 | orchestrator | 2026-01-06 03:29:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:16.974670 | orchestrator | 2026-01-06 03:29:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:16.976290 | orchestrator | 2026-01-06 03:29:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:16.976340 | orchestrator | 2026-01-06 03:29:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:20.024239 | orchestrator | 2026-01-06 03:29:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:20.026328 | orchestrator | 2026-01-06 03:29:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:20.026370 | orchestrator | 2026-01-06 03:29:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:23.078573 | orchestrator | 2026-01-06 03:29:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:29:23.079646 | orchestrator | 2026-01-06 03:29:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:23.079763 | orchestrator | 2026-01-06 03:29:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:26.121988 | orchestrator | 2026-01-06 03:29:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:26.124076 | orchestrator | 2026-01-06 03:29:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:26.124210 | orchestrator | 2026-01-06 03:29:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:29.178796 | orchestrator | 2026-01-06 03:29:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:29.180749 | orchestrator | 2026-01-06 03:29:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:29.180796 | orchestrator | 2026-01-06 03:29:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:32.231956 | orchestrator | 2026-01-06 03:29:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:32.233176 | orchestrator | 2026-01-06 03:29:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:32.233252 | orchestrator | 2026-01-06 03:29:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:35.279920 | orchestrator | 2026-01-06 03:29:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:35.280854 | orchestrator | 2026-01-06 03:29:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:35.281054 | orchestrator | 2026-01-06 03:29:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:38.328556 | orchestrator | 2026-01-06 03:29:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:38.330771 | orchestrator | 2026-01-06 03:29:38 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:38.330874 | orchestrator | 2026-01-06 03:29:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:41.375631 | orchestrator | 2026-01-06 03:29:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:41.377369 | orchestrator | 2026-01-06 03:29:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:41.377434 | orchestrator | 2026-01-06 03:29:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:44.422596 | orchestrator | 2026-01-06 03:29:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:44.424404 | orchestrator | 2026-01-06 03:29:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:44.424553 | orchestrator | 2026-01-06 03:29:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:47.467193 | orchestrator | 2026-01-06 03:29:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:47.469764 | orchestrator | 2026-01-06 03:29:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:47.469826 | orchestrator | 2026-01-06 03:29:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:50.518610 | orchestrator | 2026-01-06 03:29:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:50.520772 | orchestrator | 2026-01-06 03:29:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:50.520870 | orchestrator | 2026-01-06 03:29:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:53.571971 | orchestrator | 2026-01-06 03:29:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:53.572739 | orchestrator | 2026-01-06 03:29:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:29:53.572914 | orchestrator | 2026-01-06 03:29:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:56.622313 | orchestrator | 2026-01-06 03:29:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:56.624830 | orchestrator | 2026-01-06 03:29:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:56.624912 | orchestrator | 2026-01-06 03:29:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:29:59.674218 | orchestrator | 2026-01-06 03:29:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:29:59.676469 | orchestrator | 2026-01-06 03:29:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:29:59.676569 | orchestrator | 2026-01-06 03:29:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:02.725059 | orchestrator | 2026-01-06 03:30:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:02.726755 | orchestrator | 2026-01-06 03:30:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:02.726969 | orchestrator | 2026-01-06 03:30:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:05.776951 | orchestrator | 2026-01-06 03:30:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:05.778283 | orchestrator | 2026-01-06 03:30:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:05.778752 | orchestrator | 2026-01-06 03:30:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:08.824979 | orchestrator | 2026-01-06 03:30:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:08.827342 | orchestrator | 2026-01-06 03:30:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:08.827433 | orchestrator | 2026-01-06 03:30:08 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:30:11.869874 | orchestrator | 2026-01-06 03:30:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:11.872484 | orchestrator | 2026-01-06 03:30:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:11.872546 | orchestrator | 2026-01-06 03:30:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:14.922586 | orchestrator | 2026-01-06 03:30:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:14.923091 | orchestrator | 2026-01-06 03:30:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:14.923189 | orchestrator | 2026-01-06 03:30:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:17.978451 | orchestrator | 2026-01-06 03:30:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:17.978783 | orchestrator | 2026-01-06 03:30:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:17.978819 | orchestrator | 2026-01-06 03:30:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:21.025606 | orchestrator | 2026-01-06 03:30:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:21.026406 | orchestrator | 2026-01-06 03:30:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:21.026501 | orchestrator | 2026-01-06 03:30:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:24.083912 | orchestrator | 2026-01-06 03:30:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:24.086664 | orchestrator | 2026-01-06 03:30:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:24.088214 | orchestrator | 2026-01-06 03:30:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:27.134704 | orchestrator | 2026-01-06 
03:30:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:27.136221 | orchestrator | 2026-01-06 03:30:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:27.136333 | orchestrator | 2026-01-06 03:30:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:30.181984 | orchestrator | 2026-01-06 03:30:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:30.182646 | orchestrator | 2026-01-06 03:30:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:30.182674 | orchestrator | 2026-01-06 03:30:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:33.232339 | orchestrator | 2026-01-06 03:30:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:33.233202 | orchestrator | 2026-01-06 03:30:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:33.233297 | orchestrator | 2026-01-06 03:30:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:36.285437 | orchestrator | 2026-01-06 03:30:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:36.285942 | orchestrator | 2026-01-06 03:30:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:36.285986 | orchestrator | 2026-01-06 03:30:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:39.328747 | orchestrator | 2026-01-06 03:30:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:39.328933 | orchestrator | 2026-01-06 03:30:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:39.328954 | orchestrator | 2026-01-06 03:30:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:42.370726 | orchestrator | 2026-01-06 03:30:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:30:42.371602 | orchestrator | 2026-01-06 03:30:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:42.371627 | orchestrator | 2026-01-06 03:30:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:45.421395 | orchestrator | 2026-01-06 03:30:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:45.423103 | orchestrator | 2026-01-06 03:30:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:45.423206 | orchestrator | 2026-01-06 03:30:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:48.471442 | orchestrator | 2026-01-06 03:30:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:48.472184 | orchestrator | 2026-01-06 03:30:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:48.472218 | orchestrator | 2026-01-06 03:30:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:51.515673 | orchestrator | 2026-01-06 03:30:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:51.517181 | orchestrator | 2026-01-06 03:30:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:51.517215 | orchestrator | 2026-01-06 03:30:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:54.565636 | orchestrator | 2026-01-06 03:30:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:54.567766 | orchestrator | 2026-01-06 03:30:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:54.567865 | orchestrator | 2026-01-06 03:30:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:30:57.613823 | orchestrator | 2026-01-06 03:30:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:30:57.615305 | orchestrator | 2026-01-06 03:30:57 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:30:57.615357 | orchestrator | 2026-01-06 03:30:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:00.663000 | orchestrator | 2026-01-06 03:31:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:00.666415 | orchestrator | 2026-01-06 03:31:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:00.666475 | orchestrator | 2026-01-06 03:31:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:03.715295 | orchestrator | 2026-01-06 03:31:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:03.716642 | orchestrator | 2026-01-06 03:31:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:03.716834 | orchestrator | 2026-01-06 03:31:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:06.768487 | orchestrator | 2026-01-06 03:31:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:06.769129 | orchestrator | 2026-01-06 03:31:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:06.769290 | orchestrator | 2026-01-06 03:31:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:09.823186 | orchestrator | 2026-01-06 03:31:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:09.824000 | orchestrator | 2026-01-06 03:31:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:09.824060 | orchestrator | 2026-01-06 03:31:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:12.873904 | orchestrator | 2026-01-06 03:31:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:12.875036 | orchestrator | 2026-01-06 03:31:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:31:12.875161 | orchestrator | 2026-01-06 03:31:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:15.925710 | orchestrator | 2026-01-06 03:31:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:15.926529 | orchestrator | 2026-01-06 03:31:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:15.926610 | orchestrator | 2026-01-06 03:31:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:18.973199 | orchestrator | 2026-01-06 03:31:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:18.974791 | orchestrator | 2026-01-06 03:31:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:18.974850 | orchestrator | 2026-01-06 03:31:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:22.013747 | orchestrator | 2026-01-06 03:31:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:22.015182 | orchestrator | 2026-01-06 03:31:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:22.015325 | orchestrator | 2026-01-06 03:31:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:25.057792 | orchestrator | 2026-01-06 03:31:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:25.060351 | orchestrator | 2026-01-06 03:31:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:25.060374 | orchestrator | 2026-01-06 03:31:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:28.106889 | orchestrator | 2026-01-06 03:31:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:28.108852 | orchestrator | 2026-01-06 03:31:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:28.108888 | orchestrator | 2026-01-06 03:31:28 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:31:31.158854 | orchestrator | 2026-01-06 03:31:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:31.159781 | orchestrator | 2026-01-06 03:31:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:31.159822 | orchestrator | 2026-01-06 03:31:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:34.209522 | orchestrator | 2026-01-06 03:31:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:34.211620 | orchestrator | 2026-01-06 03:31:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:34.211711 | orchestrator | 2026-01-06 03:31:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:37.266455 | orchestrator | 2026-01-06 03:31:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:37.268758 | orchestrator | 2026-01-06 03:31:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:37.268830 | orchestrator | 2026-01-06 03:31:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:40.317998 | orchestrator | 2026-01-06 03:31:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:40.319046 | orchestrator | 2026-01-06 03:31:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:40.319147 | orchestrator | 2026-01-06 03:31:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:43.370495 | orchestrator | 2026-01-06 03:31:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:43.370746 | orchestrator | 2026-01-06 03:31:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:43.370779 | orchestrator | 2026-01-06 03:31:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:46.416427 | orchestrator | 2026-01-06 
03:31:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:46.419731 | orchestrator | 2026-01-06 03:31:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:46.419783 | orchestrator | 2026-01-06 03:31:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:49.467688 | orchestrator | 2026-01-06 03:31:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:49.469838 | orchestrator | 2026-01-06 03:31:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:49.469935 | orchestrator | 2026-01-06 03:31:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:52.513220 | orchestrator | 2026-01-06 03:31:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:52.513927 | orchestrator | 2026-01-06 03:31:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:52.513970 | orchestrator | 2026-01-06 03:31:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:55.562665 | orchestrator | 2026-01-06 03:31:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:55.563188 | orchestrator | 2026-01-06 03:31:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:55.563538 | orchestrator | 2026-01-06 03:31:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:31:58.608236 | orchestrator | 2026-01-06 03:31:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:31:58.612340 | orchestrator | 2026-01-06 03:31:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:31:58.612450 | orchestrator | 2026-01-06 03:31:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:32:01.663630 | orchestrator | 2026-01-06 03:32:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:32:01.665704 | orchestrator | 2026-01-06 03:32:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:32:01.665782 | orchestrator | 2026-01-06 03:32:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:32:04.719901 | orchestrator | 2026-01-06 03:32:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:32:04.720973 | orchestrator | 2026-01-06 03:32:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:32:04.721179 | orchestrator | 2026-01-06 03:32:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:32:07.766334 | orchestrator | 2026-01-06 03:32:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:32:07.766941 | orchestrator | 2026-01-06 03:32:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:32:07.766981 | orchestrator | 2026-01-06 03:32:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:32:10.818076 | orchestrator | 2026-01-06 03:32:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:32:10.819712 | orchestrator | 2026-01-06 03:32:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:32:10.819735 | orchestrator | 2026-01-06 03:32:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:32:13.871766 | orchestrator | 2026-01-06 03:32:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:32:13.872627 | orchestrator | 2026-01-06 03:32:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:32:13.872711 | orchestrator | 2026-01-06 03:32:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:32:16.923539 | orchestrator | 2026-01-06 03:32:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:32:16.925112 | orchestrator | 2026-01-06 03:32:16 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:32:16.925149 | orchestrator | 2026-01-06 03:32:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:32:19.974340 | orchestrator | 2026-01-06 03:32:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:32:19.976413 | orchestrator | 2026-01-06 03:32:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:32:19.976473 | orchestrator | 2026-01-06 03:32:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:32:23.022447 | orchestrator | 2026-01-06 03:32:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:32:23.023476 | orchestrator | 2026-01-06 03:32:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:32:23.023528 | orchestrator | 2026-01-06 03:32:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:32:26.081287 | orchestrator | 2026-01-06 03:32:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:32:26.082995 | orchestrator | 2026-01-06 03:32:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:32:26.083086 | orchestrator | 2026-01-06 03:32:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:32:29.130113 | orchestrator | 2026-01-06 03:32:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:32:29.132106 | orchestrator | 2026-01-06 03:32:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:32:29.132259 | orchestrator | 2026-01-06 03:32:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:32:32.178116 | orchestrator | 2026-01-06 03:32:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:32:32.179718 | orchestrator | 2026-01-06 03:32:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:32:32.179796 | orchestrator | 2026-01-06 03:32:32 | INFO  | Wait 1 second(s) until the next check
2026-01-06 03:32:35.227586 | orchestrator | 2026-01-06 03:32:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 03:32:35.229225 | orchestrator | 2026-01-06 03:32:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 03:32:35.229316 | orchestrator | 2026-01-06 03:32:35 | INFO  | Wait 1 second(s) until the next check
[... identical polling entries repeated every ~3 seconds from 03:32:38 through 03:38:01; tasks de94e18a-e5f1-4cb3-816d-70e021913dd3 and 7b21748d-c57a-4b51-995c-2c6f5f088b85 remained in state STARTED throughout ...]
2026-01-06 03:38:04.710269 | orchestrator | 2026-01-06 03:38:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 03:38:04.711701 | orchestrator | 2026-01-06 03:38:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 03:38:04.711750 | orchestrator | 2026-01-06 03:38:04 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:38:07.757708 | orchestrator | 2026-01-06 03:38:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:07.761120 | orchestrator | 2026-01-06 03:38:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:07.761155 | orchestrator | 2026-01-06 03:38:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:10.810411 | orchestrator | 2026-01-06 03:38:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:10.814309 | orchestrator | 2026-01-06 03:38:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:10.814442 | orchestrator | 2026-01-06 03:38:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:13.863053 | orchestrator | 2026-01-06 03:38:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:13.864160 | orchestrator | 2026-01-06 03:38:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:13.864212 | orchestrator | 2026-01-06 03:38:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:16.909309 | orchestrator | 2026-01-06 03:38:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:16.910750 | orchestrator | 2026-01-06 03:38:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:16.910805 | orchestrator | 2026-01-06 03:38:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:19.952994 | orchestrator | 2026-01-06 03:38:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:19.954010 | orchestrator | 2026-01-06 03:38:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:19.954151 | orchestrator | 2026-01-06 03:38:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:23.001660 | orchestrator | 2026-01-06 
03:38:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:23.002835 | orchestrator | 2026-01-06 03:38:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:23.003078 | orchestrator | 2026-01-06 03:38:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:26.050223 | orchestrator | 2026-01-06 03:38:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:26.052034 | orchestrator | 2026-01-06 03:38:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:26.052079 | orchestrator | 2026-01-06 03:38:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:29.095595 | orchestrator | 2026-01-06 03:38:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:29.097781 | orchestrator | 2026-01-06 03:38:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:29.097816 | orchestrator | 2026-01-06 03:38:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:32.148244 | orchestrator | 2026-01-06 03:38:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:32.149567 | orchestrator | 2026-01-06 03:38:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:32.149893 | orchestrator | 2026-01-06 03:38:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:35.194007 | orchestrator | 2026-01-06 03:38:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:35.196082 | orchestrator | 2026-01-06 03:38:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:35.196130 | orchestrator | 2026-01-06 03:38:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:38.240541 | orchestrator | 2026-01-06 03:38:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:38:38.242277 | orchestrator | 2026-01-06 03:38:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:38.242313 | orchestrator | 2026-01-06 03:38:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:41.286897 | orchestrator | 2026-01-06 03:38:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:41.289106 | orchestrator | 2026-01-06 03:38:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:41.289484 | orchestrator | 2026-01-06 03:38:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:44.339468 | orchestrator | 2026-01-06 03:38:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:44.340317 | orchestrator | 2026-01-06 03:38:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:44.340412 | orchestrator | 2026-01-06 03:38:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:47.388066 | orchestrator | 2026-01-06 03:38:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:47.389506 | orchestrator | 2026-01-06 03:38:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:47.389548 | orchestrator | 2026-01-06 03:38:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:50.430106 | orchestrator | 2026-01-06 03:38:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:50.430349 | orchestrator | 2026-01-06 03:38:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:50.430374 | orchestrator | 2026-01-06 03:38:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:53.477271 | orchestrator | 2026-01-06 03:38:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:53.479539 | orchestrator | 2026-01-06 03:38:53 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:53.479761 | orchestrator | 2026-01-06 03:38:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:56.529205 | orchestrator | 2026-01-06 03:38:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:56.531609 | orchestrator | 2026-01-06 03:38:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:56.531666 | orchestrator | 2026-01-06 03:38:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:38:59.583234 | orchestrator | 2026-01-06 03:38:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:38:59.585546 | orchestrator | 2026-01-06 03:38:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:38:59.585598 | orchestrator | 2026-01-06 03:38:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:02.633252 | orchestrator | 2026-01-06 03:39:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:02.634130 | orchestrator | 2026-01-06 03:39:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:02.634198 | orchestrator | 2026-01-06 03:39:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:05.682375 | orchestrator | 2026-01-06 03:39:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:05.683799 | orchestrator | 2026-01-06 03:39:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:05.683984 | orchestrator | 2026-01-06 03:39:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:08.732563 | orchestrator | 2026-01-06 03:39:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:08.734686 | orchestrator | 2026-01-06 03:39:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:39:08.734840 | orchestrator | 2026-01-06 03:39:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:11.779114 | orchestrator | 2026-01-06 03:39:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:11.780696 | orchestrator | 2026-01-06 03:39:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:11.780894 | orchestrator | 2026-01-06 03:39:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:14.832383 | orchestrator | 2026-01-06 03:39:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:14.834776 | orchestrator | 2026-01-06 03:39:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:14.834885 | orchestrator | 2026-01-06 03:39:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:17.884987 | orchestrator | 2026-01-06 03:39:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:17.886448 | orchestrator | 2026-01-06 03:39:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:17.886536 | orchestrator | 2026-01-06 03:39:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:20.933544 | orchestrator | 2026-01-06 03:39:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:20.934722 | orchestrator | 2026-01-06 03:39:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:20.934759 | orchestrator | 2026-01-06 03:39:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:23.982716 | orchestrator | 2026-01-06 03:39:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:23.984045 | orchestrator | 2026-01-06 03:39:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:23.984209 | orchestrator | 2026-01-06 03:39:23 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:39:27.039069 | orchestrator | 2026-01-06 03:39:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:27.040594 | orchestrator | 2026-01-06 03:39:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:27.040646 | orchestrator | 2026-01-06 03:39:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:30.082319 | orchestrator | 2026-01-06 03:39:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:30.084023 | orchestrator | 2026-01-06 03:39:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:30.084063 | orchestrator | 2026-01-06 03:39:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:33.131636 | orchestrator | 2026-01-06 03:39:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:33.132738 | orchestrator | 2026-01-06 03:39:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:33.132793 | orchestrator | 2026-01-06 03:39:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:36.177215 | orchestrator | 2026-01-06 03:39:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:36.179049 | orchestrator | 2026-01-06 03:39:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:36.179193 | orchestrator | 2026-01-06 03:39:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:39.222779 | orchestrator | 2026-01-06 03:39:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:39.222992 | orchestrator | 2026-01-06 03:39:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:39.223148 | orchestrator | 2026-01-06 03:39:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:42.270549 | orchestrator | 2026-01-06 
03:39:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:42.272194 | orchestrator | 2026-01-06 03:39:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:42.272323 | orchestrator | 2026-01-06 03:39:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:45.319586 | orchestrator | 2026-01-06 03:39:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:45.322468 | orchestrator | 2026-01-06 03:39:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:45.322806 | orchestrator | 2026-01-06 03:39:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:48.364696 | orchestrator | 2026-01-06 03:39:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:48.367178 | orchestrator | 2026-01-06 03:39:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:48.367237 | orchestrator | 2026-01-06 03:39:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:51.411276 | orchestrator | 2026-01-06 03:39:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:51.412963 | orchestrator | 2026-01-06 03:39:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:51.413088 | orchestrator | 2026-01-06 03:39:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:54.462454 | orchestrator | 2026-01-06 03:39:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:39:54.463857 | orchestrator | 2026-01-06 03:39:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:54.463989 | orchestrator | 2026-01-06 03:39:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:39:57.515167 | orchestrator | 2026-01-06 03:39:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:39:57.516745 | orchestrator | 2026-01-06 03:39:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:39:57.516818 | orchestrator | 2026-01-06 03:39:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:00.564671 | orchestrator | 2026-01-06 03:40:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:00.566370 | orchestrator | 2026-01-06 03:40:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:00.566577 | orchestrator | 2026-01-06 03:40:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:03.620669 | orchestrator | 2026-01-06 03:40:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:03.622636 | orchestrator | 2026-01-06 03:40:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:03.623322 | orchestrator | 2026-01-06 03:40:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:06.670375 | orchestrator | 2026-01-06 03:40:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:06.674780 | orchestrator | 2026-01-06 03:40:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:06.674866 | orchestrator | 2026-01-06 03:40:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:09.720379 | orchestrator | 2026-01-06 03:40:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:09.721797 | orchestrator | 2026-01-06 03:40:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:09.721825 | orchestrator | 2026-01-06 03:40:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:12.767513 | orchestrator | 2026-01-06 03:40:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:12.769608 | orchestrator | 2026-01-06 03:40:12 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:12.769639 | orchestrator | 2026-01-06 03:40:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:15.816038 | orchestrator | 2026-01-06 03:40:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:15.817692 | orchestrator | 2026-01-06 03:40:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:15.818098 | orchestrator | 2026-01-06 03:40:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:18.864220 | orchestrator | 2026-01-06 03:40:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:18.866130 | orchestrator | 2026-01-06 03:40:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:18.866190 | orchestrator | 2026-01-06 03:40:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:21.916041 | orchestrator | 2026-01-06 03:40:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:21.918373 | orchestrator | 2026-01-06 03:40:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:21.918411 | orchestrator | 2026-01-06 03:40:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:24.960660 | orchestrator | 2026-01-06 03:40:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:24.962826 | orchestrator | 2026-01-06 03:40:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:24.962870 | orchestrator | 2026-01-06 03:40:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:28.006999 | orchestrator | 2026-01-06 03:40:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:28.008993 | orchestrator | 2026-01-06 03:40:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:40:28.009071 | orchestrator | 2026-01-06 03:40:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:31.054205 | orchestrator | 2026-01-06 03:40:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:31.055934 | orchestrator | 2026-01-06 03:40:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:31.055965 | orchestrator | 2026-01-06 03:40:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:34.100594 | orchestrator | 2026-01-06 03:40:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:34.102139 | orchestrator | 2026-01-06 03:40:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:34.102185 | orchestrator | 2026-01-06 03:40:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:37.152702 | orchestrator | 2026-01-06 03:40:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:37.153960 | orchestrator | 2026-01-06 03:40:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:37.154000 | orchestrator | 2026-01-06 03:40:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:40.198560 | orchestrator | 2026-01-06 03:40:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:40.199946 | orchestrator | 2026-01-06 03:40:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:40.199972 | orchestrator | 2026-01-06 03:40:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:43.255364 | orchestrator | 2026-01-06 03:40:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:43.259845 | orchestrator | 2026-01-06 03:40:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:43.259944 | orchestrator | 2026-01-06 03:40:43 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:40:46.299289 | orchestrator | 2026-01-06 03:40:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:46.299989 | orchestrator | 2026-01-06 03:40:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:46.300047 | orchestrator | 2026-01-06 03:40:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:49.345144 | orchestrator | 2026-01-06 03:40:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:49.346714 | orchestrator | 2026-01-06 03:40:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:49.346856 | orchestrator | 2026-01-06 03:40:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:52.389872 | orchestrator | 2026-01-06 03:40:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:52.390391 | orchestrator | 2026-01-06 03:40:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:52.390428 | orchestrator | 2026-01-06 03:40:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:55.437833 | orchestrator | 2026-01-06 03:40:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:55.439337 | orchestrator | 2026-01-06 03:40:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:55.439489 | orchestrator | 2026-01-06 03:40:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:40:58.489327 | orchestrator | 2026-01-06 03:40:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:40:58.491547 | orchestrator | 2026-01-06 03:40:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:40:58.491603 | orchestrator | 2026-01-06 03:40:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:01.544529 | orchestrator | 2026-01-06 
03:41:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:01.546419 | orchestrator | 2026-01-06 03:41:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:01.546640 | orchestrator | 2026-01-06 03:41:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:04.596255 | orchestrator | 2026-01-06 03:41:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:04.598263 | orchestrator | 2026-01-06 03:41:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:04.598309 | orchestrator | 2026-01-06 03:41:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:07.649064 | orchestrator | 2026-01-06 03:41:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:07.651111 | orchestrator | 2026-01-06 03:41:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:07.651178 | orchestrator | 2026-01-06 03:41:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:10.701757 | orchestrator | 2026-01-06 03:41:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:10.703509 | orchestrator | 2026-01-06 03:41:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:10.703570 | orchestrator | 2026-01-06 03:41:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:13.748240 | orchestrator | 2026-01-06 03:41:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:13.750383 | orchestrator | 2026-01-06 03:41:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:13.750516 | orchestrator | 2026-01-06 03:41:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:16.798634 | orchestrator | 2026-01-06 03:41:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:41:16.801032 | orchestrator | 2026-01-06 03:41:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:16.801185 | orchestrator | 2026-01-06 03:41:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:19.850349 | orchestrator | 2026-01-06 03:41:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:19.851817 | orchestrator | 2026-01-06 03:41:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:19.851998 | orchestrator | 2026-01-06 03:41:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:22.902736 | orchestrator | 2026-01-06 03:41:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:22.904233 | orchestrator | 2026-01-06 03:41:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:22.904282 | orchestrator | 2026-01-06 03:41:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:25.949220 | orchestrator | 2026-01-06 03:41:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:25.950473 | orchestrator | 2026-01-06 03:41:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:25.950600 | orchestrator | 2026-01-06 03:41:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:28.995587 | orchestrator | 2026-01-06 03:41:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:28.998291 | orchestrator | 2026-01-06 03:41:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:28.998520 | orchestrator | 2026-01-06 03:41:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:32.047532 | orchestrator | 2026-01-06 03:41:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:32.049210 | orchestrator | 2026-01-06 03:41:32 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:32.049232 | orchestrator | 2026-01-06 03:41:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:35.091820 | orchestrator | 2026-01-06 03:41:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:35.093503 | orchestrator | 2026-01-06 03:41:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:35.093599 | orchestrator | 2026-01-06 03:41:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:38.138005 | orchestrator | 2026-01-06 03:41:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:38.139483 | orchestrator | 2026-01-06 03:41:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:38.140163 | orchestrator | 2026-01-06 03:41:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:41.182631 | orchestrator | 2026-01-06 03:41:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:41.184483 | orchestrator | 2026-01-06 03:41:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:41.184519 | orchestrator | 2026-01-06 03:41:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:44.231510 | orchestrator | 2026-01-06 03:41:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:44.233613 | orchestrator | 2026-01-06 03:41:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:41:44.233666 | orchestrator | 2026-01-06 03:41:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:41:47.275352 | orchestrator | 2026-01-06 03:41:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:41:47.276820 | orchestrator | 2026-01-06 03:41:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:41:47.276874 | orchestrator | 2026-01-06 03:41:47 | INFO  | Wait 1 second(s) until the next check
2026-01-06 03:41:50.319812 | orchestrator | 2026-01-06 03:41:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 03:41:50.321929 | orchestrator | 2026-01-06 03:41:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 03:41:50.322155 | orchestrator | 2026-01-06 03:41:50 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds from 03:41:53 to 03:46:43; both tasks remained in state STARTED throughout ...]
2026-01-06 03:46:46.320364 | orchestrator | 2026-01-06 03:46:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 03:46:46.321982 | orchestrator | 2026-01-06 03:46:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 03:46:46.322085 | orchestrator | 2026-01-06 03:46:46 | INFO  | Wait 1 second(s) until the next check
2026-01-06 03:46:49.365247 | orchestrator | 2026-01-06 03:46:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 03:46:49.366383 | orchestrator | 2026-01-06 03:46:49 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:46:49.366429 | orchestrator | 2026-01-06 03:46:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:46:52.414613 | orchestrator | 2026-01-06 03:46:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:46:52.415678 | orchestrator | 2026-01-06 03:46:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:46:52.415768 | orchestrator | 2026-01-06 03:46:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:46:55.457980 | orchestrator | 2026-01-06 03:46:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:46:55.460070 | orchestrator | 2026-01-06 03:46:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:46:55.460093 | orchestrator | 2026-01-06 03:46:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:46:58.505819 | orchestrator | 2026-01-06 03:46:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:46:58.507978 | orchestrator | 2026-01-06 03:46:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:46:58.508192 | orchestrator | 2026-01-06 03:46:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:01.562284 | orchestrator | 2026-01-06 03:47:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:01.564011 | orchestrator | 2026-01-06 03:47:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:01.564030 | orchestrator | 2026-01-06 03:47:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:04.607095 | orchestrator | 2026-01-06 03:47:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:04.608522 | orchestrator | 2026-01-06 03:47:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:47:04.608573 | orchestrator | 2026-01-06 03:47:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:07.656699 | orchestrator | 2026-01-06 03:47:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:07.657865 | orchestrator | 2026-01-06 03:47:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:07.657955 | orchestrator | 2026-01-06 03:47:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:10.703699 | orchestrator | 2026-01-06 03:47:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:10.705763 | orchestrator | 2026-01-06 03:47:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:10.705812 | orchestrator | 2026-01-06 03:47:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:13.751269 | orchestrator | 2026-01-06 03:47:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:13.753326 | orchestrator | 2026-01-06 03:47:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:13.753396 | orchestrator | 2026-01-06 03:47:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:16.798188 | orchestrator | 2026-01-06 03:47:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:16.800481 | orchestrator | 2026-01-06 03:47:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:16.800533 | orchestrator | 2026-01-06 03:47:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:19.845786 | orchestrator | 2026-01-06 03:47:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:19.847548 | orchestrator | 2026-01-06 03:47:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:19.847692 | orchestrator | 2026-01-06 03:47:19 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:47:22.893961 | orchestrator | 2026-01-06 03:47:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:22.896126 | orchestrator | 2026-01-06 03:47:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:22.896253 | orchestrator | 2026-01-06 03:47:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:25.944805 | orchestrator | 2026-01-06 03:47:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:25.946728 | orchestrator | 2026-01-06 03:47:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:25.946803 | orchestrator | 2026-01-06 03:47:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:28.994823 | orchestrator | 2026-01-06 03:47:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:28.996504 | orchestrator | 2026-01-06 03:47:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:28.996558 | orchestrator | 2026-01-06 03:47:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:32.043406 | orchestrator | 2026-01-06 03:47:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:32.046758 | orchestrator | 2026-01-06 03:47:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:32.046830 | orchestrator | 2026-01-06 03:47:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:35.085504 | orchestrator | 2026-01-06 03:47:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:35.086943 | orchestrator | 2026-01-06 03:47:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:35.087001 | orchestrator | 2026-01-06 03:47:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:38.133472 | orchestrator | 2026-01-06 
03:47:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:38.134764 | orchestrator | 2026-01-06 03:47:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:38.134860 | orchestrator | 2026-01-06 03:47:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:41.180495 | orchestrator | 2026-01-06 03:47:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:41.181780 | orchestrator | 2026-01-06 03:47:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:41.181818 | orchestrator | 2026-01-06 03:47:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:44.225755 | orchestrator | 2026-01-06 03:47:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:44.227591 | orchestrator | 2026-01-06 03:47:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:44.228255 | orchestrator | 2026-01-06 03:47:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:47.272095 | orchestrator | 2026-01-06 03:47:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:47.274458 | orchestrator | 2026-01-06 03:47:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:47.274834 | orchestrator | 2026-01-06 03:47:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:50.314246 | orchestrator | 2026-01-06 03:47:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:50.315414 | orchestrator | 2026-01-06 03:47:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:50.315539 | orchestrator | 2026-01-06 03:47:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:53.362581 | orchestrator | 2026-01-06 03:47:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:47:53.363913 | orchestrator | 2026-01-06 03:47:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:53.363992 | orchestrator | 2026-01-06 03:47:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:56.410274 | orchestrator | 2026-01-06 03:47:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:56.411352 | orchestrator | 2026-01-06 03:47:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:56.411388 | orchestrator | 2026-01-06 03:47:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:47:59.463670 | orchestrator | 2026-01-06 03:47:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:47:59.465823 | orchestrator | 2026-01-06 03:47:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:47:59.465865 | orchestrator | 2026-01-06 03:47:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:02.513364 | orchestrator | 2026-01-06 03:48:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:02.515802 | orchestrator | 2026-01-06 03:48:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:02.515849 | orchestrator | 2026-01-06 03:48:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:05.560163 | orchestrator | 2026-01-06 03:48:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:05.561365 | orchestrator | 2026-01-06 03:48:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:05.561493 | orchestrator | 2026-01-06 03:48:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:08.611034 | orchestrator | 2026-01-06 03:48:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:08.611932 | orchestrator | 2026-01-06 03:48:08 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:08.612163 | orchestrator | 2026-01-06 03:48:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:11.660234 | orchestrator | 2026-01-06 03:48:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:11.662471 | orchestrator | 2026-01-06 03:48:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:11.662514 | orchestrator | 2026-01-06 03:48:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:14.711689 | orchestrator | 2026-01-06 03:48:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:14.714482 | orchestrator | 2026-01-06 03:48:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:14.714534 | orchestrator | 2026-01-06 03:48:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:17.767794 | orchestrator | 2026-01-06 03:48:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:17.770336 | orchestrator | 2026-01-06 03:48:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:17.770536 | orchestrator | 2026-01-06 03:48:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:20.819345 | orchestrator | 2026-01-06 03:48:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:20.820715 | orchestrator | 2026-01-06 03:48:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:20.821116 | orchestrator | 2026-01-06 03:48:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:23.876860 | orchestrator | 2026-01-06 03:48:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:23.878085 | orchestrator | 2026-01-06 03:48:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:48:23.878120 | orchestrator | 2026-01-06 03:48:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:26.934599 | orchestrator | 2026-01-06 03:48:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:26.936088 | orchestrator | 2026-01-06 03:48:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:26.936164 | orchestrator | 2026-01-06 03:48:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:29.992142 | orchestrator | 2026-01-06 03:48:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:29.993133 | orchestrator | 2026-01-06 03:48:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:29.993174 | orchestrator | 2026-01-06 03:48:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:33.048346 | orchestrator | 2026-01-06 03:48:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:33.050646 | orchestrator | 2026-01-06 03:48:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:33.050773 | orchestrator | 2026-01-06 03:48:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:36.099506 | orchestrator | 2026-01-06 03:48:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:36.101278 | orchestrator | 2026-01-06 03:48:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:36.101322 | orchestrator | 2026-01-06 03:48:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:39.152938 | orchestrator | 2026-01-06 03:48:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:39.154725 | orchestrator | 2026-01-06 03:48:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:39.154826 | orchestrator | 2026-01-06 03:48:39 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:48:42.204922 | orchestrator | 2026-01-06 03:48:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:42.207665 | orchestrator | 2026-01-06 03:48:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:42.207725 | orchestrator | 2026-01-06 03:48:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:45.254355 | orchestrator | 2026-01-06 03:48:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:45.255191 | orchestrator | 2026-01-06 03:48:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:45.255281 | orchestrator | 2026-01-06 03:48:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:48.315231 | orchestrator | 2026-01-06 03:48:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:48.316961 | orchestrator | 2026-01-06 03:48:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:48.317138 | orchestrator | 2026-01-06 03:48:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:51.369351 | orchestrator | 2026-01-06 03:48:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:51.371094 | orchestrator | 2026-01-06 03:48:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:51.371326 | orchestrator | 2026-01-06 03:48:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:54.424734 | orchestrator | 2026-01-06 03:48:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:54.425654 | orchestrator | 2026-01-06 03:48:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:54.425810 | orchestrator | 2026-01-06 03:48:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:48:57.475780 | orchestrator | 2026-01-06 
03:48:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:48:57.478693 | orchestrator | 2026-01-06 03:48:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:48:57.478755 | orchestrator | 2026-01-06 03:48:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:00.526634 | orchestrator | 2026-01-06 03:49:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:00.529239 | orchestrator | 2026-01-06 03:49:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:00.529360 | orchestrator | 2026-01-06 03:49:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:03.583774 | orchestrator | 2026-01-06 03:49:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:03.586880 | orchestrator | 2026-01-06 03:49:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:03.586965 | orchestrator | 2026-01-06 03:49:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:06.633723 | orchestrator | 2026-01-06 03:49:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:06.635251 | orchestrator | 2026-01-06 03:49:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:06.635285 | orchestrator | 2026-01-06 03:49:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:09.686833 | orchestrator | 2026-01-06 03:49:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:09.688568 | orchestrator | 2026-01-06 03:49:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:09.688644 | orchestrator | 2026-01-06 03:49:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:12.738519 | orchestrator | 2026-01-06 03:49:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:49:12.740165 | orchestrator | 2026-01-06 03:49:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:12.740214 | orchestrator | 2026-01-06 03:49:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:15.790766 | orchestrator | 2026-01-06 03:49:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:15.792439 | orchestrator | 2026-01-06 03:49:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:15.792472 | orchestrator | 2026-01-06 03:49:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:18.842968 | orchestrator | 2026-01-06 03:49:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:18.845189 | orchestrator | 2026-01-06 03:49:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:18.845299 | orchestrator | 2026-01-06 03:49:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:21.895137 | orchestrator | 2026-01-06 03:49:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:21.896518 | orchestrator | 2026-01-06 03:49:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:21.896606 | orchestrator | 2026-01-06 03:49:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:24.946139 | orchestrator | 2026-01-06 03:49:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:24.947439 | orchestrator | 2026-01-06 03:49:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:24.947581 | orchestrator | 2026-01-06 03:49:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:27.994808 | orchestrator | 2026-01-06 03:49:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:27.996491 | orchestrator | 2026-01-06 03:49:27 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:27.996538 | orchestrator | 2026-01-06 03:49:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:31.052603 | orchestrator | 2026-01-06 03:49:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:31.054539 | orchestrator | 2026-01-06 03:49:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:31.054623 | orchestrator | 2026-01-06 03:49:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:34.105917 | orchestrator | 2026-01-06 03:49:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:34.107158 | orchestrator | 2026-01-06 03:49:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:34.107200 | orchestrator | 2026-01-06 03:49:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:37.163950 | orchestrator | 2026-01-06 03:49:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:37.165161 | orchestrator | 2026-01-06 03:49:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:37.165276 | orchestrator | 2026-01-06 03:49:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:40.212063 | orchestrator | 2026-01-06 03:49:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:40.214693 | orchestrator | 2026-01-06 03:49:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:40.216388 | orchestrator | 2026-01-06 03:49:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:43.267730 | orchestrator | 2026-01-06 03:49:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:43.272650 | orchestrator | 2026-01-06 03:49:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:49:43.272716 | orchestrator | 2026-01-06 03:49:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:46.317677 | orchestrator | 2026-01-06 03:49:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:46.319369 | orchestrator | 2026-01-06 03:49:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:46.319484 | orchestrator | 2026-01-06 03:49:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:49.371657 | orchestrator | 2026-01-06 03:49:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:49.373205 | orchestrator | 2026-01-06 03:49:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:49.373270 | orchestrator | 2026-01-06 03:49:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:52.423565 | orchestrator | 2026-01-06 03:49:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:52.425437 | orchestrator | 2026-01-06 03:49:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:52.425567 | orchestrator | 2026-01-06 03:49:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:55.471422 | orchestrator | 2026-01-06 03:49:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:55.472809 | orchestrator | 2026-01-06 03:49:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:55.473020 | orchestrator | 2026-01-06 03:49:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:49:58.518900 | orchestrator | 2026-01-06 03:49:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:49:58.520715 | orchestrator | 2026-01-06 03:49:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:49:58.520757 | orchestrator | 2026-01-06 03:49:58 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:50:01.568694 | orchestrator | 2026-01-06 03:50:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:50:01.569534 | orchestrator | 2026-01-06 03:50:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:50:01.569566 | orchestrator | 2026-01-06 03:50:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:50:04.615395 | orchestrator | 2026-01-06 03:50:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:50:04.616803 | orchestrator | 2026-01-06 03:50:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:50:04.616831 | orchestrator | 2026-01-06 03:50:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:50:07.663770 | orchestrator | 2026-01-06 03:50:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:50:07.665582 | orchestrator | 2026-01-06 03:50:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:50:07.665627 | orchestrator | 2026-01-06 03:50:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:50:10.714774 | orchestrator | 2026-01-06 03:50:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:50:10.716769 | orchestrator | 2026-01-06 03:50:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:50:10.716811 | orchestrator | 2026-01-06 03:50:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:50:13.762197 | orchestrator | 2026-01-06 03:50:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:50:13.764678 | orchestrator | 2026-01-06 03:50:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:50:13.765392 | orchestrator | 2026-01-06 03:50:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:50:16.817170 | orchestrator | 2026-01-06 
03:50:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:50:16.819234 | orchestrator | 2026-01-06 03:50:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:50:16.819353 | orchestrator | 2026-01-06 03:50:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:50:19.873652 | orchestrator | 2026-01-06 03:50:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:50:19.875032 | orchestrator | 2026-01-06 03:50:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:50:19.875075 | orchestrator | 2026-01-06 03:50:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:50:22.928607 | orchestrator | 2026-01-06 03:50:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:50:22.929746 | orchestrator | 2026-01-06 03:50:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:50:22.929783 | orchestrator | 2026-01-06 03:50:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:50:25.977132 | orchestrator | 2026-01-06 03:50:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:50:25.979609 | orchestrator | 2026-01-06 03:50:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:50:25.979647 | orchestrator | 2026-01-06 03:50:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:50:29.021777 | orchestrator | 2026-01-06 03:50:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:50:29.022723 | orchestrator | 2026-01-06 03:50:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:50:29.022755 | orchestrator | 2026-01-06 03:50:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:50:32.069010 | orchestrator | 2026-01-06 03:50:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:50:32.071635 | orchestrator | 2026-01-06 03:50:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:50:32.071816 | orchestrator | 2026-01-06 03:50:32 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated every ~3 seconds from 2026-01-06 03:50:35 to 03:56:01: tasks de94e18a-e5f1-4cb3-816d-70e021913dd3 and 7b21748d-c57a-4b51-995c-2c6f5f088b85 remain in state STARTED, followed each time by "Wait 1 second(s) until the next check" ...]
2026-01-06 03:56:04.648886 | orchestrator | 2026-01-06 03:56:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:04.650945 | orchestrator | 2026-01-06 03:56:04 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:04.651056 | orchestrator | 2026-01-06 03:56:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:07.697316 | orchestrator | 2026-01-06 03:56:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:07.699068 | orchestrator | 2026-01-06 03:56:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:07.699110 | orchestrator | 2026-01-06 03:56:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:10.745678 | orchestrator | 2026-01-06 03:56:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:10.746753 | orchestrator | 2026-01-06 03:56:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:10.746845 | orchestrator | 2026-01-06 03:56:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:13.794208 | orchestrator | 2026-01-06 03:56:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:13.794731 | orchestrator | 2026-01-06 03:56:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:13.795082 | orchestrator | 2026-01-06 03:56:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:16.837883 | orchestrator | 2026-01-06 03:56:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:16.838737 | orchestrator | 2026-01-06 03:56:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:16.839250 | orchestrator | 2026-01-06 03:56:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:19.890553 | orchestrator | 2026-01-06 03:56:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:19.892708 | orchestrator | 2026-01-06 03:56:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:56:19.892741 | orchestrator | 2026-01-06 03:56:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:22.943736 | orchestrator | 2026-01-06 03:56:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:22.944939 | orchestrator | 2026-01-06 03:56:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:22.945085 | orchestrator | 2026-01-06 03:56:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:25.995191 | orchestrator | 2026-01-06 03:56:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:25.999090 | orchestrator | 2026-01-06 03:56:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:25.999158 | orchestrator | 2026-01-06 03:56:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:29.059547 | orchestrator | 2026-01-06 03:56:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:29.062556 | orchestrator | 2026-01-06 03:56:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:29.063774 | orchestrator | 2026-01-06 03:56:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:32.115032 | orchestrator | 2026-01-06 03:56:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:32.117399 | orchestrator | 2026-01-06 03:56:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:32.117440 | orchestrator | 2026-01-06 03:56:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:35.166316 | orchestrator | 2026-01-06 03:56:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:35.167479 | orchestrator | 2026-01-06 03:56:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:35.167525 | orchestrator | 2026-01-06 03:56:35 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:56:38.214325 | orchestrator | 2026-01-06 03:56:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:38.217408 | orchestrator | 2026-01-06 03:56:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:38.217457 | orchestrator | 2026-01-06 03:56:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:41.264275 | orchestrator | 2026-01-06 03:56:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:41.265764 | orchestrator | 2026-01-06 03:56:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:41.265925 | orchestrator | 2026-01-06 03:56:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:44.313653 | orchestrator | 2026-01-06 03:56:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:44.315226 | orchestrator | 2026-01-06 03:56:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:44.315279 | orchestrator | 2026-01-06 03:56:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:47.367266 | orchestrator | 2026-01-06 03:56:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:47.368544 | orchestrator | 2026-01-06 03:56:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:47.368585 | orchestrator | 2026-01-06 03:56:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:50.415069 | orchestrator | 2026-01-06 03:56:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:50.417076 | orchestrator | 2026-01-06 03:56:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:50.417097 | orchestrator | 2026-01-06 03:56:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:53.465946 | orchestrator | 2026-01-06 
03:56:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:53.467609 | orchestrator | 2026-01-06 03:56:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:53.467714 | orchestrator | 2026-01-06 03:56:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:56.518287 | orchestrator | 2026-01-06 03:56:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:56.520471 | orchestrator | 2026-01-06 03:56:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:56.520636 | orchestrator | 2026-01-06 03:56:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:56:59.567612 | orchestrator | 2026-01-06 03:56:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:56:59.569483 | orchestrator | 2026-01-06 03:56:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:56:59.569659 | orchestrator | 2026-01-06 03:56:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:02.618325 | orchestrator | 2026-01-06 03:57:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:02.620445 | orchestrator | 2026-01-06 03:57:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:02.620485 | orchestrator | 2026-01-06 03:57:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:05.662002 | orchestrator | 2026-01-06 03:57:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:05.664886 | orchestrator | 2026-01-06 03:57:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:05.665007 | orchestrator | 2026-01-06 03:57:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:08.710296 | orchestrator | 2026-01-06 03:57:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:57:08.712134 | orchestrator | 2026-01-06 03:57:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:08.712180 | orchestrator | 2026-01-06 03:57:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:11.758661 | orchestrator | 2026-01-06 03:57:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:11.759895 | orchestrator | 2026-01-06 03:57:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:11.760016 | orchestrator | 2026-01-06 03:57:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:14.809704 | orchestrator | 2026-01-06 03:57:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:14.811941 | orchestrator | 2026-01-06 03:57:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:14.812415 | orchestrator | 2026-01-06 03:57:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:17.858809 | orchestrator | 2026-01-06 03:57:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:17.859600 | orchestrator | 2026-01-06 03:57:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:17.859657 | orchestrator | 2026-01-06 03:57:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:20.914158 | orchestrator | 2026-01-06 03:57:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:20.916183 | orchestrator | 2026-01-06 03:57:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:20.916221 | orchestrator | 2026-01-06 03:57:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:23.966437 | orchestrator | 2026-01-06 03:57:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:23.967137 | orchestrator | 2026-01-06 03:57:23 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:23.967178 | orchestrator | 2026-01-06 03:57:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:27.026183 | orchestrator | 2026-01-06 03:57:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:27.027813 | orchestrator | 2026-01-06 03:57:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:27.027972 | orchestrator | 2026-01-06 03:57:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:30.072415 | orchestrator | 2026-01-06 03:57:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:30.073762 | orchestrator | 2026-01-06 03:57:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:30.073806 | orchestrator | 2026-01-06 03:57:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:33.123688 | orchestrator | 2026-01-06 03:57:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:33.124125 | orchestrator | 2026-01-06 03:57:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:33.124161 | orchestrator | 2026-01-06 03:57:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:36.167203 | orchestrator | 2026-01-06 03:57:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:36.168785 | orchestrator | 2026-01-06 03:57:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:36.169398 | orchestrator | 2026-01-06 03:57:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:39.212866 | orchestrator | 2026-01-06 03:57:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:39.213990 | orchestrator | 2026-01-06 03:57:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:57:39.214077 | orchestrator | 2026-01-06 03:57:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:42.263213 | orchestrator | 2026-01-06 03:57:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:42.265344 | orchestrator | 2026-01-06 03:57:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:42.265632 | orchestrator | 2026-01-06 03:57:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:45.308909 | orchestrator | 2026-01-06 03:57:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:45.310212 | orchestrator | 2026-01-06 03:57:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:45.310256 | orchestrator | 2026-01-06 03:57:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:48.362432 | orchestrator | 2026-01-06 03:57:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:48.365097 | orchestrator | 2026-01-06 03:57:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:48.365146 | orchestrator | 2026-01-06 03:57:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:51.418373 | orchestrator | 2026-01-06 03:57:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:51.421484 | orchestrator | 2026-01-06 03:57:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:51.421687 | orchestrator | 2026-01-06 03:57:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:57:54.462876 | orchestrator | 2026-01-06 03:57:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:54.463915 | orchestrator | 2026-01-06 03:57:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:54.463975 | orchestrator | 2026-01-06 03:57:54 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:57:57.511090 | orchestrator | 2026-01-06 03:57:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:57:57.512988 | orchestrator | 2026-01-06 03:57:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:57:57.513054 | orchestrator | 2026-01-06 03:57:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:00.561628 | orchestrator | 2026-01-06 03:58:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:00.563016 | orchestrator | 2026-01-06 03:58:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:00.563063 | orchestrator | 2026-01-06 03:58:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:03.614383 | orchestrator | 2026-01-06 03:58:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:03.615319 | orchestrator | 2026-01-06 03:58:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:03.615438 | orchestrator | 2026-01-06 03:58:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:06.662837 | orchestrator | 2026-01-06 03:58:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:06.664249 | orchestrator | 2026-01-06 03:58:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:06.664393 | orchestrator | 2026-01-06 03:58:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:09.713203 | orchestrator | 2026-01-06 03:58:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:09.715204 | orchestrator | 2026-01-06 03:58:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:09.715295 | orchestrator | 2026-01-06 03:58:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:12.768585 | orchestrator | 2026-01-06 
03:58:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:12.770293 | orchestrator | 2026-01-06 03:58:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:12.770662 | orchestrator | 2026-01-06 03:58:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:15.822499 | orchestrator | 2026-01-06 03:58:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:15.823733 | orchestrator | 2026-01-06 03:58:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:15.823777 | orchestrator | 2026-01-06 03:58:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:18.878464 | orchestrator | 2026-01-06 03:58:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:18.879880 | orchestrator | 2026-01-06 03:58:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:18.879928 | orchestrator | 2026-01-06 03:58:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:21.930756 | orchestrator | 2026-01-06 03:58:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:21.932309 | orchestrator | 2026-01-06 03:58:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:21.932351 | orchestrator | 2026-01-06 03:58:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:24.983133 | orchestrator | 2026-01-06 03:58:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:24.984515 | orchestrator | 2026-01-06 03:58:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:24.984604 | orchestrator | 2026-01-06 03:58:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:28.033104 | orchestrator | 2026-01-06 03:58:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:58:28.034835 | orchestrator | 2026-01-06 03:58:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:28.034871 | orchestrator | 2026-01-06 03:58:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:31.083634 | orchestrator | 2026-01-06 03:58:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:31.087281 | orchestrator | 2026-01-06 03:58:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:31.088075 | orchestrator | 2026-01-06 03:58:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:34.142922 | orchestrator | 2026-01-06 03:58:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:34.144924 | orchestrator | 2026-01-06 03:58:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:34.144970 | orchestrator | 2026-01-06 03:58:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:37.193673 | orchestrator | 2026-01-06 03:58:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:37.195367 | orchestrator | 2026-01-06 03:58:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:37.195445 | orchestrator | 2026-01-06 03:58:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:40.240108 | orchestrator | 2026-01-06 03:58:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:40.241073 | orchestrator | 2026-01-06 03:58:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:40.241169 | orchestrator | 2026-01-06 03:58:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:43.291028 | orchestrator | 2026-01-06 03:58:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:43.292858 | orchestrator | 2026-01-06 03:58:43 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:43.293040 | orchestrator | 2026-01-06 03:58:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:46.340825 | orchestrator | 2026-01-06 03:58:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:46.342239 | orchestrator | 2026-01-06 03:58:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:46.342280 | orchestrator | 2026-01-06 03:58:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:49.389723 | orchestrator | 2026-01-06 03:58:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:49.391538 | orchestrator | 2026-01-06 03:58:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:49.391603 | orchestrator | 2026-01-06 03:58:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:52.437657 | orchestrator | 2026-01-06 03:58:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:52.439771 | orchestrator | 2026-01-06 03:58:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:52.439806 | orchestrator | 2026-01-06 03:58:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:55.488943 | orchestrator | 2026-01-06 03:58:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:55.490475 | orchestrator | 2026-01-06 03:58:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:58:55.490535 | orchestrator | 2026-01-06 03:58:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:58:58.536675 | orchestrator | 2026-01-06 03:58:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:58:58.538922 | orchestrator | 2026-01-06 03:58:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
03:58:58.539010 | orchestrator | 2026-01-06 03:58:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:01.585188 | orchestrator | 2026-01-06 03:59:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:01.587180 | orchestrator | 2026-01-06 03:59:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:01.587255 | orchestrator | 2026-01-06 03:59:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:04.631963 | orchestrator | 2026-01-06 03:59:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:04.634162 | orchestrator | 2026-01-06 03:59:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:04.634338 | orchestrator | 2026-01-06 03:59:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:07.682658 | orchestrator | 2026-01-06 03:59:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:07.683056 | orchestrator | 2026-01-06 03:59:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:07.683090 | orchestrator | 2026-01-06 03:59:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:10.731343 | orchestrator | 2026-01-06 03:59:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:10.732984 | orchestrator | 2026-01-06 03:59:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:10.733036 | orchestrator | 2026-01-06 03:59:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:13.777961 | orchestrator | 2026-01-06 03:59:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:13.781286 | orchestrator | 2026-01-06 03:59:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:13.781338 | orchestrator | 2026-01-06 03:59:13 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 03:59:16.832365 | orchestrator | 2026-01-06 03:59:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:16.834085 | orchestrator | 2026-01-06 03:59:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:16.834192 | orchestrator | 2026-01-06 03:59:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:19.881251 | orchestrator | 2026-01-06 03:59:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:19.884097 | orchestrator | 2026-01-06 03:59:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:19.884173 | orchestrator | 2026-01-06 03:59:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:22.933818 | orchestrator | 2026-01-06 03:59:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:22.934921 | orchestrator | 2026-01-06 03:59:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:22.934974 | orchestrator | 2026-01-06 03:59:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:25.984512 | orchestrator | 2026-01-06 03:59:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:25.987490 | orchestrator | 2026-01-06 03:59:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:25.987619 | orchestrator | 2026-01-06 03:59:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:29.032119 | orchestrator | 2026-01-06 03:59:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:29.032693 | orchestrator | 2026-01-06 03:59:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:29.032713 | orchestrator | 2026-01-06 03:59:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:32.084979 | orchestrator | 2026-01-06 
03:59:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:32.088107 | orchestrator | 2026-01-06 03:59:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:32.088216 | orchestrator | 2026-01-06 03:59:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:35.132000 | orchestrator | 2026-01-06 03:59:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:35.133388 | orchestrator | 2026-01-06 03:59:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:35.133526 | orchestrator | 2026-01-06 03:59:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:38.180798 | orchestrator | 2026-01-06 03:59:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:38.182006 | orchestrator | 2026-01-06 03:59:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:38.182137 | orchestrator | 2026-01-06 03:59:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:41.230809 | orchestrator | 2026-01-06 03:59:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:41.232077 | orchestrator | 2026-01-06 03:59:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:41.232112 | orchestrator | 2026-01-06 03:59:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:44.278312 | orchestrator | 2026-01-06 03:59:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 03:59:44.280267 | orchestrator | 2026-01-06 03:59:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:44.280309 | orchestrator | 2026-01-06 03:59:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 03:59:47.328854 | orchestrator | 2026-01-06 03:59:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 03:59:47.330096 | orchestrator | 2026-01-06 03:59:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 03:59:47.330310 | orchestrator | 2026-01-06 03:59:47 | INFO  | Wait 1 second(s) until the next check
[... polling output repeated every ~3 seconds from 03:59:50 through 04:05:01: tasks de94e18a-e5f1-4cb3-816d-70e021913dd3 and 7b21748d-c57a-4b51-995c-2c6f5f088b85 remained in state STARTED, followed each time by "Wait 1 second(s) until the next check" ...]
2026-01-06 04:05:04.563237 | orchestrator | 2026-01-06 04:05:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state
STARTED 2026-01-06 04:05:04.564454 | orchestrator | 2026-01-06 04:05:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:04.564470 | orchestrator | 2026-01-06 04:05:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:07.612236 | orchestrator | 2026-01-06 04:05:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:07.615706 | orchestrator | 2026-01-06 04:05:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:07.615770 | orchestrator | 2026-01-06 04:05:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:10.666499 | orchestrator | 2026-01-06 04:05:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:10.667611 | orchestrator | 2026-01-06 04:05:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:10.667713 | orchestrator | 2026-01-06 04:05:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:13.718165 | orchestrator | 2026-01-06 04:05:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:13.719510 | orchestrator | 2026-01-06 04:05:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:13.719543 | orchestrator | 2026-01-06 04:05:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:16.764730 | orchestrator | 2026-01-06 04:05:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:16.765822 | orchestrator | 2026-01-06 04:05:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:16.765860 | orchestrator | 2026-01-06 04:05:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:19.814316 | orchestrator | 2026-01-06 04:05:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:19.815854 | orchestrator | 2026-01-06 04:05:19 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:19.816270 | orchestrator | 2026-01-06 04:05:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:22.868236 | orchestrator | 2026-01-06 04:05:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:22.869558 | orchestrator | 2026-01-06 04:05:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:22.869598 | orchestrator | 2026-01-06 04:05:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:25.919306 | orchestrator | 2026-01-06 04:05:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:25.920938 | orchestrator | 2026-01-06 04:05:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:25.921109 | orchestrator | 2026-01-06 04:05:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:28.968814 | orchestrator | 2026-01-06 04:05:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:28.971097 | orchestrator | 2026-01-06 04:05:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:28.971137 | orchestrator | 2026-01-06 04:05:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:32.018484 | orchestrator | 2026-01-06 04:05:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:32.020265 | orchestrator | 2026-01-06 04:05:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:32.020311 | orchestrator | 2026-01-06 04:05:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:35.067320 | orchestrator | 2026-01-06 04:05:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:35.068744 | orchestrator | 2026-01-06 04:05:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
04:05:35.069052 | orchestrator | 2026-01-06 04:05:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:38.116718 | orchestrator | 2026-01-06 04:05:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:38.118739 | orchestrator | 2026-01-06 04:05:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:38.118824 | orchestrator | 2026-01-06 04:05:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:41.164116 | orchestrator | 2026-01-06 04:05:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:41.166702 | orchestrator | 2026-01-06 04:05:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:41.166783 | orchestrator | 2026-01-06 04:05:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:44.212826 | orchestrator | 2026-01-06 04:05:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:44.215217 | orchestrator | 2026-01-06 04:05:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:44.215404 | orchestrator | 2026-01-06 04:05:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:47.262845 | orchestrator | 2026-01-06 04:05:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:47.263763 | orchestrator | 2026-01-06 04:05:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:47.263795 | orchestrator | 2026-01-06 04:05:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:50.308086 | orchestrator | 2026-01-06 04:05:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:50.309685 | orchestrator | 2026-01-06 04:05:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:50.309864 | orchestrator | 2026-01-06 04:05:50 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 04:05:53.356422 | orchestrator | 2026-01-06 04:05:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:53.356640 | orchestrator | 2026-01-06 04:05:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:53.356668 | orchestrator | 2026-01-06 04:05:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:56.403891 | orchestrator | 2026-01-06 04:05:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:56.405409 | orchestrator | 2026-01-06 04:05:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:56.405472 | orchestrator | 2026-01-06 04:05:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:05:59.452330 | orchestrator | 2026-01-06 04:05:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:05:59.454850 | orchestrator | 2026-01-06 04:05:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:05:59.454899 | orchestrator | 2026-01-06 04:05:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:02.497286 | orchestrator | 2026-01-06 04:06:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:02.499098 | orchestrator | 2026-01-06 04:06:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:02.499207 | orchestrator | 2026-01-06 04:06:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:05.542809 | orchestrator | 2026-01-06 04:06:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:05.544671 | orchestrator | 2026-01-06 04:06:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:05.544768 | orchestrator | 2026-01-06 04:06:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:08.596485 | orchestrator | 2026-01-06 
04:06:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:08.599238 | orchestrator | 2026-01-06 04:06:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:08.599294 | orchestrator | 2026-01-06 04:06:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:11.651173 | orchestrator | 2026-01-06 04:06:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:11.652221 | orchestrator | 2026-01-06 04:06:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:11.652312 | orchestrator | 2026-01-06 04:06:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:14.703307 | orchestrator | 2026-01-06 04:06:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:14.707172 | orchestrator | 2026-01-06 04:06:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:14.707212 | orchestrator | 2026-01-06 04:06:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:17.760906 | orchestrator | 2026-01-06 04:06:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:17.761714 | orchestrator | 2026-01-06 04:06:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:17.761780 | orchestrator | 2026-01-06 04:06:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:20.811142 | orchestrator | 2026-01-06 04:06:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:20.813467 | orchestrator | 2026-01-06 04:06:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:20.813561 | orchestrator | 2026-01-06 04:06:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:23.869466 | orchestrator | 2026-01-06 04:06:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 04:06:23.871763 | orchestrator | 2026-01-06 04:06:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:23.871802 | orchestrator | 2026-01-06 04:06:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:26.924650 | orchestrator | 2026-01-06 04:06:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:26.925923 | orchestrator | 2026-01-06 04:06:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:26.926167 | orchestrator | 2026-01-06 04:06:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:29.976545 | orchestrator | 2026-01-06 04:06:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:29.979228 | orchestrator | 2026-01-06 04:06:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:29.979423 | orchestrator | 2026-01-06 04:06:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:33.031706 | orchestrator | 2026-01-06 04:06:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:33.034247 | orchestrator | 2026-01-06 04:06:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:33.034466 | orchestrator | 2026-01-06 04:06:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:36.080281 | orchestrator | 2026-01-06 04:06:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:36.083469 | orchestrator | 2026-01-06 04:06:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:36.083534 | orchestrator | 2026-01-06 04:06:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:39.134352 | orchestrator | 2026-01-06 04:06:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:39.136152 | orchestrator | 2026-01-06 04:06:39 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:39.136186 | orchestrator | 2026-01-06 04:06:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:42.182523 | orchestrator | 2026-01-06 04:06:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:42.184235 | orchestrator | 2026-01-06 04:06:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:42.184277 | orchestrator | 2026-01-06 04:06:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:45.227842 | orchestrator | 2026-01-06 04:06:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:45.229059 | orchestrator | 2026-01-06 04:06:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:45.229178 | orchestrator | 2026-01-06 04:06:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:48.274828 | orchestrator | 2026-01-06 04:06:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:48.275264 | orchestrator | 2026-01-06 04:06:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:48.275293 | orchestrator | 2026-01-06 04:06:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:51.312119 | orchestrator | 2026-01-06 04:06:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:51.313543 | orchestrator | 2026-01-06 04:06:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:51.313933 | orchestrator | 2026-01-06 04:06:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:54.365618 | orchestrator | 2026-01-06 04:06:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:54.367537 | orchestrator | 2026-01-06 04:06:54 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
04:06:54.367572 | orchestrator | 2026-01-06 04:06:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:06:57.413787 | orchestrator | 2026-01-06 04:06:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:06:57.415366 | orchestrator | 2026-01-06 04:06:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:06:57.415447 | orchestrator | 2026-01-06 04:06:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:00.466178 | orchestrator | 2026-01-06 04:07:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:00.467497 | orchestrator | 2026-01-06 04:07:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:00.467549 | orchestrator | 2026-01-06 04:07:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:03.519398 | orchestrator | 2026-01-06 04:07:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:03.520258 | orchestrator | 2026-01-06 04:07:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:03.520298 | orchestrator | 2026-01-06 04:07:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:06.567309 | orchestrator | 2026-01-06 04:07:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:06.568805 | orchestrator | 2026-01-06 04:07:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:06.568870 | orchestrator | 2026-01-06 04:07:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:09.616495 | orchestrator | 2026-01-06 04:07:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:09.618847 | orchestrator | 2026-01-06 04:07:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:09.619009 | orchestrator | 2026-01-06 04:07:09 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 04:07:12.664103 | orchestrator | 2026-01-06 04:07:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:12.665681 | orchestrator | 2026-01-06 04:07:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:12.665881 | orchestrator | 2026-01-06 04:07:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:15.713434 | orchestrator | 2026-01-06 04:07:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:15.715774 | orchestrator | 2026-01-06 04:07:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:15.715930 | orchestrator | 2026-01-06 04:07:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:18.760219 | orchestrator | 2026-01-06 04:07:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:18.761215 | orchestrator | 2026-01-06 04:07:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:18.761335 | orchestrator | 2026-01-06 04:07:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:21.803200 | orchestrator | 2026-01-06 04:07:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:21.804652 | orchestrator | 2026-01-06 04:07:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:21.804714 | orchestrator | 2026-01-06 04:07:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:24.856656 | orchestrator | 2026-01-06 04:07:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:24.858108 | orchestrator | 2026-01-06 04:07:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:24.858248 | orchestrator | 2026-01-06 04:07:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:27.904210 | orchestrator | 2026-01-06 
04:07:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:27.906569 | orchestrator | 2026-01-06 04:07:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:27.906653 | orchestrator | 2026-01-06 04:07:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:30.960324 | orchestrator | 2026-01-06 04:07:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:30.961437 | orchestrator | 2026-01-06 04:07:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:30.961511 | orchestrator | 2026-01-06 04:07:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:34.018637 | orchestrator | 2026-01-06 04:07:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:34.019937 | orchestrator | 2026-01-06 04:07:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:34.020017 | orchestrator | 2026-01-06 04:07:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:37.071976 | orchestrator | 2026-01-06 04:07:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:37.072962 | orchestrator | 2026-01-06 04:07:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:37.073461 | orchestrator | 2026-01-06 04:07:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:40.119287 | orchestrator | 2026-01-06 04:07:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:40.121186 | orchestrator | 2026-01-06 04:07:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:40.121251 | orchestrator | 2026-01-06 04:07:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:43.169900 | orchestrator | 2026-01-06 04:07:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 04:07:43.171720 | orchestrator | 2026-01-06 04:07:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:43.171771 | orchestrator | 2026-01-06 04:07:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:46.220949 | orchestrator | 2026-01-06 04:07:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:46.221881 | orchestrator | 2026-01-06 04:07:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:46.221937 | orchestrator | 2026-01-06 04:07:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:49.270646 | orchestrator | 2026-01-06 04:07:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:49.272451 | orchestrator | 2026-01-06 04:07:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:49.272536 | orchestrator | 2026-01-06 04:07:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:52.317933 | orchestrator | 2026-01-06 04:07:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:52.319724 | orchestrator | 2026-01-06 04:07:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:52.319779 | orchestrator | 2026-01-06 04:07:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:55.361156 | orchestrator | 2026-01-06 04:07:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:55.363299 | orchestrator | 2026-01-06 04:07:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:55.363379 | orchestrator | 2026-01-06 04:07:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:07:58.412025 | orchestrator | 2026-01-06 04:07:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:07:58.414006 | orchestrator | 2026-01-06 04:07:58 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:07:58.414331 | orchestrator | 2026-01-06 04:07:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:08:01.456790 | orchestrator | 2026-01-06 04:08:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:08:01.459341 | orchestrator | 2026-01-06 04:08:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:08:01.459405 | orchestrator | 2026-01-06 04:08:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:08:04.510742 | orchestrator | 2026-01-06 04:08:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:08:04.512948 | orchestrator | 2026-01-06 04:08:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:08:04.512980 | orchestrator | 2026-01-06 04:08:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:08:07.559660 | orchestrator | 2026-01-06 04:08:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:08:07.561543 | orchestrator | 2026-01-06 04:08:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:08:07.561620 | orchestrator | 2026-01-06 04:08:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:08:10.605846 | orchestrator | 2026-01-06 04:08:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:08:10.607540 | orchestrator | 2026-01-06 04:08:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:08:10.607587 | orchestrator | 2026-01-06 04:08:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:08:13.654921 | orchestrator | 2026-01-06 04:08:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:08:13.657796 | orchestrator | 2026-01-06 04:08:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
04:08:13.657890 | orchestrator | 2026-01-06 04:08:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:08:16.707055 | orchestrator | 2026-01-06 04:08:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:08:16.709596 | orchestrator | 2026-01-06 04:08:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:08:16.709635 | orchestrator | 2026-01-06 04:08:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:08:19.755414 | orchestrator | 2026-01-06 04:08:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:08:19.757491 | orchestrator | 2026-01-06 04:08:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:08:19.757570 | orchestrator | 2026-01-06 04:08:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:08:22.809091 | orchestrator | 2026-01-06 04:08:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:08:22.810076 | orchestrator | 2026-01-06 04:08:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:08:22.810111 | orchestrator | 2026-01-06 04:08:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:08:25.859765 | orchestrator | 2026-01-06 04:08:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:08:25.861774 | orchestrator | 2026-01-06 04:08:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:08:25.861925 | orchestrator | 2026-01-06 04:08:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:08:28.912702 | orchestrator | 2026-01-06 04:08:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:08:28.913994 | orchestrator | 2026-01-06 04:08:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:08:28.914102 | orchestrator | 2026-01-06 04:08:28 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 04:08:31.963956 | orchestrator | 2026-01-06 04:08:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:08:31.965541 | orchestrator | 2026-01-06 04:08:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:08:31.965604 | orchestrator | 2026-01-06 04:08:31 | INFO  | Wait 1 second(s) until the next check
[identical polling output repeated every ~3 seconds from 04:08:35 through 04:13:43; both tasks remain in state STARTED throughout]
2026-01-06 04:13:46.079983 | orchestrator | 2026-01-06 04:13:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:13:46.080298 | orchestrator | 2026-01-06 04:13:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:13:46.080333 | orchestrator | 2026-01-06 04:13:46 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 04:13:49.123153 | orchestrator | 2026-01-06 04:13:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:13:49.126065 | orchestrator | 2026-01-06 04:13:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:13:49.126164 | orchestrator | 2026-01-06 04:13:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:13:52.177759 | orchestrator | 2026-01-06 04:13:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:13:52.181464 | orchestrator | 2026-01-06 04:13:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:13:52.181533 | orchestrator | 2026-01-06 04:13:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:13:55.231908 | orchestrator | 2026-01-06 04:13:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:13:55.233479 | orchestrator | 2026-01-06 04:13:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:13:55.233659 | orchestrator | 2026-01-06 04:13:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:13:58.279773 | orchestrator | 2026-01-06 04:13:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:13:58.280841 | orchestrator | 2026-01-06 04:13:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:13:58.280918 | orchestrator | 2026-01-06 04:13:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:01.324927 | orchestrator | 2026-01-06 04:14:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:01.326344 | orchestrator | 2026-01-06 04:14:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:01.326372 | orchestrator | 2026-01-06 04:14:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:04.374753 | orchestrator | 2026-01-06 
04:14:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:04.376828 | orchestrator | 2026-01-06 04:14:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:04.376979 | orchestrator | 2026-01-06 04:14:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:07.428086 | orchestrator | 2026-01-06 04:14:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:07.429532 | orchestrator | 2026-01-06 04:14:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:07.430165 | orchestrator | 2026-01-06 04:14:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:10.471778 | orchestrator | 2026-01-06 04:14:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:10.474195 | orchestrator | 2026-01-06 04:14:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:10.474817 | orchestrator | 2026-01-06 04:14:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:13.521509 | orchestrator | 2026-01-06 04:14:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:13.523955 | orchestrator | 2026-01-06 04:14:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:13.524084 | orchestrator | 2026-01-06 04:14:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:16.571406 | orchestrator | 2026-01-06 04:14:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:16.572991 | orchestrator | 2026-01-06 04:14:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:16.573045 | orchestrator | 2026-01-06 04:14:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:19.620959 | orchestrator | 2026-01-06 04:14:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 04:14:19.622857 | orchestrator | 2026-01-06 04:14:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:19.623008 | orchestrator | 2026-01-06 04:14:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:22.676697 | orchestrator | 2026-01-06 04:14:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:22.678817 | orchestrator | 2026-01-06 04:14:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:22.678945 | orchestrator | 2026-01-06 04:14:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:25.729968 | orchestrator | 2026-01-06 04:14:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:25.732176 | orchestrator | 2026-01-06 04:14:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:25.732661 | orchestrator | 2026-01-06 04:14:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:28.777755 | orchestrator | 2026-01-06 04:14:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:28.779694 | orchestrator | 2026-01-06 04:14:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:28.780189 | orchestrator | 2026-01-06 04:14:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:31.826333 | orchestrator | 2026-01-06 04:14:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:31.828567 | orchestrator | 2026-01-06 04:14:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:31.828738 | orchestrator | 2026-01-06 04:14:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:34.872185 | orchestrator | 2026-01-06 04:14:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:34.873499 | orchestrator | 2026-01-06 04:14:34 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:34.873538 | orchestrator | 2026-01-06 04:14:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:37.920794 | orchestrator | 2026-01-06 04:14:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:37.922667 | orchestrator | 2026-01-06 04:14:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:37.922696 | orchestrator | 2026-01-06 04:14:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:40.962003 | orchestrator | 2026-01-06 04:14:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:40.963341 | orchestrator | 2026-01-06 04:14:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:40.963382 | orchestrator | 2026-01-06 04:14:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:44.022651 | orchestrator | 2026-01-06 04:14:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:44.022751 | orchestrator | 2026-01-06 04:14:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:44.022764 | orchestrator | 2026-01-06 04:14:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:47.072376 | orchestrator | 2026-01-06 04:14:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:47.073896 | orchestrator | 2026-01-06 04:14:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:47.073935 | orchestrator | 2026-01-06 04:14:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:50.121167 | orchestrator | 2026-01-06 04:14:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:50.121819 | orchestrator | 2026-01-06 04:14:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
04:14:50.121855 | orchestrator | 2026-01-06 04:14:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:53.170387 | orchestrator | 2026-01-06 04:14:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:53.173887 | orchestrator | 2026-01-06 04:14:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:53.173951 | orchestrator | 2026-01-06 04:14:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:56.218600 | orchestrator | 2026-01-06 04:14:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:56.220336 | orchestrator | 2026-01-06 04:14:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:56.220378 | orchestrator | 2026-01-06 04:14:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:14:59.269597 | orchestrator | 2026-01-06 04:14:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:14:59.271375 | orchestrator | 2026-01-06 04:14:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:14:59.271413 | orchestrator | 2026-01-06 04:14:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:02.319208 | orchestrator | 2026-01-06 04:15:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:02.322404 | orchestrator | 2026-01-06 04:15:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:02.322464 | orchestrator | 2026-01-06 04:15:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:05.367711 | orchestrator | 2026-01-06 04:15:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:05.369245 | orchestrator | 2026-01-06 04:15:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:05.369299 | orchestrator | 2026-01-06 04:15:05 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 04:15:08.418370 | orchestrator | 2026-01-06 04:15:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:08.420336 | orchestrator | 2026-01-06 04:15:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:08.420450 | orchestrator | 2026-01-06 04:15:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:11.467265 | orchestrator | 2026-01-06 04:15:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:11.468772 | orchestrator | 2026-01-06 04:15:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:11.468815 | orchestrator | 2026-01-06 04:15:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:14.520801 | orchestrator | 2026-01-06 04:15:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:14.522734 | orchestrator | 2026-01-06 04:15:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:14.522861 | orchestrator | 2026-01-06 04:15:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:17.576331 | orchestrator | 2026-01-06 04:15:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:17.578229 | orchestrator | 2026-01-06 04:15:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:17.578266 | orchestrator | 2026-01-06 04:15:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:20.626780 | orchestrator | 2026-01-06 04:15:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:20.628948 | orchestrator | 2026-01-06 04:15:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:20.628981 | orchestrator | 2026-01-06 04:15:20 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:23.678831 | orchestrator | 2026-01-06 
04:15:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:23.681288 | orchestrator | 2026-01-06 04:15:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:23.681440 | orchestrator | 2026-01-06 04:15:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:26.732749 | orchestrator | 2026-01-06 04:15:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:26.734352 | orchestrator | 2026-01-06 04:15:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:26.734550 | orchestrator | 2026-01-06 04:15:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:29.779830 | orchestrator | 2026-01-06 04:15:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:29.781033 | orchestrator | 2026-01-06 04:15:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:29.781078 | orchestrator | 2026-01-06 04:15:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:32.831564 | orchestrator | 2026-01-06 04:15:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:32.834477 | orchestrator | 2026-01-06 04:15:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:32.834495 | orchestrator | 2026-01-06 04:15:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:35.884785 | orchestrator | 2026-01-06 04:15:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:35.885456 | orchestrator | 2026-01-06 04:15:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:35.885611 | orchestrator | 2026-01-06 04:15:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:38.935108 | orchestrator | 2026-01-06 04:15:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 04:15:38.936997 | orchestrator | 2026-01-06 04:15:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:38.937078 | orchestrator | 2026-01-06 04:15:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:41.984484 | orchestrator | 2026-01-06 04:15:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:41.985727 | orchestrator | 2026-01-06 04:15:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:41.985765 | orchestrator | 2026-01-06 04:15:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:45.044647 | orchestrator | 2026-01-06 04:15:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:45.047584 | orchestrator | 2026-01-06 04:15:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:45.047626 | orchestrator | 2026-01-06 04:15:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:48.097260 | orchestrator | 2026-01-06 04:15:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:48.099313 | orchestrator | 2026-01-06 04:15:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:48.099442 | orchestrator | 2026-01-06 04:15:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:51.141120 | orchestrator | 2026-01-06 04:15:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:51.142401 | orchestrator | 2026-01-06 04:15:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:51.142437 | orchestrator | 2026-01-06 04:15:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:54.195292 | orchestrator | 2026-01-06 04:15:54 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:54.196048 | orchestrator | 2026-01-06 04:15:54 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:54.196085 | orchestrator | 2026-01-06 04:15:54 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:15:57.239606 | orchestrator | 2026-01-06 04:15:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:15:57.240141 | orchestrator | 2026-01-06 04:15:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:15:57.240183 | orchestrator | 2026-01-06 04:15:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:00.282214 | orchestrator | 2026-01-06 04:16:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:00.283970 | orchestrator | 2026-01-06 04:16:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:00.284052 | orchestrator | 2026-01-06 04:16:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:03.330322 | orchestrator | 2026-01-06 04:16:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:03.332095 | orchestrator | 2026-01-06 04:16:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:03.332249 | orchestrator | 2026-01-06 04:16:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:06.383222 | orchestrator | 2026-01-06 04:16:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:06.384827 | orchestrator | 2026-01-06 04:16:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:06.384878 | orchestrator | 2026-01-06 04:16:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:09.437356 | orchestrator | 2026-01-06 04:16:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:09.439303 | orchestrator | 2026-01-06 04:16:09 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
04:16:09.439347 | orchestrator | 2026-01-06 04:16:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:12.485519 | orchestrator | 2026-01-06 04:16:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:12.487880 | orchestrator | 2026-01-06 04:16:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:12.488636 | orchestrator | 2026-01-06 04:16:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:15.540621 | orchestrator | 2026-01-06 04:16:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:15.542093 | orchestrator | 2026-01-06 04:16:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:15.542328 | orchestrator | 2026-01-06 04:16:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:18.588191 | orchestrator | 2026-01-06 04:16:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:18.589346 | orchestrator | 2026-01-06 04:16:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:18.590210 | orchestrator | 2026-01-06 04:16:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:21.639743 | orchestrator | 2026-01-06 04:16:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:21.640586 | orchestrator | 2026-01-06 04:16:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:21.640988 | orchestrator | 2026-01-06 04:16:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:24.694977 | orchestrator | 2026-01-06 04:16:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:24.696474 | orchestrator | 2026-01-06 04:16:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:24.696513 | orchestrator | 2026-01-06 04:16:24 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 04:16:27.740803 | orchestrator | 2026-01-06 04:16:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:27.742736 | orchestrator | 2026-01-06 04:16:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:27.742792 | orchestrator | 2026-01-06 04:16:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:30.791465 | orchestrator | 2026-01-06 04:16:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:30.793014 | orchestrator | 2026-01-06 04:16:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:30.793192 | orchestrator | 2026-01-06 04:16:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:33.840791 | orchestrator | 2026-01-06 04:16:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:33.842935 | orchestrator | 2026-01-06 04:16:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:33.843116 | orchestrator | 2026-01-06 04:16:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:36.892640 | orchestrator | 2026-01-06 04:16:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:36.894315 | orchestrator | 2026-01-06 04:16:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:36.894359 | orchestrator | 2026-01-06 04:16:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:39.947791 | orchestrator | 2026-01-06 04:16:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:39.949432 | orchestrator | 2026-01-06 04:16:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:39.949473 | orchestrator | 2026-01-06 04:16:39 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:42.999368 | orchestrator | 2026-01-06 
04:16:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:43.001884 | orchestrator | 2026-01-06 04:16:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:43.002160 | orchestrator | 2026-01-06 04:16:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:46.055260 | orchestrator | 2026-01-06 04:16:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:46.057478 | orchestrator | 2026-01-06 04:16:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:46.057528 | orchestrator | 2026-01-06 04:16:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:49.105550 | orchestrator | 2026-01-06 04:16:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:49.108164 | orchestrator | 2026-01-06 04:16:49 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:49.108969 | orchestrator | 2026-01-06 04:16:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:52.169490 | orchestrator | 2026-01-06 04:16:52 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:52.170748 | orchestrator | 2026-01-06 04:16:52 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:52.170940 | orchestrator | 2026-01-06 04:16:52 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:55.216764 | orchestrator | 2026-01-06 04:16:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:16:55.218237 | orchestrator | 2026-01-06 04:16:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:55.218323 | orchestrator | 2026-01-06 04:16:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:16:58.265795 | orchestrator | 2026-01-06 04:16:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 04:16:58.267395 | orchestrator | 2026-01-06 04:16:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:16:58.267437 | orchestrator | 2026-01-06 04:16:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:17:01.311293 | orchestrator | 2026-01-06 04:17:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:17:01.313244 | orchestrator | 2026-01-06 04:17:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:17:01.313303 | orchestrator | 2026-01-06 04:17:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:17:04.360969 | orchestrator | 2026-01-06 04:17:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:17:04.362381 | orchestrator | 2026-01-06 04:17:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:17:04.362681 | orchestrator | 2026-01-06 04:17:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:17:07.415160 | orchestrator | 2026-01-06 04:17:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:17:07.417473 | orchestrator | 2026-01-06 04:17:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:17:07.417511 | orchestrator | 2026-01-06 04:17:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:17:10.468486 | orchestrator | 2026-01-06 04:17:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:17:10.470492 | orchestrator | 2026-01-06 04:17:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:17:10.470541 | orchestrator | 2026-01-06 04:17:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:17:13.514648 | orchestrator | 2026-01-06 04:17:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:17:13.516138 | orchestrator | 2026-01-06 04:17:13 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:17:13.516216 | orchestrator | 2026-01-06 04:17:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:17:16.563325 | orchestrator | 2026-01-06 04:17:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:17:16.564952 | orchestrator | 2026-01-06 04:17:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:17:16.564979 | orchestrator | 2026-01-06 04:17:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:17:19.605390 | orchestrator | 2026-01-06 04:17:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:17:19.606703 | orchestrator | 2026-01-06 04:17:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:17:19.606810 | orchestrator | 2026-01-06 04:17:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:17:22.655987 | orchestrator | 2026-01-06 04:17:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:17:22.657978 | orchestrator | 2026-01-06 04:17:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:17:22.658099 | orchestrator | 2026-01-06 04:17:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:17:25.712166 | orchestrator | 2026-01-06 04:17:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:17:25.713934 | orchestrator | 2026-01-06 04:17:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:17:25.714207 | orchestrator | 2026-01-06 04:17:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:17:28.760379 | orchestrator | 2026-01-06 04:17:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:17:28.761957 | orchestrator | 2026-01-06 04:17:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
2026-01-06 04:17:28.762174 | orchestrator | 2026-01-06 04:17:28 | INFO  | Wait 1 second(s) until the next check
2026-01-06 04:17:31.814397 | orchestrator | 2026-01-06 04:17:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 04:17:31.816037 | orchestrator | 2026-01-06 04:17:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 04:17:31.816094 | orchestrator | 2026-01-06 04:17:31 | INFO  | Wait 1 second(s) until the next check
[identical status checks repeated every ~3 seconds; both tasks remained in state STARTED through 04:23:01]
2026-01-06 04:23:01.140610 | orchestrator | 2026-01-06 04:23:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 04:23:01.142398 | orchestrator | 2026-01-06 04:23:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 04:23:01.142436 | orchestrator | 2026-01-06 04:23:01 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 04:23:04.191575 | orchestrator | 2026-01-06 04:23:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:04.192791 | orchestrator | 2026-01-06 04:23:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:04.192853 | orchestrator | 2026-01-06 04:23:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:07.238731 | orchestrator | 2026-01-06 04:23:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:07.242428 | orchestrator | 2026-01-06 04:23:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:07.242496 | orchestrator | 2026-01-06 04:23:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:10.296051 | orchestrator | 2026-01-06 04:23:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:10.298302 | orchestrator | 2026-01-06 04:23:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:10.298387 | orchestrator | 2026-01-06 04:23:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:13.350782 | orchestrator | 2026-01-06 04:23:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:13.351918 | orchestrator | 2026-01-06 04:23:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:13.351948 | orchestrator | 2026-01-06 04:23:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:16.400711 | orchestrator | 2026-01-06 04:23:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:16.402493 | orchestrator | 2026-01-06 04:23:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:16.402573 | orchestrator | 2026-01-06 04:23:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:19.452300 | orchestrator | 2026-01-06 
04:23:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:19.454190 | orchestrator | 2026-01-06 04:23:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:19.454221 | orchestrator | 2026-01-06 04:23:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:22.498136 | orchestrator | 2026-01-06 04:23:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:22.499461 | orchestrator | 2026-01-06 04:23:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:22.499526 | orchestrator | 2026-01-06 04:23:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:25.550729 | orchestrator | 2026-01-06 04:23:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:25.552593 | orchestrator | 2026-01-06 04:23:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:25.552706 | orchestrator | 2026-01-06 04:23:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:28.602283 | orchestrator | 2026-01-06 04:23:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:28.603396 | orchestrator | 2026-01-06 04:23:28 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:28.603427 | orchestrator | 2026-01-06 04:23:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:31.666292 | orchestrator | 2026-01-06 04:23:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:31.667839 | orchestrator | 2026-01-06 04:23:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:31.667870 | orchestrator | 2026-01-06 04:23:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:34.713864 | orchestrator | 2026-01-06 04:23:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 04:23:34.716331 | orchestrator | 2026-01-06 04:23:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:34.716380 | orchestrator | 2026-01-06 04:23:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:37.764411 | orchestrator | 2026-01-06 04:23:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:37.765326 | orchestrator | 2026-01-06 04:23:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:37.765467 | orchestrator | 2026-01-06 04:23:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:40.813955 | orchestrator | 2026-01-06 04:23:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:40.815951 | orchestrator | 2026-01-06 04:23:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:40.815999 | orchestrator | 2026-01-06 04:23:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:43.865421 | orchestrator | 2026-01-06 04:23:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:43.866687 | orchestrator | 2026-01-06 04:23:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:43.866733 | orchestrator | 2026-01-06 04:23:43 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:46.916890 | orchestrator | 2026-01-06 04:23:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:46.918444 | orchestrator | 2026-01-06 04:23:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:46.918514 | orchestrator | 2026-01-06 04:23:46 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:49.961995 | orchestrator | 2026-01-06 04:23:49 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:49.963269 | orchestrator | 2026-01-06 04:23:49 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:49.963310 | orchestrator | 2026-01-06 04:23:49 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:53.010580 | orchestrator | 2026-01-06 04:23:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:53.012919 | orchestrator | 2026-01-06 04:23:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:53.013346 | orchestrator | 2026-01-06 04:23:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:56.058806 | orchestrator | 2026-01-06 04:23:56 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:56.059010 | orchestrator | 2026-01-06 04:23:56 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:56.059062 | orchestrator | 2026-01-06 04:23:56 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:23:59.102861 | orchestrator | 2026-01-06 04:23:59 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:23:59.104683 | orchestrator | 2026-01-06 04:23:59 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:23:59.104788 | orchestrator | 2026-01-06 04:23:59 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:02.156109 | orchestrator | 2026-01-06 04:24:02 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:02.157655 | orchestrator | 2026-01-06 04:24:02 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:02.157716 | orchestrator | 2026-01-06 04:24:02 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:05.203449 | orchestrator | 2026-01-06 04:24:05 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:05.204126 | orchestrator | 2026-01-06 04:24:05 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
04:24:05.204159 | orchestrator | 2026-01-06 04:24:05 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:08.251712 | orchestrator | 2026-01-06 04:24:08 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:08.252918 | orchestrator | 2026-01-06 04:24:08 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:08.252966 | orchestrator | 2026-01-06 04:24:08 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:11.304294 | orchestrator | 2026-01-06 04:24:11 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:11.305586 | orchestrator | 2026-01-06 04:24:11 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:11.305632 | orchestrator | 2026-01-06 04:24:11 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:14.353106 | orchestrator | 2026-01-06 04:24:14 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:14.355023 | orchestrator | 2026-01-06 04:24:14 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:14.355130 | orchestrator | 2026-01-06 04:24:14 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:17.399985 | orchestrator | 2026-01-06 04:24:17 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:17.401804 | orchestrator | 2026-01-06 04:24:17 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:17.401953 | orchestrator | 2026-01-06 04:24:17 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:20.443919 | orchestrator | 2026-01-06 04:24:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:20.444688 | orchestrator | 2026-01-06 04:24:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:20.444716 | orchestrator | 2026-01-06 04:24:20 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 04:24:23.492300 | orchestrator | 2026-01-06 04:24:23 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:23.492525 | orchestrator | 2026-01-06 04:24:23 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:23.492555 | orchestrator | 2026-01-06 04:24:23 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:26.543032 | orchestrator | 2026-01-06 04:24:26 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:26.545461 | orchestrator | 2026-01-06 04:24:26 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:26.545743 | orchestrator | 2026-01-06 04:24:26 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:29.593634 | orchestrator | 2026-01-06 04:24:29 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:29.595290 | orchestrator | 2026-01-06 04:24:29 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:29.595402 | orchestrator | 2026-01-06 04:24:29 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:32.643168 | orchestrator | 2026-01-06 04:24:32 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:32.644484 | orchestrator | 2026-01-06 04:24:32 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:32.644580 | orchestrator | 2026-01-06 04:24:32 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:35.693740 | orchestrator | 2026-01-06 04:24:35 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:35.695935 | orchestrator | 2026-01-06 04:24:35 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:35.695994 | orchestrator | 2026-01-06 04:24:35 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:38.742362 | orchestrator | 2026-01-06 
04:24:38 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:38.745200 | orchestrator | 2026-01-06 04:24:38 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:38.745264 | orchestrator | 2026-01-06 04:24:38 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:41.796531 | orchestrator | 2026-01-06 04:24:41 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:41.798384 | orchestrator | 2026-01-06 04:24:41 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:41.798448 | orchestrator | 2026-01-06 04:24:41 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:44.844254 | orchestrator | 2026-01-06 04:24:44 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:44.845199 | orchestrator | 2026-01-06 04:24:44 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:44.845685 | orchestrator | 2026-01-06 04:24:44 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:47.895505 | orchestrator | 2026-01-06 04:24:47 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:47.896586 | orchestrator | 2026-01-06 04:24:47 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:47.896626 | orchestrator | 2026-01-06 04:24:47 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:50.943367 | orchestrator | 2026-01-06 04:24:50 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:50.946646 | orchestrator | 2026-01-06 04:24:50 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:50.946717 | orchestrator | 2026-01-06 04:24:50 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:53.997236 | orchestrator | 2026-01-06 04:24:53 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 04:24:53.999079 | orchestrator | 2026-01-06 04:24:53 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:53.999151 | orchestrator | 2026-01-06 04:24:53 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:24:57.051909 | orchestrator | 2026-01-06 04:24:57 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:24:57.054244 | orchestrator | 2026-01-06 04:24:57 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:24:57.054329 | orchestrator | 2026-01-06 04:24:57 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:00.101304 | orchestrator | 2026-01-06 04:25:00 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:00.103327 | orchestrator | 2026-01-06 04:25:00 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:00.103375 | orchestrator | 2026-01-06 04:25:00 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:03.151388 | orchestrator | 2026-01-06 04:25:03 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:03.153704 | orchestrator | 2026-01-06 04:25:03 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:03.153813 | orchestrator | 2026-01-06 04:25:03 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:06.197674 | orchestrator | 2026-01-06 04:25:06 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:06.199738 | orchestrator | 2026-01-06 04:25:06 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:06.200051 | orchestrator | 2026-01-06 04:25:06 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:09.240374 | orchestrator | 2026-01-06 04:25:09 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:09.242584 | orchestrator | 2026-01-06 04:25:09 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:09.242692 | orchestrator | 2026-01-06 04:25:09 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:12.287488 | orchestrator | 2026-01-06 04:25:12 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:12.289442 | orchestrator | 2026-01-06 04:25:12 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:12.289516 | orchestrator | 2026-01-06 04:25:12 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:15.334366 | orchestrator | 2026-01-06 04:25:15 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:15.336520 | orchestrator | 2026-01-06 04:25:15 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:15.336646 | orchestrator | 2026-01-06 04:25:15 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:18.382333 | orchestrator | 2026-01-06 04:25:18 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:18.383667 | orchestrator | 2026-01-06 04:25:18 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:18.383755 | orchestrator | 2026-01-06 04:25:18 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:21.430522 | orchestrator | 2026-01-06 04:25:21 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:21.431336 | orchestrator | 2026-01-06 04:25:21 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:21.431478 | orchestrator | 2026-01-06 04:25:21 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:24.480690 | orchestrator | 2026-01-06 04:25:24 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:24.481720 | orchestrator | 2026-01-06 04:25:24 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
04:25:24.481764 | orchestrator | 2026-01-06 04:25:24 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:27.525790 | orchestrator | 2026-01-06 04:25:27 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:27.528384 | orchestrator | 2026-01-06 04:25:27 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:27.528444 | orchestrator | 2026-01-06 04:25:27 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:30.577245 | orchestrator | 2026-01-06 04:25:30 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:30.577871 | orchestrator | 2026-01-06 04:25:30 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:30.577903 | orchestrator | 2026-01-06 04:25:30 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:33.625287 | orchestrator | 2026-01-06 04:25:33 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:33.628901 | orchestrator | 2026-01-06 04:25:33 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:33.628981 | orchestrator | 2026-01-06 04:25:33 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:36.679651 | orchestrator | 2026-01-06 04:25:36 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:36.682222 | orchestrator | 2026-01-06 04:25:36 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:36.682264 | orchestrator | 2026-01-06 04:25:36 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:39.733732 | orchestrator | 2026-01-06 04:25:39 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:39.735615 | orchestrator | 2026-01-06 04:25:39 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:39.735639 | orchestrator | 2026-01-06 04:25:39 | INFO  | Wait 1 second(s) 
until the next check 2026-01-06 04:25:42.786316 | orchestrator | 2026-01-06 04:25:42 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:42.789528 | orchestrator | 2026-01-06 04:25:42 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:42.789595 | orchestrator | 2026-01-06 04:25:42 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:45.843417 | orchestrator | 2026-01-06 04:25:45 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:45.846963 | orchestrator | 2026-01-06 04:25:45 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:45.847038 | orchestrator | 2026-01-06 04:25:45 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:48.899031 | orchestrator | 2026-01-06 04:25:48 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:48.901548 | orchestrator | 2026-01-06 04:25:48 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:48.901670 | orchestrator | 2026-01-06 04:25:48 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:51.956146 | orchestrator | 2026-01-06 04:25:51 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:51.959271 | orchestrator | 2026-01-06 04:25:51 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:51.959369 | orchestrator | 2026-01-06 04:25:51 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:55.005805 | orchestrator | 2026-01-06 04:25:55 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:55.009049 | orchestrator | 2026-01-06 04:25:55 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:55.009163 | orchestrator | 2026-01-06 04:25:55 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:25:58.058579 | orchestrator | 2026-01-06 
04:25:58 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:25:58.060837 | orchestrator | 2026-01-06 04:25:58 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:25:58.061190 | orchestrator | 2026-01-06 04:25:58 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:01.100315 | orchestrator | 2026-01-06 04:26:01 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:01.102883 | orchestrator | 2026-01-06 04:26:01 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:01.102934 | orchestrator | 2026-01-06 04:26:01 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:04.149071 | orchestrator | 2026-01-06 04:26:04 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:04.151001 | orchestrator | 2026-01-06 04:26:04 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:04.151138 | orchestrator | 2026-01-06 04:26:04 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:07.199017 | orchestrator | 2026-01-06 04:26:07 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:07.200630 | orchestrator | 2026-01-06 04:26:07 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:07.200911 | orchestrator | 2026-01-06 04:26:07 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:10.245857 | orchestrator | 2026-01-06 04:26:10 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:10.247735 | orchestrator | 2026-01-06 04:26:10 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:10.247814 | orchestrator | 2026-01-06 04:26:10 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:13.295595 | orchestrator | 2026-01-06 04:26:13 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state 
STARTED 2026-01-06 04:26:13.298139 | orchestrator | 2026-01-06 04:26:13 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:13.298208 | orchestrator | 2026-01-06 04:26:13 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:16.340176 | orchestrator | 2026-01-06 04:26:16 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:16.341701 | orchestrator | 2026-01-06 04:26:16 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:16.341749 | orchestrator | 2026-01-06 04:26:16 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:19.392236 | orchestrator | 2026-01-06 04:26:19 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:19.394100 | orchestrator | 2026-01-06 04:26:19 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:19.394360 | orchestrator | 2026-01-06 04:26:19 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:22.435907 | orchestrator | 2026-01-06 04:26:22 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:22.437473 | orchestrator | 2026-01-06 04:26:22 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:22.437609 | orchestrator | 2026-01-06 04:26:22 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:25.484694 | orchestrator | 2026-01-06 04:26:25 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:25.486234 | orchestrator | 2026-01-06 04:26:25 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:25.486324 | orchestrator | 2026-01-06 04:26:25 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:28.535657 | orchestrator | 2026-01-06 04:26:28 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:28.537910 | orchestrator | 2026-01-06 04:26:28 | INFO  
| Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:28.537947 | orchestrator | 2026-01-06 04:26:28 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:31.581398 | orchestrator | 2026-01-06 04:26:31 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:31.583496 | orchestrator | 2026-01-06 04:26:31 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:31.583539 | orchestrator | 2026-01-06 04:26:31 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:34.628784 | orchestrator | 2026-01-06 04:26:34 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:34.630603 | orchestrator | 2026-01-06 04:26:34 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:34.630700 | orchestrator | 2026-01-06 04:26:34 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:37.675408 | orchestrator | 2026-01-06 04:26:37 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:37.677594 | orchestrator | 2026-01-06 04:26:37 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:37.677668 | orchestrator | 2026-01-06 04:26:37 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:40.726591 | orchestrator | 2026-01-06 04:26:40 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:40.728187 | orchestrator | 2026-01-06 04:26:40 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 04:26:40.728290 | orchestrator | 2026-01-06 04:26:40 | INFO  | Wait 1 second(s) until the next check 2026-01-06 04:26:43.770981 | orchestrator | 2026-01-06 04:26:43 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED 2026-01-06 04:26:43.773559 | orchestrator | 2026-01-06 04:26:43 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED 2026-01-06 
04:26:43.773743 | orchestrator | 2026-01-06 04:26:43 | INFO  | Wait 1 second(s) until the next check
2026-01-06 04:26:46.821002 | orchestrator | 2026-01-06 04:26:46 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 04:26:46.822617 | orchestrator | 2026-01-06 04:26:46 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 04:26:46.822671 | orchestrator | 2026-01-06 04:26:46 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated every ~3 seconds from 04:26:49 through 04:30:17; both tasks remained in state STARTED for the entire interval ...]
2026-01-06 04:30:20.215809 | orchestrator | 2026-01-06 04:30:20 | INFO  | Task de94e18a-e5f1-4cb3-816d-70e021913dd3 is in state STARTED
2026-01-06 04:30:20.216875 | orchestrator | 2026-01-06 04:30:20 | INFO  | Task 7b21748d-c57a-4b51-995c-2c6f5f088b85 is in state STARTED
2026-01-06 04:30:20.216999 | orchestrator | 2026-01-06 04:30:20 | INFO  | Wait 1 second(s) until the next check
2026-01-06 04:30:21.352568 | RUN END RESULT_TIMED_OUT: [untrusted : github.com/osism/testbed/playbooks/deploy.yml@main]
2026-01-06 04:30:21.356262 | POST-RUN START: [untrusted : github.com/osism/testbed/playbooks/post.yml@main]
2026-01-06 04:30:22.157058 |
2026-01-06 04:30:22.157246 | PLAY [Post output play]
2026-01-06 04:30:22.176487 |
2026-01-06 04:30:22.176655 | LOOP [stage-output : Register sources]
2026-01-06 04:30:22.248089 |
2026-01-06 04:30:22.248455 | TASK [stage-output : Check sudo]
2026-01-06 04:30:23.187321 | orchestrator | sudo: a password is required
2026-01-06
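The stretch above is a plain poll-and-sleep loop that never saw either task leave the STARTED state, so the deploy playbook ran into the job timeout (RESULT_TIMED_OUT). A minimal sketch of that polling pattern, using hypothetical names (`fetch_task_state`, `TaskTimeout`) rather than the actual OSISM client API:

```python
import time


class TaskTimeout(Exception):
    """Raised when tasks do not finish before the deadline."""


def wait_for_tasks(task_ids, fetch_task_state, interval=1.0, timeout=3600.0,
                   clock=time.monotonic, sleep=time.sleep):
    """Poll until every task leaves PENDING/STARTED, or raise on deadline.

    fetch_task_state is an assumed callable: task id -> state string.
    clock and sleep are injectable so the loop can be tested without
    real waiting.
    """
    deadline = clock() + timeout
    while True:
        states = {tid: fetch_task_state(tid) for tid in task_ids}
        if all(s not in ("PENDING", "STARTED") for s in states.values()):
            return states
        if clock() > deadline:
            raise TaskTimeout(f"tasks still running: {states}")
        # Mirrors the "Wait 1 second(s) until the next check" messages above.
        sleep(interval)
```

Injecting `clock` and `sleep` is what makes the loop unit-testable; a real caller would keep the defaults and pick a `timeout` below the Zuul job timeout so the failure surfaces as a task error instead of RESULT_TIMED_OUT.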
04:30:23.302052 | orchestrator | ok: Runtime: 0:00:00.031904
2026-01-06 04:30:23.322138 |
2026-01-06 04:30:23.322405 | LOOP [stage-output : Set source and destination for files and folders]
2026-01-06 04:30:23.368306 |
2026-01-06 04:30:23.368592 | TASK [stage-output : Build a list of source, dest dictionaries]
2026-01-06 04:30:23.450619 | orchestrator | ok
2026-01-06 04:30:23.460489 |
2026-01-06 04:30:23.460645 | LOOP [stage-output : Ensure target folders exist]
2026-01-06 04:30:23.992285 | orchestrator | ok: "docs"
2026-01-06 04:30:23.992543 |
2026-01-06 04:30:24.250252 | orchestrator | ok: "artifacts"
2026-01-06 04:30:24.531372 | orchestrator | ok: "logs"
2026-01-06 04:30:24.554990 |
2026-01-06 04:30:24.555165 | LOOP [stage-output : Copy files and folders to staging folder]
2026-01-06 04:30:24.594166 |
2026-01-06 04:30:24.594482 | TASK [stage-output : Make all log files readable]
2026-01-06 04:30:24.905667 | orchestrator | ok
2026-01-06 04:30:24.914562 |
2026-01-06 04:30:24.914712 | TASK [stage-output : Rename log files that match extensions_to_txt]
2026-01-06 04:30:24.951987 | orchestrator | skipping: Conditional result was False
2026-01-06 04:30:24.972033 |
2026-01-06 04:30:24.972235 | TASK [stage-output : Discover log files for compression]
2026-01-06 04:30:24.999918 | orchestrator | skipping: Conditional result was False
2026-01-06 04:30:25.010496 |
2026-01-06 04:30:25.010660 | LOOP [stage-output : Archive everything from logs]
2026-01-06 04:30:25.063820 |
2026-01-06 04:30:25.064148 | PLAY [Post cleanup play]
2026-01-06 04:30:25.081018 |
2026-01-06 04:30:25.081238 | TASK [Set cloud fact (Zuul deployment)]
2026-01-06 04:30:25.150757 | orchestrator | ok
2026-01-06 04:30:25.162941 |
2026-01-06 04:30:25.163087 | TASK [Set cloud fact (local deployment)]
2026-01-06 04:30:25.198369 | orchestrator | skipping: Conditional result was False
2026-01-06 04:30:25.207009 |
2026-01-06 04:30:25.207162 | TASK [Clean the cloud environment]
2026-01-06 04:30:27.457559 | orchestrator | 2026-01-06 04:30:27 - clean up servers
2026-01-06 04:30:28.426889 | orchestrator | 2026-01-06 04:30:28 - testbed-manager
2026-01-06 04:30:28.515131 | orchestrator | 2026-01-06 04:30:28 - testbed-node-3
2026-01-06 04:30:28.606892 | orchestrator | 2026-01-06 04:30:28 - testbed-node-5
2026-01-06 04:30:28.698599 | orchestrator | 2026-01-06 04:30:28 - testbed-node-0
2026-01-06 04:30:28.798003 | orchestrator | 2026-01-06 04:30:28 - testbed-node-2
2026-01-06 04:30:28.905546 | orchestrator | 2026-01-06 04:30:28 - testbed-node-1
2026-01-06 04:30:28.997442 | orchestrator | 2026-01-06 04:30:28 - testbed-node-4
2026-01-06 04:30:29.088341 | orchestrator | 2026-01-06 04:30:29 - clean up keypairs
2026-01-06 04:30:29.110595 | orchestrator | 2026-01-06 04:30:29 - testbed
2026-01-06 04:30:29.135759 | orchestrator | 2026-01-06 04:30:29 - wait for servers to be gone
2026-01-06 04:30:38.106849 | orchestrator | 2026-01-06 04:30:38 - clean up ports
2026-01-06 04:30:38.304243 | orchestrator | 2026-01-06 04:30:38 - 01b61747-5c06-46f4-b949-ac0c8952d99d
2026-01-06 04:30:38.542717 | orchestrator | 2026-01-06 04:30:38 - 03f067b1-7721-4c19-b506-3a7648cef086
2026-01-06 04:30:38.803901 | orchestrator | 2026-01-06 04:30:38 - 5b6da3b4-63c8-417b-a706-ce0373deb712
2026-01-06 04:30:39.227569 | orchestrator | 2026-01-06 04:30:39 - 775de5b0-d0a8-4e00-a356-ce0b05a373fa
2026-01-06 04:30:39.502677 | orchestrator | 2026-01-06 04:30:39 - b6541ad3-79a3-4403-a4cd-a0fbf6c90b39
2026-01-06 04:30:39.751943 | orchestrator | 2026-01-06 04:30:39 - be9edca2-dc5a-4515-864d-9930325f2d2b
2026-01-06 04:30:39.967713 | orchestrator | 2026-01-06 04:30:39 - c312a59d-46f5-4dd5-b5ba-4c4c85200cba
2026-01-06 04:30:40.177170 | orchestrator | 2026-01-06 04:30:40 - clean up volumes
2026-01-06 04:30:40.307135 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-5-node-base
2026-01-06 04:30:40.346264 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-4-node-base
2026-01-06 04:30:40.392805 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-2-node-base
2026-01-06 04:30:40.435037 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-0-node-base
2026-01-06 04:30:40.475330 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-3-node-base
2026-01-06 04:30:40.518973 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-1-node-base
2026-01-06 04:30:40.561554 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-manager-base
2026-01-06 04:30:40.604654 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-5-node-5
2026-01-06 04:30:40.646549 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-4-node-4
2026-01-06 04:30:40.689648 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-0-node-3
2026-01-06 04:30:40.731241 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-3-node-3
2026-01-06 04:30:40.773004 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-8-node-5
2026-01-06 04:30:40.816243 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-7-node-4
2026-01-06 04:30:40.863739 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-1-node-4
2026-01-06 04:30:40.906216 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-6-node-3
2026-01-06 04:30:40.946672 | orchestrator | 2026-01-06 04:30:40 - testbed-volume-2-node-5
2026-01-06 04:30:40.993147 | orchestrator | 2026-01-06 04:30:40 - disconnect routers
2026-01-06 04:30:41.129080 | orchestrator | 2026-01-06 04:30:41 - testbed
2026-01-06 04:30:42.226001 | orchestrator | 2026-01-06 04:30:42 - clean up subnets
2026-01-06 04:30:42.273536 | orchestrator | 2026-01-06 04:30:42 - subnet-testbed-management
2026-01-06 04:30:42.459367 | orchestrator | 2026-01-06 04:30:42 - clean up networks
2026-01-06 04:30:42.676612 | orchestrator | 2026-01-06 04:30:42 - net-testbed-management
2026-01-06 04:30:42.984386 | orchestrator | 2026-01-06 04:30:42 - clean up security groups
2026-01-06 04:30:43.033496 | orchestrator | 2026-01-06 04:30:43 - testbed-management
2026-01-06 04:30:43.170881 | orchestrator | 2026-01-06 04:30:43 - testbed-node
2026-01-06
04:30:43.300460 | orchestrator | 2026-01-06 04:30:43 - clean up floating ips
2026-01-06 04:30:43.333606 | orchestrator | 2026-01-06 04:30:43 - 81.163.192.205
2026-01-06 04:30:43.769358 | orchestrator | 2026-01-06 04:30:43 - clean up routers
2026-01-06 04:30:43.880979 | orchestrator | 2026-01-06 04:30:43 - testbed
2026-01-06 04:30:45.763619 | orchestrator | ok: Runtime: 0:00:19.792620
2026-01-06 04:30:45.767354 |
2026-01-06 04:30:45.767494 | PLAY RECAP
2026-01-06 04:30:45.767600 | orchestrator | ok: 6 changed: 2 unreachable: 0 failed: 0 skipped: 7 rescued: 0 ignored: 0
2026-01-06 04:30:45.767655 |
2026-01-06 04:30:45.935188 | POST-RUN END RESULT_NORMAL: [untrusted : github.com/osism/testbed/playbooks/post.yml@main]
2026-01-06 04:30:45.936339 | POST-RUN START: [untrusted : github.com/osism/testbed/playbooks/cleanup.yml@main]
2026-01-06 04:30:46.801784 |
2026-01-06 04:30:46.801984 | PLAY [Cleanup play]
2026-01-06 04:30:46.818904 |
2026-01-06 04:30:46.819046 | TASK [Set cloud fact (Zuul deployment)]
2026-01-06 04:30:46.883734 | orchestrator | ok
2026-01-06 04:30:46.896218 |
2026-01-06 04:30:46.896639 | TASK [Set cloud fact (local deployment)]
2026-01-06 04:30:46.935371 | orchestrator | skipping: Conditional result was False
2026-01-06 04:30:46.952063 |
2026-01-06 04:30:46.952263 | TASK [Clean the cloud environment]
2026-01-06 04:30:48.174878 | orchestrator | 2026-01-06 04:30:48 - clean up servers
2026-01-06 04:30:48.784648 | orchestrator | 2026-01-06 04:30:48 - clean up keypairs
2026-01-06 04:30:48.802258 | orchestrator | 2026-01-06 04:30:48 - wait for servers to be gone
2026-01-06 04:30:48.845789 | orchestrator | 2026-01-06 04:30:48 - clean up ports
2026-01-06 04:30:48.929502 | orchestrator | 2026-01-06 04:30:48 - clean up volumes
2026-01-06 04:30:49.007349 | orchestrator | 2026-01-06 04:30:49 - disconnect routers
2026-01-06 04:30:49.042400 | orchestrator | 2026-01-06 04:30:49 - clean up subnets
2026-01-06 04:30:49.062911 | orchestrator | 2026-01-06 04:30:49 - clean up networks
2026-01-06 04:30:49.213569 | orchestrator | 2026-01-06 04:30:49 - clean up security groups
2026-01-06 04:30:49.249927 | orchestrator | 2026-01-06 04:30:49 - clean up floating ips
2026-01-06 04:30:49.274911 | orchestrator | 2026-01-06 04:30:49 - clean up routers
2026-01-06 04:30:49.493560 | orchestrator | ok: Runtime: 0:00:01.528116
2026-01-06 04:30:49.497581 |
2026-01-06 04:30:49.497753 | PLAY RECAP
2026-01-06 04:30:49.497925 | orchestrator | ok: 2 changed: 1 unreachable: 0 failed: 0 skipped: 1 rescued: 0 ignored: 0
2026-01-06 04:30:49.498001 |
2026-01-06 04:30:49.646314 | POST-RUN END RESULT_NORMAL: [untrusted : github.com/osism/testbed/playbooks/cleanup.yml@main]
2026-01-06 04:30:49.647509 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main]
2026-01-06 04:30:50.524942 |
2026-01-06 04:30:50.525137 | PLAY [Base post-fetch]
2026-01-06 04:30:50.541966 |
2026-01-06 04:30:50.542133 | TASK [fetch-output : Set log path for multiple nodes]
2026-01-06 04:30:50.597759 | orchestrator | skipping: Conditional result was False
2026-01-06 04:30:50.613521 |
2026-01-06 04:30:50.613770 | TASK [fetch-output : Set log path for single node]
2026-01-06 04:30:50.670476 | orchestrator | ok
2026-01-06 04:30:50.680545 |
2026-01-06 04:30:50.680716 | LOOP [fetch-output : Ensure local output dirs]
2026-01-06 04:30:51.213229 | orchestrator -> localhost | ok: "/var/lib/zuul/builds/41c85c8fe207460eae3f3f1c1a35cf27/work/logs"
2026-01-06 04:30:51.527187 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/41c85c8fe207460eae3f3f1c1a35cf27/work/artifacts"
2026-01-06 04:30:51.824335 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/41c85c8fe207460eae3f3f1c1a35cf27/work/docs"
2026-01-06 04:30:51.842785 |
2026-01-06 04:30:51.843059 | LOOP [fetch-output : Collect logs, artifacts and docs]
2026-01-06 04:30:52.954414 | orchestrator | changed: .d..t...... ./
2026-01-06 04:30:52.954744 | orchestrator | changed: All items complete
2026-01-06 04:30:52.954957 |
2026-01-06 04:30:53.692528 | orchestrator | changed: .d..t...... ./
2026-01-06 04:30:54.477584 | orchestrator | changed: .d..t...... ./
2026-01-06 04:30:54.505242 |
2026-01-06 04:30:54.505423 | LOOP [merge-output-to-logs : Move artifacts and docs to logs dir]
2026-01-06 04:30:54.549701 | orchestrator | skipping: Conditional result was False
2026-01-06 04:30:54.555293 | orchestrator | skipping: Conditional result was False
2026-01-06 04:30:54.579709 |
2026-01-06 04:30:54.579922 | PLAY RECAP
2026-01-06 04:30:54.580003 | orchestrator | ok: 3 changed: 2 unreachable: 0 failed: 0 skipped: 2 rescued: 0 ignored: 0
2026-01-06 04:30:54.580042 |
2026-01-06 04:30:54.719414 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main]
2026-01-06 04:30:54.721210 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main]
2026-01-06 04:30:55.547274 |
2026-01-06 04:30:55.547452 | PLAY [Base post]
2026-01-06 04:30:55.562741 |
2026-01-06 04:30:55.562993 | TASK [remove-build-sshkey : Remove the build SSH key from all nodes]
2026-01-06 04:30:56.578669 | orchestrator | changed
2026-01-06 04:30:56.589424 |
2026-01-06 04:30:56.589590 | PLAY RECAP
2026-01-06 04:30:56.589683 | orchestrator | ok: 1 changed: 1 unreachable: 0 failed: 0 skipped: 0 rescued: 0 ignored: 0
2026-01-06 04:30:56.589773 |
2026-01-06 04:30:56.733572 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main]
2026-01-06 04:30:56.734681 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-logs.yaml@main]
2026-01-06 04:30:57.615807 |
2026-01-06 04:30:57.616040 | PLAY [Base post-logs]
2026-01-06 04:30:57.627181 |
2026-01-06 04:30:57.627328 | TASK [generate-zuul-manifest : Generate Zuul manifest]
2026-01-06 04:30:58.134935 | localhost | changed
2026-01-06 04:30:58.146652 |
2026-01-06
04:30:58.146911 | TASK [generate-zuul-manifest : Return Zuul manifest URL to Zuul] 2026-01-06 04:30:58.184913 | localhost | ok 2026-01-06 04:30:58.190486 | 2026-01-06 04:30:58.190650 | TASK [Set zuul-log-path fact] 2026-01-06 04:30:58.208772 | localhost | ok 2026-01-06 04:30:58.219813 | 2026-01-06 04:30:58.219991 | TASK [set-zuul-log-path-fact : Set log path for a build] 2026-01-06 04:30:58.258439 | localhost | ok 2026-01-06 04:30:58.264835 | 2026-01-06 04:30:58.265039 | TASK [upload-logs : Create log directories] 2026-01-06 04:30:58.839232 | localhost | changed 2026-01-06 04:30:58.844288 | 2026-01-06 04:30:58.844458 | TASK [upload-logs : Ensure logs are readable before uploading] 2026-01-06 04:30:59.448011 | localhost -> localhost | ok: Runtime: 0:00:00.006936 2026-01-06 04:30:59.457290 | 2026-01-06 04:30:59.457511 | TASK [upload-logs : Upload logs to log server] 2026-01-06 04:31:00.082518 | localhost | Output suppressed because no_log was given 2026-01-06 04:31:00.085576 | 2026-01-06 04:31:00.085754 | LOOP [upload-logs : Compress console log and json output] 2026-01-06 04:31:00.137519 | localhost | skipping: Conditional result was False 2026-01-06 04:31:00.143556 | localhost | skipping: Conditional result was False 2026-01-06 04:31:00.157254 | 2026-01-06 04:31:00.157483 | LOOP [upload-logs : Upload compressed console log and json output] 2026-01-06 04:31:00.214065 | localhost | skipping: Conditional result was False 2026-01-06 04:31:00.214545 | 2026-01-06 04:31:00.218805 | localhost | skipping: Conditional result was False 2026-01-06 04:31:00.231576 | 2026-01-06 04:31:00.231947 | LOOP [upload-logs : Upload console log and json output]