Sep 4 00:59:03.908789 kernel: Linux version 6.12.44-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Wed Sep 3 22:05:39 -00 2025 Sep 4 00:59:03.908819 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=c7fa427551c105672074cbcbe7e23c997f471a6e879d708e8d6cbfad2147666e Sep 4 00:59:03.908826 kernel: BIOS-provided physical RAM map: Sep 4 00:59:03.908830 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable Sep 4 00:59:03.908834 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved Sep 4 00:59:03.908838 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved Sep 4 00:59:03.908843 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable Sep 4 00:59:03.908847 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved Sep 4 00:59:03.908851 kernel: BIOS-e820: [mem 0x0000000040400000-0x0000000081a79fff] usable Sep 4 00:59:03.908856 kernel: BIOS-e820: [mem 0x0000000081a7a000-0x0000000081a7afff] ACPI NVS Sep 4 00:59:03.908860 kernel: BIOS-e820: [mem 0x0000000081a7b000-0x0000000081a7bfff] reserved Sep 4 00:59:03.908864 kernel: BIOS-e820: [mem 0x0000000081a7c000-0x000000008afccfff] usable Sep 4 00:59:03.908868 kernel: BIOS-e820: [mem 0x000000008afcd000-0x000000008c0b1fff] reserved Sep 4 00:59:03.908872 kernel: BIOS-e820: [mem 0x000000008c0b2000-0x000000008c23afff] usable Sep 4 00:59:03.908877 kernel: BIOS-e820: [mem 0x000000008c23b000-0x000000008c66cfff] ACPI NVS Sep 4 00:59:03.908883 kernel: BIOS-e820: [mem 0x000000008c66d000-0x000000008eefefff] reserved Sep 4 00:59:03.908887 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable Sep 4 00:59:03.908892 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved Sep 4 00:59:03.908896 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 4 00:59:03.908901 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved Sep 4 00:59:03.908905 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved Sep 4 00:59:03.908910 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Sep 4 00:59:03.908914 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved Sep 4 00:59:03.908919 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable Sep 4 00:59:03.908923 kernel: NX (Execute Disable) protection: active Sep 4 00:59:03.908928 kernel: APIC: Static calls initialized Sep 4 00:59:03.908933 kernel: SMBIOS 3.2.1 present. 
Sep 4 00:59:03.908938 kernel: DMI: Supermicro X11SCM-F/X11SCM-F, BIOS 1.9 09/16/2022 Sep 4 00:59:03.908942 kernel: DMI: Memory slots populated: 2/4 Sep 4 00:59:03.908947 kernel: tsc: Detected 3400.000 MHz processor Sep 4 00:59:03.908951 kernel: tsc: Detected 3399.906 MHz TSC Sep 4 00:59:03.908956 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 4 00:59:03.908961 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 4 00:59:03.908965 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000 Sep 4 00:59:03.908970 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 23), built from 10 variable MTRRs Sep 4 00:59:03.908975 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 4 00:59:03.908980 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000 Sep 4 00:59:03.908985 kernel: Using GB pages for direct mapping Sep 4 00:59:03.908990 kernel: ACPI: Early table checksum verification disabled Sep 4 00:59:03.908995 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM) Sep 4 00:59:03.909001 kernel: ACPI: XSDT 0x000000008C54E0C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013) Sep 4 00:59:03.909006 kernel: ACPI: FACP 0x000000008C58A670 000114 (v06 01072009 AMI 00010013) Sep 4 00:59:03.909012 kernel: ACPI: DSDT 0x000000008C54E268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527) Sep 4 00:59:03.909017 kernel: ACPI: FACS 0x000000008C66CF80 000040 Sep 4 00:59:03.909022 kernel: ACPI: APIC 0x000000008C58A788 00012C (v04 01072009 AMI 00010013) Sep 4 00:59:03.909027 kernel: ACPI: FPDT 0x000000008C58A8B8 000044 (v01 01072009 AMI 00010013) Sep 4 00:59:03.909032 kernel: ACPI: FIDT 0x000000008C58A900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013) Sep 4 00:59:03.909037 kernel: ACPI: MCFG 0x000000008C58A9A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097) Sep 4 00:59:03.909041 kernel: ACPI: SPMI 0x000000008C58A9E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 
00000000) Sep 4 00:59:03.909046 kernel: ACPI: SSDT 0x000000008C58AA28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527) Sep 4 00:59:03.909052 kernel: ACPI: SSDT 0x000000008C58C548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527) Sep 4 00:59:03.909057 kernel: ACPI: SSDT 0x000000008C58F710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527) Sep 4 00:59:03.909062 kernel: ACPI: HPET 0x000000008C591A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 4 00:59:03.909067 kernel: ACPI: SSDT 0x000000008C591A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527) Sep 4 00:59:03.909072 kernel: ACPI: SSDT 0x000000008C592A28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527) Sep 4 00:59:03.909077 kernel: ACPI: UEFI 0x000000008C593320 000042 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 4 00:59:03.909081 kernel: ACPI: LPIT 0x000000008C593368 000094 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 4 00:59:03.909086 kernel: ACPI: SSDT 0x000000008C593400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527) Sep 4 00:59:03.909092 kernel: ACPI: SSDT 0x000000008C595BE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527) Sep 4 00:59:03.909097 kernel: ACPI: DBGP 0x000000008C5970C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 4 00:59:03.909102 kernel: ACPI: DBG2 0x000000008C597100 000054 (v00 SUPERM SMCI--MB 00000002 01000013) Sep 4 00:59:03.909107 kernel: ACPI: SSDT 0x000000008C597158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527) Sep 4 00:59:03.909112 kernel: ACPI: DMAR 0x000000008C598CC0 000070 (v01 INTEL EDK2 00000002 01000013) Sep 4 00:59:03.909117 kernel: ACPI: SSDT 0x000000008C598D30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527) Sep 4 00:59:03.909122 kernel: ACPI: TPM2 0x000000008C598E78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000) Sep 4 00:59:03.909127 kernel: ACPI: SSDT 0x000000008C598EB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527) Sep 4 00:59:03.909132 kernel: ACPI: WSMT 0x000000008C599C40 000028 (v01 SUPERM 01072009 AMI 00010013) Sep 4 00:59:03.909138 kernel: ACPI: EINJ 0x000000008C599C68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000) Sep 4 00:59:03.909143 kernel: ACPI: ERST 0x000000008C599D98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000) Sep 4 00:59:03.909147 kernel: ACPI: BERT 0x000000008C599FC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000) Sep 4 00:59:03.909152 kernel: ACPI: HEST 0x000000008C599FF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 
00000000) Sep 4 00:59:03.909157 kernel: ACPI: SSDT 0x000000008C59A278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221) Sep 4 00:59:03.909162 kernel: ACPI: Reserving FACP table memory at [mem 0x8c58a670-0x8c58a783] Sep 4 00:59:03.909167 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c54e268-0x8c58a66b] Sep 4 00:59:03.909172 kernel: ACPI: Reserving FACS table memory at [mem 0x8c66cf80-0x8c66cfbf] Sep 4 00:59:03.909178 kernel: ACPI: Reserving APIC table memory at [mem 0x8c58a788-0x8c58a8b3] Sep 4 00:59:03.909183 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c58a8b8-0x8c58a8fb] Sep 4 00:59:03.909187 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c58a900-0x8c58a99b] Sep 4 00:59:03.909192 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c58a9a0-0x8c58a9db] Sep 4 00:59:03.909197 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c58a9e0-0x8c58aa20] Sep 4 00:59:03.909202 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58aa28-0x8c58c543] Sep 4 00:59:03.909207 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58c548-0x8c58f70d] Sep 4 00:59:03.909212 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58f710-0x8c591a3a] Sep 4 00:59:03.909216 kernel: ACPI: Reserving HPET table memory at [mem 0x8c591a40-0x8c591a77] Sep 4 00:59:03.909221 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c591a78-0x8c592a25] Sep 4 00:59:03.909227 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c592a28-0x8c59331b] Sep 4 00:59:03.909232 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c593320-0x8c593361] Sep 4 00:59:03.909237 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c593368-0x8c5933fb] Sep 4 00:59:03.909241 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c593400-0x8c595bdd] Sep 4 00:59:03.909246 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c595be0-0x8c5970c1] Sep 4 00:59:03.909251 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c5970c8-0x8c5970fb] Sep 4 00:59:03.909256 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c597100-0x8c597153] Sep 4 00:59:03.909261 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c597158-0x8c598cbe] Sep 4 00:59:03.909266 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c598cc0-0x8c598d2f] Sep 4 00:59:03.909271 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598d30-0x8c598e73] Sep 4 00:59:03.909276 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c598e78-0x8c598eab] Sep 4 00:59:03.909281 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598eb0-0x8c599c3e] Sep 4 00:59:03.909286 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c599c40-0x8c599c67] Sep 4 00:59:03.909290 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c599c68-0x8c599d97] Sep 4 00:59:03.909295 kernel: ACPI: Reserving ERST table memory at [mem 0x8c599d98-0x8c599fc7] Sep 4 00:59:03.909300 kernel: ACPI: Reserving BERT table memory at [mem 0x8c599fc8-0x8c599ff7] Sep 4 00:59:03.909305 kernel: ACPI: Reserving HEST table memory at [mem 0x8c599ff8-0x8c59a273] Sep 4 00:59:03.909310 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c59a278-0x8c59a3d9] Sep 4 00:59:03.909316 kernel: No NUMA configuration found Sep 4 00:59:03.909321 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff] Sep 4 00:59:03.909325 kernel: NODE_DATA(0) allocated [mem 0x86eff8dc0-0x86effffff] Sep 4 00:59:03.909330 kernel: Zone ranges: Sep 4 00:59:03.909335 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 4 00:59:03.909340 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Sep 4 00:59:03.909345 kernel: Normal [mem 
0x0000000100000000-0x000000086effffff] Sep 4 00:59:03.909350 kernel: Device empty Sep 4 00:59:03.909355 kernel: Movable zone start for each node Sep 4 00:59:03.909360 kernel: Early memory node ranges Sep 4 00:59:03.909365 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff] Sep 4 00:59:03.909371 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff] Sep 4 00:59:03.909375 kernel: node 0: [mem 0x0000000040400000-0x0000000081a79fff] Sep 4 00:59:03.909380 kernel: node 0: [mem 0x0000000081a7c000-0x000000008afccfff] Sep 4 00:59:03.909385 kernel: node 0: [mem 0x000000008c0b2000-0x000000008c23afff] Sep 4 00:59:03.909393 kernel: node 0: [mem 0x000000008eeff000-0x000000008eefffff] Sep 4 00:59:03.909399 kernel: node 0: [mem 0x0000000100000000-0x000000086effffff] Sep 4 00:59:03.909404 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff] Sep 4 00:59:03.909409 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 4 00:59:03.909415 kernel: On node 0, zone DMA: 103 pages in unavailable ranges Sep 4 00:59:03.909421 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges Sep 4 00:59:03.909426 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges Sep 4 00:59:03.909431 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges Sep 4 00:59:03.909436 kernel: On node 0, zone DMA32: 11460 pages in unavailable ranges Sep 4 00:59:03.909441 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges Sep 4 00:59:03.909447 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges Sep 4 00:59:03.909452 kernel: ACPI: PM-Timer IO Port: 0x1808 Sep 4 00:59:03.909458 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Sep 4 00:59:03.909463 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Sep 4 00:59:03.909468 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Sep 4 00:59:03.909473 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Sep 4 00:59:03.909478 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Sep 4 00:59:03.909484 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Sep 4 00:59:03.909489 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Sep 4 00:59:03.909494 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Sep 4 00:59:03.909499 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Sep 4 00:59:03.909505 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Sep 4 00:59:03.909510 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Sep 4 00:59:03.909515 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Sep 4 00:59:03.909520 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Sep 4 00:59:03.909526 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Sep 4 00:59:03.909531 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Sep 4 00:59:03.909536 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Sep 4 00:59:03.909541 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119 Sep 4 00:59:03.909546 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 4 00:59:03.909552 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 4 00:59:03.909558 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 4 00:59:03.909563 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Sep 4 00:59:03.909568 kernel: TSC deadline timer available Sep 4 00:59:03.909573 kernel: CPU topo: Max. logical packages: 1 Sep 4 00:59:03.909578 kernel: CPU topo: Max. 
logical dies: 1 Sep 4 00:59:03.909584 kernel: CPU topo: Max. dies per package: 1 Sep 4 00:59:03.909589 kernel: CPU topo: Max. threads per core: 2 Sep 4 00:59:03.909594 kernel: CPU topo: Num. cores per package: 8 Sep 4 00:59:03.909599 kernel: CPU topo: Num. threads per package: 16 Sep 4 00:59:03.909605 kernel: CPU topo: Allowing 16 present CPUs plus 0 hotplug CPUs Sep 4 00:59:03.909610 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices Sep 4 00:59:03.909616 kernel: Booting paravirtualized kernel on bare hardware Sep 4 00:59:03.909621 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 4 00:59:03.909626 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Sep 4 00:59:03.909632 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Sep 4 00:59:03.909637 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Sep 4 00:59:03.909642 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Sep 4 00:59:03.909648 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=c7fa427551c105672074cbcbe7e23c997f471a6e879d708e8d6cbfad2147666e Sep 4 00:59:03.909654 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 4 00:59:03.909659 kernel: random: crng init done Sep 4 00:59:03.909664 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear) Sep 4 00:59:03.909673 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Sep 4 00:59:03.909678 kernel: Fallback order for Node 0: 0 Sep 4 00:59:03.909683 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8363245 Sep 4 00:59:03.909688 kernel: Policy zone: Normal Sep 4 00:59:03.909694 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 4 00:59:03.909700 kernel: software IO TLB: area num 16. Sep 4 00:59:03.909705 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Sep 4 00:59:03.909710 kernel: ftrace: allocating 40099 entries in 157 pages Sep 4 00:59:03.909716 kernel: ftrace: allocated 157 pages with 5 groups Sep 4 00:59:03.909721 kernel: Dynamic Preempt: voluntary Sep 4 00:59:03.909726 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 4 00:59:03.909732 kernel: rcu: RCU event tracing is enabled. Sep 4 00:59:03.909737 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Sep 4 00:59:03.909742 kernel: Trampoline variant of Tasks RCU enabled. Sep 4 00:59:03.909748 kernel: Rude variant of Tasks RCU enabled. Sep 4 00:59:03.909753 kernel: Tracing variant of Tasks RCU enabled. Sep 4 00:59:03.909759 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 4 00:59:03.909764 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Sep 4 00:59:03.909769 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 4 00:59:03.909774 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 4 00:59:03.909780 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. 
Sep 4 00:59:03.909785 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16 Sep 4 00:59:03.909790 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 4 00:59:03.909796 kernel: Console: colour VGA+ 80x25 Sep 4 00:59:03.909801 kernel: printk: legacy console [tty0] enabled Sep 4 00:59:03.909807 kernel: printk: legacy console [ttyS1] enabled Sep 4 00:59:03.909812 kernel: ACPI: Core revision 20240827 Sep 4 00:59:03.909817 kernel: hpet: HPET dysfunctional in PC10. Force disabled. Sep 4 00:59:03.909822 kernel: APIC: Switch to symmetric I/O mode setup Sep 4 00:59:03.909827 kernel: DMAR: Host address width 39 Sep 4 00:59:03.909833 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1 Sep 4 00:59:03.909838 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da Sep 4 00:59:03.909844 kernel: DMAR: RMRR base: 0x0000008cf18000 end: 0x0000008d161fff Sep 4 00:59:03.909849 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0 Sep 4 00:59:03.909854 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000 Sep 4 00:59:03.909860 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping. Sep 4 00:59:03.909865 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode Sep 4 00:59:03.909870 kernel: x2apic enabled Sep 4 00:59:03.909875 kernel: APIC: Switched APIC routing to: cluster x2apic Sep 4 00:59:03.909880 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns Sep 4 00:59:03.909886 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906) Sep 4 00:59:03.909892 kernel: CPU0: Thermal monitoring enabled (TM1) Sep 4 00:59:03.909897 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Sep 4 00:59:03.909902 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Sep 4 00:59:03.909907 kernel: process: using mwait in idle threads Sep 4 00:59:03.909912 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 4 00:59:03.909917 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Sep 4 00:59:03.909923 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Sep 4 00:59:03.909928 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Sep 4 00:59:03.909933 kernel: RETBleed: Mitigation: Enhanced IBRS Sep 4 00:59:03.909938 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 4 00:59:03.909943 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 4 00:59:03.909949 kernel: TAA: Mitigation: TSX disabled Sep 4 00:59:03.909954 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers Sep 4 00:59:03.909959 kernel: SRBDS: Mitigation: Microcode Sep 4 00:59:03.909964 kernel: GDS: Vulnerable: No microcode Sep 4 00:59:03.909969 kernel: active return thunk: its_return_thunk Sep 4 00:59:03.909974 kernel: ITS: Mitigation: Aligned branch/return thunks Sep 4 00:59:03.909979 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 4 00:59:03.909984 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 4 00:59:03.909989 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 4 00:59:03.909995 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers' Sep 4 00:59:03.910000 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR' Sep 4 00:59:03.910006 kernel: x86/fpu: xstate_offset[2]: 576, 
xstate_sizes[2]: 256 Sep 4 00:59:03.910011 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64 Sep 4 00:59:03.910016 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64 Sep 4 00:59:03.910021 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format. Sep 4 00:59:03.910026 kernel: Freeing SMP alternatives memory: 32K Sep 4 00:59:03.910031 kernel: pid_max: default: 32768 minimum: 301 Sep 4 00:59:03.910036 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 4 00:59:03.910041 kernel: landlock: Up and running. Sep 4 00:59:03.910046 kernel: SELinux: Initializing. Sep 4 00:59:03.910051 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 4 00:59:03.910057 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 4 00:59:03.910063 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Sep 4 00:59:03.910068 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver. Sep 4 00:59:03.910073 kernel: ... version: 4 Sep 4 00:59:03.910078 kernel: ... bit width: 48 Sep 4 00:59:03.910084 kernel: ... generic registers: 4 Sep 4 00:59:03.910089 kernel: ... value mask: 0000ffffffffffff Sep 4 00:59:03.910094 kernel: ... max period: 00007fffffffffff Sep 4 00:59:03.910099 kernel: ... fixed-purpose events: 3 Sep 4 00:59:03.910104 kernel: ... event mask: 000000070000000f Sep 4 00:59:03.910109 kernel: signal: max sigframe size: 2032 Sep 4 00:59:03.910116 kernel: Estimated ratio of average max frequency by base frequency (times 1024): 1445 Sep 4 00:59:03.910121 kernel: rcu: Hierarchical SRCU implementation. Sep 4 00:59:03.910126 kernel: rcu: Max phase no-delay instances is 400. Sep 4 00:59:03.910131 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level Sep 4 00:59:03.910137 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter. Sep 4 00:59:03.910142 kernel: smp: Bringing up secondary CPUs ... Sep 4 00:59:03.910147 kernel: smpboot: x86: Booting SMP configuration: Sep 4 00:59:03.910152 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10 #11 #12 #13 #14 #15 Sep 4 00:59:03.910158 kernel: Transient Scheduler Attacks: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. 
Sep 4 00:59:03.910164 kernel: smp: Brought up 1 node, 16 CPUs Sep 4 00:59:03.910169 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS) Sep 4 00:59:03.910175 kernel: Memory: 32697216K/33452980K available (14336K kernel code, 2428K rwdata, 9956K rodata, 53832K init, 1088K bss, 730464K reserved, 0K cma-reserved) Sep 4 00:59:03.910180 kernel: devtmpfs: initialized Sep 4 00:59:03.910185 kernel: x86/mm: Memory block size: 128MB Sep 4 00:59:03.910190 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x81a7a000-0x81a7afff] (4096 bytes) Sep 4 00:59:03.910196 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c23b000-0x8c66cfff] (4399104 bytes) Sep 4 00:59:03.910201 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 4 00:59:03.910207 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Sep 4 00:59:03.910212 kernel: pinctrl core: initialized pinctrl subsystem Sep 4 00:59:03.910217 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 4 00:59:03.910222 kernel: audit: initializing netlink subsys (disabled) Sep 4 00:59:03.910228 kernel: audit: type=2000 audit(1756947536.040:1): state=initialized audit_enabled=0 res=1 Sep 4 00:59:03.910233 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 4 00:59:03.910238 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 4 00:59:03.910243 kernel: cpuidle: using governor menu Sep 4 00:59:03.910248 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 4 00:59:03.910254 kernel: dca service started, version 1.12.1 Sep 4 00:59:03.910260 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Sep 4 00:59:03.910265 kernel: PCI: Using configuration type 1 for base access Sep 4 00:59:03.910270 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Sep 4 00:59:03.910275 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 4 00:59:03.910280 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 4 00:59:03.910286 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 4 00:59:03.910291 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 4 00:59:03.910296 kernel: ACPI: Added _OSI(Module Device) Sep 4 00:59:03.910302 kernel: ACPI: Added _OSI(Processor Device) Sep 4 00:59:03.910307 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 4 00:59:03.910313 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded Sep 4 00:59:03.910318 kernel: ACPI: Dynamic OEM Table Load: Sep 4 00:59:03.910323 kernel: ACPI: SSDT 0xFFFF8C20C2105C00 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527) Sep 4 00:59:03.910328 kernel: ACPI: Dynamic OEM Table Load: Sep 4 00:59:03.910334 kernel: ACPI: SSDT 0xFFFF8C20C20FD000 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527) Sep 4 00:59:03.910339 kernel: ACPI: Dynamic OEM Table Load: Sep 4 00:59:03.910344 kernel: ACPI: SSDT 0xFFFF8C20C1699A00 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527) Sep 4 00:59:03.910350 kernel: ACPI: Dynamic OEM Table Load: Sep 4 00:59:03.910355 kernel: ACPI: SSDT 0xFFFF8C20C0F99800 0005FC (v02 PmRef ApIst 00003000 INTL 20160527) Sep 4 00:59:03.910360 kernel: ACPI: Dynamic OEM Table Load: Sep 4 00:59:03.910365 kernel: ACPI: SSDT 0xFFFF8C20C0FA4000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527) Sep 4 00:59:03.910370 kernel: ACPI: Dynamic OEM Table Load: Sep 4 00:59:03.910376 kernel: ACPI: SSDT 0xFFFF8C20C1801800 00030A (v02 PmRef ApCst 00003000 INTL 20160527) Sep 4 00:59:03.910381 kernel: ACPI: Interpreter enabled Sep 4 00:59:03.910386 kernel: ACPI: PM: (supports S0 S5) Sep 4 00:59:03.910391 kernel: ACPI: Using IOAPIC for interrupt routing Sep 4 00:59:03.910397 kernel: HEST: Enabling Firmware First mode for corrected errors. Sep 4 00:59:03.910403 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14. Sep 4 00:59:03.910408 kernel: HEST: Table parsing has been initialized. Sep 4 00:59:03.910413 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. 
Sep 4 00:59:03.910418 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 4 00:59:03.910424 kernel: PCI: Using E820 reservations for host bridge windows Sep 4 00:59:03.910429 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F Sep 4 00:59:03.910434 kernel: ACPI: \_SB_.PCI0.XDCI.USBC: New power resource Sep 4 00:59:03.910440 kernel: ACPI: \_SB_.PCI0.SAT0.VOL0.V0PR: New power resource Sep 4 00:59:03.910445 kernel: ACPI: \_SB_.PCI0.SAT0.VOL1.V1PR: New power resource Sep 4 00:59:03.910451 kernel: ACPI: \_SB_.PCI0.SAT0.VOL2.V2PR: New power resource Sep 4 00:59:03.910456 kernel: ACPI: \_SB_.PCI0.CNVW.WRST: New power resource Sep 4 00:59:03.910461 kernel: ACPI: \_TZ_.FN00: New power resource Sep 4 00:59:03.910467 kernel: ACPI: \_TZ_.FN01: New power resource Sep 4 00:59:03.910472 kernel: ACPI: \_TZ_.FN02: New power resource Sep 4 00:59:03.910477 kernel: ACPI: \_TZ_.FN03: New power resource Sep 4 00:59:03.910482 kernel: ACPI: \_TZ_.FN04: New power resource Sep 4 00:59:03.910488 kernel: ACPI: \PIN_: New power resource Sep 4 00:59:03.910493 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe]) Sep 4 00:59:03.910568 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 4 00:59:03.910619 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER] Sep 4 00:59:03.910665 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR] Sep 4 00:59:03.910675 kernel: PCI host bridge to bus 0000:00 Sep 4 00:59:03.910725 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 4 00:59:03.910768 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 4 00:59:03.910811 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 4 00:59:03.910852 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window] Sep 4 00:59:03.910893 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window] Sep 4 00:59:03.910933 kernel: pci_bus 0000:00: root bus resource [bus 00-fe] Sep 4 00:59:03.910993 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000 conventional PCI endpoint Sep 4 00:59:03.911050 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400 PCIe Root Port Sep 4 00:59:03.911101 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 4 00:59:03.911150 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Sep 4 00:59:03.911197 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Sep 4 00:59:03.911245 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold Sep 4 00:59:03.911299 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000 conventional PCI endpoint Sep 4 00:59:03.911347 kernel: pci 0000:00:08.0: BAR 0 [mem 0x9551f000-0x9551ffff 64bit] Sep 4 00:59:03.911399 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000 conventional PCI endpoint Sep 4 00:59:03.911449 kernel: pci 0000:00:12.0: BAR 0 [mem 0x9551e000-0x9551efff 64bit] Sep 4 00:59:03.911500 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330 conventional PCI endpoint Sep 4 00:59:03.911548 kernel: pci 0000:00:14.0: BAR 0 [mem 0x95500000-0x9550ffff 64bit] Sep 4 00:59:03.911595 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold Sep 4 00:59:03.911645 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000 conventional PCI endpoint Sep 4 00:59:03.911696 kernel: pci 0000:00:14.2: BAR 0 [mem 0x95512000-0x95513fff 64bit] Sep 4 00:59:03.911760 kernel: pci 0000:00:14.2: 
BAR 2 [mem 0x9551d000-0x9551dfff 64bit] Sep 4 00:59:03.911818 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000 conventional PCI endpoint Sep 4 00:59:03.911866 kernel: pci 0000:00:15.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 4 00:59:03.911917 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000 conventional PCI endpoint Sep 4 00:59:03.911963 kernel: pci 0000:00:15.1: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 4 00:59:03.912015 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000 conventional PCI endpoint Sep 4 00:59:03.912063 kernel: pci 0000:00:16.0: BAR 0 [mem 0x9551a000-0x9551afff 64bit] Sep 4 00:59:03.912109 kernel: pci 0000:00:16.0: PME# supported from D3hot Sep 4 00:59:03.912160 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000 conventional PCI endpoint Sep 4 00:59:03.912236 kernel: pci 0000:00:16.1: BAR 0 [mem 0x95519000-0x95519fff 64bit] Sep 4 00:59:03.912283 kernel: pci 0000:00:16.1: PME# supported from D3hot Sep 4 00:59:03.912333 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000 conventional PCI endpoint Sep 4 00:59:03.912380 kernel: pci 0000:00:16.4: BAR 0 [mem 0x95518000-0x95518fff 64bit] Sep 4 00:59:03.912427 kernel: pci 0000:00:16.4: PME# supported from D3hot Sep 4 00:59:03.912476 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601 conventional PCI endpoint Sep 4 00:59:03.912522 kernel: pci 0000:00:17.0: BAR 0 [mem 0x95510000-0x95511fff] Sep 4 00:59:03.912568 kernel: pci 0000:00:17.0: BAR 1 [mem 0x95517000-0x955170ff] Sep 4 00:59:03.912615 kernel: pci 0000:00:17.0: BAR 2 [io 0x6050-0x6057] Sep 4 00:59:03.912660 kernel: pci 0000:00:17.0: BAR 3 [io 0x6040-0x6043] Sep 4 00:59:03.912776 kernel: pci 0000:00:17.0: BAR 4 [io 0x6020-0x603f] Sep 4 00:59:03.912822 kernel: pci 0000:00:17.0: BAR 5 [mem 0x95516000-0x955167ff] Sep 4 00:59:03.912867 kernel: pci 0000:00:17.0: PME# supported from D3hot Sep 4 00:59:03.912920 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400 PCIe Root Port Sep 4 00:59:03.912968 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Sep 4 00:59:03.913017 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold Sep 4 00:59:03.913068 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400 PCIe Root Port Sep 4 00:59:03.913118 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Sep 4 00:59:03.913164 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Sep 4 00:59:03.913210 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Sep 4 00:59:03.913256 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold Sep 4 00:59:03.913308 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400 PCIe Root Port Sep 4 00:59:03.913357 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Sep 4 00:59:03.913403 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Sep 4 00:59:03.913449 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Sep 4 00:59:03.913495 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold Sep 4 00:59:03.913545 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400 PCIe Root Port Sep 4 00:59:03.913592 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Sep 4 00:59:03.913638 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold Sep 4 00:59:03.913714 kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400 PCIe Root Port Sep 4 00:59:03.913775 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Sep 4 00:59:03.913822 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Sep 4 00:59:03.913870 kernel: pci 0000:00:1c.3: 
bridge window [mem 0x94000000-0x950fffff] Sep 4 00:59:03.913916 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold Sep 4 00:59:03.913966 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000 conventional PCI endpoint Sep 4 00:59:03.914013 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 4 00:59:03.914065 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100 conventional PCI endpoint Sep 4 00:59:03.914118 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500 conventional PCI endpoint Sep 4 00:59:03.914165 kernel: pci 0000:00:1f.4: BAR 0 [mem 0x95514000-0x955140ff 64bit] Sep 4 00:59:03.914211 kernel: pci 0000:00:1f.4: BAR 4 [io 0xefa0-0xefbf] Sep 4 00:59:03.914262 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000 conventional PCI endpoint Sep 4 00:59:03.914309 kernel: pci 0000:00:1f.5: BAR 0 [mem 0xfe010000-0xfe010fff] Sep 4 00:59:03.914363 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Sep 4 00:59:03.914413 kernel: pci 0000:01:00.0: BAR 0 [mem 0x92000000-0x93ffffff 64bit pref] Sep 4 00:59:03.914460 kernel: pci 0000:01:00.0: ROM [mem 0x95200000-0x952fffff pref] Sep 4 00:59:03.914507 kernel: pci 0000:01:00.0: PME# supported from D3cold Sep 4 00:59:03.914555 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Sep 4 00:59:03.914602 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Sep 4 00:59:03.914655 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Sep 4 00:59:03.914743 kernel: pci 0000:01:00.1: BAR 0 [mem 0x90000000-0x91ffffff 64bit pref] Sep 4 00:59:03.914791 kernel: pci 0000:01:00.1: ROM [mem 0x95100000-0x951fffff pref] Sep 4 00:59:03.914838 kernel: pci 0000:01:00.1: PME# supported from D3cold Sep 4 00:59:03.914885 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Sep 4 00:59:03.914932 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Sep 4 00:59:03.914980 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 4 00:59:03.915028 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Sep 4 00:59:03.915081 kernel: pci 0000:03:00.0: working around ROM BAR overlap defect Sep 4 00:59:03.915129 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 4 00:59:03.915177 kernel: pci 0000:03:00.0: BAR 0 [mem 0x95400000-0x9547ffff] Sep 4 00:59:03.915224 kernel: pci 0000:03:00.0: BAR 2 [io 0x5000-0x501f] Sep 4 00:59:03.915271 kernel: pci 0000:03:00.0: BAR 3 [mem 0x95480000-0x95483fff] Sep 4 00:59:03.915318 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Sep 4 00:59:03.915365 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Sep 4 00:59:03.915420 kernel: pci 0000:04:00.0: working around ROM BAR overlap defect Sep 4 00:59:03.915469 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 4 00:59:03.915516 kernel: pci 0000:04:00.0: BAR 0 [mem 0x95300000-0x9537ffff] Sep 4 00:59:03.915563 kernel: pci 0000:04:00.0: BAR 2 [io 0x4000-0x401f] Sep 4 00:59:03.915611 kernel: pci 0000:04:00.0: BAR 3 [mem 0x95380000-0x95383fff] Sep 4 00:59:03.915658 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold Sep 4 00:59:03.915739 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Sep 4 00:59:03.915788 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Sep 4 00:59:03.915842 kernel: pci 0000:06:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Sep 4 00:59:03.915891 kernel: pci 0000:06:00.0: PCI 
bridge to [bus 07] Sep 4 00:59:03.915938 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Sep 4 00:59:03.915986 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Sep 4 00:59:03.916033 kernel: pci 0000:06:00.0: enabling Extended Tags Sep 4 00:59:03.916081 kernel: pci 0000:06:00.0: supports D1 D2 Sep 4 00:59:03.916131 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 4 00:59:03.916178 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Sep 4 00:59:03.916228 kernel: pci_bus 0000:07: extended config space not accessible Sep 4 00:59:03.916284 kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint Sep 4 00:59:03.916334 kernel: pci 0000:07:00.0: BAR 0 [mem 0x94000000-0x94ffffff] Sep 4 00:59:03.916384 kernel: pci 0000:07:00.0: BAR 1 [mem 0x95000000-0x9501ffff] Sep 4 00:59:03.916433 kernel: pci 0000:07:00.0: BAR 2 [io 0x3000-0x307f] Sep 4 00:59:03.916484 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 4 00:59:03.916534 kernel: pci 0000:07:00.0: supports D1 D2 Sep 4 00:59:03.916584 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 4 00:59:03.916693 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Sep 4 00:59:03.916702 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Sep 4 00:59:03.916724 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Sep 4 00:59:03.916730 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Sep 4 00:59:03.916735 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Sep 4 00:59:03.916742 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Sep 4 00:59:03.916748 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Sep 4 00:59:03.916753 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Sep 4 00:59:03.916758 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Sep 4 00:59:03.916764 kernel: iommu: Default domain type: Translated Sep 4 00:59:03.916769 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 4 00:59:03.916775 kernel: PCI: Using ACPI for IRQ routing Sep 4 00:59:03.916780 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 4 00:59:03.916786 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Sep 4 00:59:03.916792 kernel: e820: reserve RAM buffer [mem 0x81a7a000-0x83ffffff] Sep 4 00:59:03.916797 kernel: e820: reserve RAM buffer [mem 0x8afcd000-0x8bffffff] Sep 4 00:59:03.916803 kernel: e820: reserve RAM buffer [mem 0x8c23b000-0x8fffffff] Sep 4 00:59:03.916808 kernel: e820: reserve RAM buffer [mem 0x8ef00000-0x8fffffff] Sep 4 00:59:03.916813 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff] Sep 4 00:59:03.916862 kernel: pci 0000:07:00.0: vgaarb: setting as boot VGA device Sep 4 00:59:03.916912 kernel: pci 0000:07:00.0: vgaarb: bridge control possible Sep 4 00:59:03.916961 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 4 00:59:03.916969 kernel: vgaarb: loaded Sep 4 00:59:03.916976 kernel: clocksource: Switched to clocksource tsc-early Sep 4 00:59:03.916982 kernel: VFS: Disk quotas dquot_6.6.0 Sep 4 00:59:03.916987 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 4 00:59:03.916993 kernel: pnp: PnP ACPI init Sep 4 00:59:03.917084 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Sep 4 00:59:03.917169 kernel: pnp 00:02: [dma 0 disabled] Sep 4 00:59:03.917269 kernel: pnp 00:03: [dma 0 disabled] Sep 4 00:59:03.917379 kernel: system 00:04: [io 
0x0680-0x069f] has been reserved Sep 4 00:59:03.917424 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Sep 4 00:59:03.917470 kernel: system 00:05: [mem 0xfed10000-0xfed17fff] has been reserved Sep 4 00:59:03.917514 kernel: system 00:05: [mem 0xfed18000-0xfed18fff] has been reserved Sep 4 00:59:03.917558 kernel: system 00:05: [mem 0xfed19000-0xfed19fff] has been reserved Sep 4 00:59:03.917601 kernel: system 00:05: [mem 0xe0000000-0xefffffff] has been reserved Sep 4 00:59:03.917646 kernel: system 00:05: [mem 0xfed20000-0xfed3ffff] has been reserved Sep 4 00:59:03.917697 kernel: system 00:05: [mem 0xfed90000-0xfed93fff] could not be reserved Sep 4 00:59:03.917741 kernel: system 00:05: [mem 0xfed45000-0xfed8ffff] has been reserved Sep 4 00:59:03.917786 kernel: system 00:05: [mem 0xfee00000-0xfeefffff] could not be reserved Sep 4 00:59:03.917833 kernel: system 00:06: [io 0x1800-0x18fe] could not be reserved Sep 4 00:59:03.917878 kernel: system 00:06: [mem 0xfd000000-0xfd69ffff] has been reserved Sep 4 00:59:03.917922 kernel: system 00:06: [mem 0xfd6c0000-0xfd6cffff] has been reserved Sep 4 00:59:03.917967 kernel: system 00:06: [mem 0xfd6f0000-0xfdffffff] has been reserved Sep 4 00:59:03.918040 kernel: system 00:06: [mem 0xfe000000-0xfe01ffff] could not be reserved Sep 4 00:59:03.918137 kernel: system 00:06: [mem 0xfe200000-0xfe7fffff] has been reserved Sep 4 00:59:03.918222 kernel: system 00:06: [mem 0xff000000-0xffffffff] has been reserved Sep 4 00:59:03.918354 kernel: system 00:07: [io 0x2000-0x20fe] has been reserved Sep 4 00:59:03.918384 kernel: pnp: PnP ACPI: found 9 devices Sep 4 00:59:03.918390 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 4 00:59:03.918398 kernel: NET: Registered PF_INET protocol family Sep 4 00:59:03.918406 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 4 00:59:03.918412 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Sep 4 00:59:03.918418 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 4 00:59:03.918424 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 4 00:59:03.918430 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Sep 4 00:59:03.918437 kernel: TCP: Hash tables configured (established 262144 bind 65536) Sep 4 00:59:03.918442 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 4 00:59:03.918448 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 4 00:59:03.918472 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 4 00:59:03.918478 kernel: NET: Registered PF_XDP protocol family Sep 4 00:59:03.918570 kernel: pci 0000:00:15.0: BAR 0 [mem 0x95515000-0x95515fff 64bit]: assigned Sep 4 00:59:03.918632 kernel: pci 0000:00:15.1: BAR 0 [mem 0x9551b000-0x9551bfff 64bit]: assigned Sep 4 00:59:03.918709 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x9551c000-0x9551cfff 64bit]: assigned Sep 4 00:59:03.918760 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Sep 4 00:59:03.918810 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 4 00:59:03.918862 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Sep 4 00:59:03.918911 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 4 00:59:03.918961 kernel: pci 0000:00:01.0: PCI bridge to 
[bus 01] Sep 4 00:59:03.919009 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Sep 4 00:59:03.919088 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Sep 4 00:59:03.919181 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Sep 4 00:59:03.919270 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Sep 4 00:59:03.919349 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Sep 4 00:59:03.919396 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Sep 4 00:59:03.919444 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Sep 4 00:59:03.919493 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Sep 4 00:59:03.919542 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Sep 4 00:59:03.919589 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Sep 4 00:59:03.919637 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Sep 4 00:59:03.919704 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Sep 4 00:59:03.919769 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Sep 4 00:59:03.919851 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Sep 4 00:59:03.919916 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Sep 4 00:59:03.919977 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Sep 4 00:59:03.920021 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Sep 4 00:59:03.920063 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 4 00:59:03.920104 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 4 00:59:03.920146 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 4 00:59:03.920187 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window] Sep 4 00:59:03.920228 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Sep 4 00:59:03.920277 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff] Sep 4 00:59:03.920324 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Sep 4 00:59:03.920372 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff] Sep 4 00:59:03.920415 kernel: pci_bus 0000:03: resource 1 [mem 0x95400000-0x954fffff] Sep 4 00:59:03.920465 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Sep 4 00:59:03.920509 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff] Sep 4 00:59:03.920557 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff] Sep 4 00:59:03.920601 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff] Sep 4 00:59:03.920647 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Sep 4 00:59:03.920724 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff] Sep 4 00:59:03.920745 kernel: PCI: CLS 64 bytes, default 64 Sep 4 00:59:03.920751 kernel: DMAR: No ATSR found Sep 4 00:59:03.920757 kernel: DMAR: No SATC found Sep 4 00:59:03.920762 kernel: DMAR: dmar0: Using Queued invalidation Sep 4 00:59:03.920811 kernel: pci 0000:00:00.0: Adding to iommu group 0 Sep 4 00:59:03.920859 kernel: pci 0000:00:01.0: Adding to iommu group 1 Sep 4 00:59:03.920905 kernel: pci 0000:00:08.0: Adding to iommu group 2 Sep 4 00:59:03.920952 kernel: pci 0000:00:12.0: Adding to iommu group 3 Sep 4 00:59:03.920999 kernel: pci 0000:00:14.0: Adding to iommu group 4 Sep 4 00:59:03.921046 kernel: pci 0000:00:14.2: Adding to iommu group 4 Sep 4 00:59:03.921092 kernel: pci 0000:00:15.0: Adding to iommu group 5 Sep 4 00:59:03.921138 kernel: pci 0000:00:15.1: Adding to iommu group 5 Sep 4 00:59:03.921186 kernel: pci 
0000:00:16.0: Adding to iommu group 6 Sep 4 00:59:03.921232 kernel: pci 0000:00:16.1: Adding to iommu group 6 Sep 4 00:59:03.921308 kernel: pci 0000:00:16.4: Adding to iommu group 6 Sep 4 00:59:03.921354 kernel: pci 0000:00:17.0: Adding to iommu group 7 Sep 4 00:59:03.921400 kernel: pci 0000:00:1b.0: Adding to iommu group 8 Sep 4 00:59:03.921448 kernel: pci 0000:00:1b.4: Adding to iommu group 9 Sep 4 00:59:03.921494 kernel: pci 0000:00:1b.5: Adding to iommu group 10 Sep 4 00:59:03.921541 kernel: pci 0000:00:1c.0: Adding to iommu group 11 Sep 4 00:59:03.921590 kernel: pci 0000:00:1c.3: Adding to iommu group 12 Sep 4 00:59:03.921637 kernel: pci 0000:00:1e.0: Adding to iommu group 13 Sep 4 00:59:03.921705 kernel: pci 0000:00:1f.0: Adding to iommu group 14 Sep 4 00:59:03.921780 kernel: pci 0000:00:1f.4: Adding to iommu group 14 Sep 4 00:59:03.921826 kernel: pci 0000:00:1f.5: Adding to iommu group 14 Sep 4 00:59:03.921874 kernel: pci 0000:01:00.0: Adding to iommu group 1 Sep 4 00:59:03.921922 kernel: pci 0000:01:00.1: Adding to iommu group 1 Sep 4 00:59:03.921970 kernel: pci 0000:03:00.0: Adding to iommu group 15 Sep 4 00:59:03.922019 kernel: pci 0000:04:00.0: Adding to iommu group 16 Sep 4 00:59:03.922067 kernel: pci 0000:06:00.0: Adding to iommu group 17 Sep 4 00:59:03.922116 kernel: pci 0000:07:00.0: Adding to iommu group 17 Sep 4 00:59:03.922124 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Sep 4 00:59:03.922130 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Sep 4 00:59:03.922136 kernel: software IO TLB: mapped [mem 0x0000000086fcd000-0x000000008afcd000] (64MB) Sep 4 00:59:03.922142 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer Sep 4 00:59:03.922147 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Sep 4 00:59:03.922154 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Sep 4 00:59:03.922160 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Sep 4 00:59:03.922208 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Sep 4 00:59:03.922217 kernel: Initialise system trusted keyrings Sep 4 00:59:03.922222 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Sep 4 00:59:03.922228 kernel: Key type asymmetric registered Sep 4 00:59:03.922233 kernel: Asymmetric key parser 'x509' registered Sep 4 00:59:03.922239 kernel: tsc: Refined TSC clocksource calibration: 3408.000 MHz Sep 4 00:59:03.922244 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Sep 4 00:59:03.922251 kernel: clocksource: Switched to clocksource tsc Sep 4 00:59:03.922257 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 4 00:59:03.922263 kernel: io scheduler mq-deadline registered Sep 4 00:59:03.922268 kernel: io scheduler kyber registered Sep 4 00:59:03.922273 kernel: io scheduler bfq registered Sep 4 00:59:03.922320 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121 Sep 4 00:59:03.922367 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122 Sep 4 00:59:03.922413 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123 Sep 4 00:59:03.922461 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124 Sep 4 00:59:03.922509 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 125 Sep 4 00:59:03.922555 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126 Sep 4 00:59:03.922607 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Sep 4 00:59:03.922616 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 
C) Sep 4 00:59:03.922621 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Sep 4 00:59:03.922627 kernel: pstore: Using crash dump compression: deflate Sep 4 00:59:03.922632 kernel: pstore: Registered erst as persistent store backend Sep 4 00:59:03.922637 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 4 00:59:03.922660 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 4 00:59:03.922666 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 4 00:59:03.922675 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Sep 4 00:59:03.922695 kernel: hpet_acpi_add: no address or irqs in _CRS Sep 4 00:59:03.922775 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Sep 4 00:59:03.922783 kernel: i8042: PNP: No PS/2 controller found. Sep 4 00:59:03.922825 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Sep 4 00:59:03.922868 kernel: rtc_cmos rtc_cmos: registered as rtc0 Sep 4 00:59:03.922913 kernel: rtc_cmos rtc_cmos: setting system clock to 2025-09-04T00:59:02 UTC (1756947542) Sep 4 00:59:03.922956 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Sep 4 00:59:03.922964 kernel: intel_pstate: Intel P-state driver initializing Sep 4 00:59:03.922970 kernel: intel_pstate: Disabling energy efficiency optimization Sep 4 00:59:03.922975 kernel: intel_pstate: HWP enabled Sep 4 00:59:03.922981 kernel: NET: Registered PF_INET6 protocol family Sep 4 00:59:03.922986 kernel: Segment Routing with IPv6 Sep 4 00:59:03.922991 kernel: In-situ OAM (IOAM) with IPv6 Sep 4 00:59:03.922998 kernel: NET: Registered PF_PACKET protocol family Sep 4 00:59:03.923004 kernel: Key type dns_resolver registered Sep 4 00:59:03.923009 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance' Sep 4 00:59:03.923014 kernel: microcode: Current revision: 0x000000f4 Sep 4 00:59:03.923019 kernel: IPI shorthand broadcast: enabled Sep 4 00:59:03.923025 kernel: sched_clock: Marking stable (4610291102, 1495199159)->(6731631884, -626141623) Sep 4 00:59:03.923031 kernel: registered taskstats version 1 Sep 4 00:59:03.923036 kernel: Loading compiled-in X.509 certificates Sep 4 00:59:03.923041 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.44-flatcar: 247a8159a15e16f8eb89737aa66cd9cf9bbb3c10' Sep 4 00:59:03.923048 kernel: Demotion targets for Node 0: null Sep 4 00:59:03.923053 kernel: Key type .fscrypt registered Sep 4 00:59:03.923058 kernel: Key type fscrypt-provisioning registered Sep 4 00:59:03.923064 kernel: ima: Allocated hash algorithm: sha1 Sep 4 00:59:03.923069 kernel: ima: No architecture policies found Sep 4 00:59:03.923074 kernel: clk: Disabling unused clocks Sep 4 00:59:03.923080 kernel: Warning: unable to open an initial console. Sep 4 00:59:03.923085 kernel: Freeing unused kernel image (initmem) memory: 53832K Sep 4 00:59:03.923091 kernel: Write protecting the kernel read-only data: 24576k Sep 4 00:59:03.923097 kernel: Freeing unused kernel image (rodata/data gap) memory: 284K Sep 4 00:59:03.923103 kernel: Run /init as init process Sep 4 00:59:03.923108 kernel: with arguments: Sep 4 00:59:03.923114 kernel: /init Sep 4 00:59:03.923119 kernel: with environment: Sep 4 00:59:03.923124 kernel: HOME=/ Sep 4 00:59:03.923129 kernel: TERM=linux Sep 4 00:59:03.923135 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 4 00:59:03.923141 systemd[1]: Successfully made /usr/ read-only. 
Sep 4 00:59:03.923149 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 4 00:59:03.923155 systemd[1]: Detected architecture x86-64. Sep 4 00:59:03.923161 systemd[1]: Running in initrd. Sep 4 00:59:03.923166 systemd[1]: No hostname configured, using default hostname. Sep 4 00:59:03.923172 systemd[1]: Hostname set to . Sep 4 00:59:03.923178 systemd[1]: Initializing machine ID from random generator. Sep 4 00:59:03.923184 systemd[1]: Queued start job for default target initrd.target. Sep 4 00:59:03.923191 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 00:59:03.923196 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 00:59:03.923202 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 4 00:59:03.923208 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 4 00:59:03.923214 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 4 00:59:03.923220 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 4 00:59:03.923226 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 4 00:59:03.923233 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 4 00:59:03.923239 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 00:59:03.923245 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 4 00:59:03.923250 systemd[1]: Reached target paths.target - Path Units. Sep 4 00:59:03.923256 systemd[1]: Reached target slices.target - Slice Units. Sep 4 00:59:03.923262 systemd[1]: Reached target swap.target - Swaps. Sep 4 00:59:03.923268 systemd[1]: Reached target timers.target - Timer Units. Sep 4 00:59:03.923274 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 4 00:59:03.923280 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 4 00:59:03.923286 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 4 00:59:03.923292 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 4 00:59:03.923298 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 4 00:59:03.923304 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 4 00:59:03.923309 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 00:59:03.923315 systemd[1]: Reached target sockets.target - Socket Units. Sep 4 00:59:03.923321 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 4 00:59:03.923327 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 4 00:59:03.923333 systemd[1]: Finished network-cleanup.service - Network Cleanup. 
Sep 4 00:59:03.923340 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 4 00:59:03.923345 systemd[1]: Starting systemd-fsck-usr.service... Sep 4 00:59:03.923351 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 4 00:59:03.923357 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 4 00:59:03.923373 systemd-journald[297]: Collecting audit messages is disabled. Sep 4 00:59:03.923388 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 00:59:03.923394 systemd-journald[297]: Journal started Sep 4 00:59:03.923408 systemd-journald[297]: Runtime Journal (/run/log/journal/e9ad2b00da714333872708024d34c026) is 8M, max 640.1M, 632.1M free. Sep 4 00:59:03.917332 systemd-modules-load[300]: Inserted module 'overlay' Sep 4 00:59:03.935336 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 4 00:59:03.971596 systemd[1]: Started systemd-journald.service - Journal Service. Sep 4 00:59:03.971612 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 4 00:59:03.971621 kernel: Bridge firewalling registered Sep 4 00:59:03.940731 systemd-modules-load[300]: Inserted module 'br_netfilter' Sep 4 00:59:03.971663 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 00:59:04.006961 systemd[1]: Finished systemd-fsck-usr.service. Sep 4 00:59:04.031983 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 4 00:59:04.042031 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 00:59:04.065747 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 4 00:59:04.091451 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 4 00:59:04.105269 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 4 00:59:04.110606 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 4 00:59:04.116518 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 4 00:59:04.116760 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 4 00:59:04.118143 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 4 00:59:04.119868 systemd-tmpfiles[320]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 4 00:59:04.123787 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 00:59:04.125785 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 4 00:59:04.130155 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 4 00:59:04.139885 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 4 00:59:04.149073 systemd-resolved[339]: Positive Trust Anchors: Sep 4 00:59:04.149078 systemd-resolved[339]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 4 00:59:04.149100 systemd-resolved[339]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 4 00:59:04.150676 systemd-resolved[339]: Defaulting to hostname 'linux'. Sep 4 00:59:04.172952 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 4 00:59:04.188068 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 4 00:59:04.201548 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 4 00:59:04.305436 dracut-cmdline[345]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=c7fa427551c105672074cbcbe7e23c997f471a6e879d708e8d6cbfad2147666e Sep 4 00:59:04.390724 kernel: SCSI subsystem initialized Sep 4 00:59:04.403673 kernel: Loading iSCSI transport class v2.0-870. Sep 4 00:59:04.416745 kernel: iscsi: registered transport (tcp) Sep 4 00:59:04.440162 kernel: iscsi: registered transport (qla4xxx) Sep 4 00:59:04.440185 kernel: QLogic iSCSI HBA Driver Sep 4 00:59:04.450242 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 4 00:59:04.491397 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 00:59:04.506450 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 4 00:59:04.629715 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 4 00:59:04.642571 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 4 00:59:04.762715 kernel: raid6: avx2x4 gen() 17304 MB/s Sep 4 00:59:04.783717 kernel: raid6: avx2x2 gen() 39197 MB/s Sep 4 00:59:04.809785 kernel: raid6: avx2x1 gen() 46261 MB/s Sep 4 00:59:04.809800 kernel: raid6: using algorithm avx2x1 gen() 46261 MB/s Sep 4 00:59:04.836879 kernel: raid6: .... xor() 24938 MB/s, rmw enabled Sep 4 00:59:04.836895 kernel: raid6: using avx2x2 recovery algorithm Sep 4 00:59:04.857675 kernel: xor: automatically using best checksumming function avx Sep 4 00:59:04.961705 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 4 00:59:04.965064 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 4 00:59:04.975832 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 00:59:05.023371 systemd-udevd[558]: Using default interface naming scheme 'v255'. Sep 4 00:59:05.026657 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 00:59:05.052390 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Sep 4 00:59:05.089770 dracut-pre-trigger[569]: rd.md=0: removing MD RAID activation Sep 4 00:59:05.103937 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 4 00:59:05.115964 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 4 00:59:05.248101 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 00:59:05.266681 kernel: cryptd: max_cpu_qlen set to 1000 Sep 4 00:59:05.266883 kernel: pps_core: LinuxPPS API ver. 1 registered Sep 4 00:59:05.283326 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Sep 4 00:59:05.289676 kernel: libata version 3.00 loaded. Sep 4 00:59:05.291810 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 4 00:59:05.330759 kernel: PTP clock support registered Sep 4 00:59:05.330778 kernel: ACPI: bus type USB registered Sep 4 00:59:05.330785 kernel: usbcore: registered new interface driver usbfs Sep 4 00:59:05.330794 kernel: usbcore: registered new interface driver hub Sep 4 00:59:05.330802 kernel: usbcore: registered new device driver usb Sep 4 00:59:05.330809 kernel: AES CTR mode by8 optimization enabled Sep 4 00:59:05.333675 kernel: ahci 0000:00:17.0: version 3.0 Sep 4 00:59:05.338718 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 00:59:05.370378 kernel: ahci 0000:00:17.0: AHCI vers 0001.0301, 32 command slots, 6 Gbps, SATA mode Sep 4 00:59:05.370493 kernel: ahci 0000:00:17.0: 7/7 ports implemented (port mask 0x7f) Sep 4 00:59:05.370559 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Sep 4 00:59:05.338851 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 00:59:05.566816 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Sep 4 00:59:05.566916 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Sep 4 00:59:05.566989 kernel: scsi host0: ahci Sep 4 00:59:05.567062 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Sep 4 00:59:05.567151 kernel: scsi host1: ahci Sep 4 00:59:05.567226 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Sep 4 00:59:05.567312 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Sep 4 00:59:05.567324 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Sep 4 00:59:05.567402 kernel: scsi host2: ahci Sep 4 00:59:05.567483 kernel: scsi host3: ahci Sep 4 00:59:05.567564 kernel: scsi host4: ahci Sep 4 00:59:05.567648 kernel: scsi host5: ahci Sep 4 00:59:05.567740 kernel: scsi host6: ahci Sep 4 00:59:05.567827 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 127 lpm-pol 0 Sep 4 00:59:05.567838 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 127 lpm-pol 0 Sep 4 00:59:05.567849 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 127 lpm-pol 0 Sep 4 00:59:05.567860 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 127 lpm-pol 0 Sep 4 00:59:05.567870 kernel: ata5: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516300 irq 127 lpm-pol 0 Sep 4 00:59:05.567880 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 127 lpm-pol 0 Sep 4 00:59:05.567889 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 127 lpm-pol 0 Sep 4 00:59:05.567900 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. 
Sep 4 00:59:05.567911 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Sep 4 00:59:05.568002 kernel: igb 0000:03:00.0: added PHC on eth0 Sep 4 00:59:05.568081 kernel: hub 1-0:1.0: USB hub found Sep 4 00:59:05.568171 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection Sep 4 00:59:05.568264 kernel: hub 1-0:1.0: 16 ports detected Sep 4 00:59:05.568348 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:6a:ef:e6 Sep 4 00:59:05.568438 kernel: hub 2-0:1.0: USB hub found Sep 4 00:59:05.568534 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000 Sep 4 00:59:05.568625 kernel: hub 2-0:1.0: 10 ports detected Sep 4 00:59:05.568717 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Sep 4 00:59:05.465532 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 00:59:05.584347 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 00:59:05.640798 kernel: igb 0000:04:00.0: added PHC on eth1 Sep 4 00:59:05.640888 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Sep 4 00:59:05.640958 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:6a:ef:e7 Sep 4 00:59:05.641024 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000 Sep 4 00:59:05.641089 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Sep 4 00:59:05.593159 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 4 00:59:05.701746 kernel: mlx5_core 0000:01:00.0: PTM is not supported by PCIe Sep 4 00:59:05.701843 kernel: mlx5_core 0000:01:00.0: firmware version: 14.31.1014 Sep 4 00:59:05.701910 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Sep 4 00:59:05.722736 kernel: ata7: SATA link down (SStatus 0 SControl 300) Sep 4 00:59:05.722753 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 4 00:59:05.722761 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Sep 4 00:59:05.722768 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 4 00:59:05.722775 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 4 00:59:05.722785 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Sep 4 00:59:05.722791 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 4 00:59:05.722798 kernel: ata1.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim Sep 4 00:59:05.722805 kernel: ata1.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Sep 4 00:59:05.722812 kernel: ata2.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim Sep 4 00:59:05.722819 kernel: ata2.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Sep 4 00:59:05.726725 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Sep 4 00:59:05.726742 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Sep 4 00:59:05.733708 kernel: ata1.00: Features: NCQ-prio Sep 4 00:59:05.734717 kernel: ata2.00: Features: NCQ-prio Sep 4 00:59:05.737706 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Sep 4 00:59:05.844709 kernel: ata1.00: configured for UDMA/133 Sep 4 00:59:05.844728 kernel: ata2.00: configured for UDMA/133 Sep 4 00:59:05.844750 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Sep 4 00:59:05.857728 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Sep 4 00:59:05.857667 systemd[1]: Finished 
systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 00:59:06.028453 kernel: hub 1-14:1.0: USB hub found Sep 4 00:59:06.028564 kernel: hub 1-14:1.0: 4 ports detected Sep 4 00:59:06.028639 kernel: igb 0000:04:00.0 eno2: renamed from eth1 Sep 4 00:59:06.028714 kernel: ata2.00: Enabling discard_zeroes_data Sep 4 00:59:06.028723 kernel: igb 0000:03:00.0 eno1: renamed from eth0 Sep 4 00:59:06.028788 kernel: sd 1:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Sep 4 00:59:06.028859 kernel: ata1.00: Enabling discard_zeroes_data Sep 4 00:59:06.028869 kernel: sd 0:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Sep 4 00:59:06.028936 kernel: sd 0:0:0:0: [sdb] 4096-byte physical blocks Sep 4 00:59:06.028997 kernel: sd 0:0:0:0: [sdb] Write Protect is off Sep 4 00:59:06.029054 kernel: sd 0:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Sep 4 00:59:06.029112 kernel: sd 0:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 4 00:59:06.029168 kernel: sd 0:0:0:0: [sdb] Preferred minimum I/O size 4096 bytes Sep 4 00:59:06.029226 kernel: ata1.00: Enabling discard_zeroes_data Sep 4 00:59:06.029233 kernel: sd 1:0:0:0: [sda] 4096-byte physical blocks Sep 4 00:59:06.029291 kernel: sd 1:0:0:0: [sda] Write Protect is off Sep 4 00:59:06.029350 kernel: sd 1:0:0:0: [sda] Mode Sense: 00 3a 00 00 Sep 4 00:59:06.029408 kernel: sd 1:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 4 00:59:06.029465 kernel: sd 1:0:0:0: [sda] Preferred minimum I/O size 4096 bytes Sep 4 00:59:06.029522 kernel: ata2.00: Enabling discard_zeroes_data Sep 4 00:59:06.029529 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(128) max mc(2048) Sep 4 00:59:06.029595 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 4 00:59:06.029605 kernel: sd 1:0:0:0: [sda] Attached SCSI disk Sep 4 00:59:06.029666 kernel: mlx5_core 0000:01:00.0: Port module event: module 0, Cable plugged Sep 4 00:59:06.029734 kernel: GPT:9289727 != 937703087 Sep 4 00:59:06.029742 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 4 00:59:06.029748 kernel: GPT:9289727 != 937703087 Sep 4 00:59:06.029754 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 4 00:59:06.029761 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Sep 4 00:59:06.045623 kernel: sd 0:0:0:0: [sdb] Attached SCSI disk Sep 4 00:59:06.077793 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Micron_5300_MTFDDAK480TDT ROOT. Sep 4 00:59:06.098887 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Micron_5300_MTFDDAK480TDT EFI-SYSTEM. Sep 4 00:59:06.110202 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM. Sep 4 00:59:06.135388 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Micron_5300_MTFDDAK480TDT USR-A. Sep 4 00:59:06.170853 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Sep 4 00:59:06.149902 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Micron_5300_MTFDDAK480TDT USR-A. Sep 4 00:59:06.181337 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 4 00:59:06.230660 disk-uuid[773]: Primary Header is updated. Sep 4 00:59:06.230660 disk-uuid[773]: Secondary Entries is updated. Sep 4 00:59:06.230660 disk-uuid[773]: Secondary Header is updated. 
Sep 4 00:59:06.289772 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Sep 4 00:59:06.289867 kernel: ata1.00: Enabling discard_zeroes_data Sep 4 00:59:06.289876 kernel: mlx5_core 0000:01:00.1: PTM is not supported by PCIe Sep 4 00:59:06.289954 kernel: mlx5_core 0000:01:00.1: firmware version: 14.31.1014 Sep 4 00:59:06.290037 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Sep 4 00:59:06.290098 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Sep 4 00:59:06.290106 kernel: ata1.00: Enabling discard_zeroes_data Sep 4 00:59:06.296715 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Sep 4 00:59:06.312734 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 4 00:59:06.325052 kernel: usbcore: registered new interface driver usbhid Sep 4 00:59:06.325074 kernel: usbhid: USB HID core driver Sep 4 00:59:06.339677 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Sep 4 00:59:06.413635 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Sep 4 00:59:06.413805 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Sep 4 00:59:06.425551 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Sep 4 00:59:06.567695 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(128) max mc(2048) Sep 4 00:59:06.580880 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged Sep 4 00:59:06.872716 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Sep 4 00:59:06.882708 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth1 Sep 4 00:59:06.882955 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: renamed from eth0 Sep 4 00:59:06.897663 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 4 00:59:06.916144 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 4 00:59:06.926855 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 00:59:06.947776 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 4 00:59:06.969129 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 4 00:59:07.030207 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 4 00:59:07.276677 kernel: ata1.00: Enabling discard_zeroes_data Sep 4 00:59:07.288984 disk-uuid[774]: The operation has completed successfully. Sep 4 00:59:07.295770 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Sep 4 00:59:07.325334 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 4 00:59:07.325381 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 4 00:59:07.359940 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 4 00:59:07.390297 sh[826]: Success Sep 4 00:59:07.422492 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Sep 4 00:59:07.422511 kernel: device-mapper: uevent: version 1.0.3 Sep 4 00:59:07.431723 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 4 00:59:07.443723 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 4 00:59:07.489121 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 4 00:59:07.499011 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 4 00:59:07.530866 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 4 00:59:07.578769 kernel: BTRFS: device fsid 8a9c2e34-3d3c-49a9-acce-59bf90003071 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (838) Sep 4 00:59:07.578784 kernel: BTRFS info (device dm-0): first mount of filesystem 8a9c2e34-3d3c-49a9-acce-59bf90003071 Sep 4 00:59:07.578792 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 4 00:59:07.594691 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 4 00:59:07.594709 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 4 00:59:07.600792 kernel: BTRFS info (device dm-0): enabling free space tree Sep 4 00:59:07.603066 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 4 00:59:07.611052 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 4 00:59:07.634928 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 4 00:59:07.635411 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 4 00:59:07.651352 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 4 00:59:07.707674 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sdb6 (8:22) scanned by mount (861) Sep 4 00:59:07.725365 kernel: BTRFS info (device sdb6): first mount of filesystem 75efd3be-3398-4525-8f67-b36cc847539d Sep 4 00:59:07.725383 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Sep 4 00:59:07.740427 kernel: BTRFS info (device sdb6): enabling ssd optimizations Sep 4 00:59:07.740446 kernel: BTRFS info (device sdb6): turning on async discard Sep 4 00:59:07.746532 kernel: BTRFS info (device sdb6): enabling free space tree Sep 4 00:59:07.755395 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 4 00:59:07.781066 kernel: BTRFS info (device sdb6): last unmount of filesystem 75efd3be-3398-4525-8f67-b36cc847539d Sep 4 00:59:07.771389 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 4 00:59:07.793394 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 4 00:59:07.811365 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 4 00:59:07.852194 systemd-networkd[1008]: lo: Link UP Sep 4 00:59:07.852197 systemd-networkd[1008]: lo: Gained carrier Sep 4 00:59:07.854601 systemd-networkd[1008]: Enumeration completed Sep 4 00:59:07.854676 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 4 00:59:07.855285 systemd-networkd[1008]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 4 00:59:07.871794 systemd[1]: Reached target network.target - Network. Sep 4 00:59:07.883724 systemd-networkd[1008]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 4 00:59:07.911203 systemd-networkd[1008]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Sep 4 00:59:07.933133 ignition[1007]: Ignition 2.21.0 Sep 4 00:59:07.933140 ignition[1007]: Stage: fetch-offline Sep 4 00:59:07.936203 unknown[1007]: fetched base config from "system" Sep 4 00:59:07.933164 ignition[1007]: no configs at "/usr/lib/ignition/base.d" Sep 4 00:59:07.936207 unknown[1007]: fetched user config from "system" Sep 4 00:59:07.933171 ignition[1007]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 4 00:59:07.937439 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 4 00:59:07.933253 ignition[1007]: parsed url from cmdline: "" Sep 4 00:59:07.952016 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 4 00:59:07.933256 ignition[1007]: no config URL provided Sep 4 00:59:07.952461 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 4 00:59:07.933260 ignition[1007]: reading system config file "/usr/lib/ignition/user.ign" Sep 4 00:59:07.933296 ignition[1007]: parsing config with SHA512: 64daf4bfb336a6af7f517fc769d145cf54e507fb33674f5a8cb41524b9f5ff750d72afef6ba06eb26f4ada158a06e4aa101eb23f9588a99c1bab0550c74d419c Sep 4 00:59:07.936388 ignition[1007]: fetch-offline: fetch-offline passed Sep 4 00:59:07.936392 ignition[1007]: POST message to Packet Timeline Sep 4 00:59:07.936396 ignition[1007]: POST Status error: resource requires networking Sep 4 00:59:07.936427 ignition[1007]: Ignition finished successfully Sep 4 00:59:07.991398 ignition[1024]: Ignition 2.21.0 Sep 4 00:59:07.991402 ignition[1024]: Stage: kargs Sep 4 00:59:07.991491 ignition[1024]: no configs at "/usr/lib/ignition/base.d" Sep 4 00:59:08.097402 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Sep 4 00:59:07.991496 ignition[1024]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 4 00:59:07.992223 ignition[1024]: kargs: kargs passed Sep 4 00:59:08.098229 systemd-networkd[1008]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Sep 4 00:59:07.992227 ignition[1024]: POST message to Packet Timeline Sep 4 00:59:07.992240 ignition[1024]: GET https://metadata.packet.net/metadata: attempt #1 Sep 4 00:59:07.992819 ignition[1024]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:44015->[::1]:53: read: connection refused Sep 4 00:59:08.193022 ignition[1024]: GET https://metadata.packet.net/metadata: attempt #2 Sep 4 00:59:08.194074 ignition[1024]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:35253->[::1]:53: read: connection refused Sep 4 00:59:08.325787 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Sep 4 00:59:08.326976 systemd-networkd[1008]: eno1: Link UP Sep 4 00:59:08.327148 systemd-networkd[1008]: eno2: Link UP Sep 4 00:59:08.327312 systemd-networkd[1008]: enp1s0f0np0: Link UP Sep 4 00:59:08.327499 systemd-networkd[1008]: enp1s0f0np0: Gained carrier Sep 4 00:59:08.341201 systemd-networkd[1008]: enp1s0f1np1: Link UP Sep 4 00:59:08.342520 systemd-networkd[1008]: enp1s0f1np1: Gained carrier Sep 4 00:59:08.378856 systemd-networkd[1008]: enp1s0f0np0: DHCPv4 address 147.75.202.229/31, gateway 147.75.202.228 acquired from 145.40.83.140 Sep 4 00:59:08.594483 ignition[1024]: GET https://metadata.packet.net/metadata: attempt #3 Sep 4 00:59:08.595731 ignition[1024]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:44408->[::1]:53: read: connection refused Sep 4 00:59:09.395983 ignition[1024]: GET https://metadata.packet.net/metadata: attempt #4 Sep 4 00:59:09.397273 ignition[1024]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:42771->[::1]:53: read: connection refused Sep 4 00:59:09.565988 systemd-networkd[1008]: enp1s0f0np0: Gained IPv6LL Sep 4 00:59:10.077981 systemd-networkd[1008]: enp1s0f1np1: Gained IPv6LL Sep 4 00:59:10.999002 ignition[1024]: GET https://metadata.packet.net/metadata: attempt #5 Sep 4 00:59:11.000179 ignition[1024]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:56122->[::1]:53: read: connection refused Sep 4 00:59:14.203416 ignition[1024]: GET https://metadata.packet.net/metadata: attempt #6 Sep 4 00:59:15.216488 ignition[1024]: GET result: OK Sep 4 00:59:15.614894 ignition[1024]: Ignition finished successfully Sep 4 00:59:15.620630 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 4 00:59:15.631562 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 4 00:59:15.675762 ignition[1043]: Ignition 2.21.0 Sep 4 00:59:15.675767 ignition[1043]: Stage: disks Sep 4 00:59:15.675852 ignition[1043]: no configs at "/usr/lib/ignition/base.d" Sep 4 00:59:15.675858 ignition[1043]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 4 00:59:15.676461 ignition[1043]: disks: disks passed Sep 4 00:59:15.676464 ignition[1043]: POST message to Packet Timeline Sep 4 00:59:15.676484 ignition[1043]: GET https://metadata.packet.net/metadata: attempt #1 Sep 4 00:59:16.763906 ignition[1043]: GET result: OK Sep 4 00:59:17.246547 ignition[1043]: Ignition finished successfully Sep 4 00:59:17.251364 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 4 00:59:17.263913 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. 
Sep 4 00:59:17.282006 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 4 00:59:17.301082 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 4 00:59:17.311236 systemd[1]: Reached target sysinit.target - System Initialization. Sep 4 00:59:17.327229 systemd[1]: Reached target basic.target - Basic System. Sep 4 00:59:17.345964 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 4 00:59:17.400232 systemd-fsck[1063]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 4 00:59:17.409203 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 4 00:59:17.424285 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 4 00:59:17.529726 kernel: EXT4-fs (sdb9): mounted filesystem c3518c93-f823-4477-a620-ff9666a59be5 r/w with ordered data mode. Quota mode: none. Sep 4 00:59:17.530263 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 4 00:59:17.539176 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 4 00:59:17.546135 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 4 00:59:17.562842 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 4 00:59:17.577005 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 4 00:59:17.599679 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent... Sep 4 00:59:17.662858 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sdb6 (8:22) scanned by mount (1073) Sep 4 00:59:17.662872 kernel: BTRFS info (device sdb6): first mount of filesystem 75efd3be-3398-4525-8f67-b36cc847539d Sep 4 00:59:17.662880 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Sep 4 00:59:17.662887 kernel: BTRFS info (device sdb6): enabling ssd optimizations Sep 4 00:59:17.662893 kernel: BTRFS info (device sdb6): turning on async discard Sep 4 00:59:17.662900 kernel: BTRFS info (device sdb6): enabling free space tree Sep 4 00:59:17.662797 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 4 00:59:17.662819 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 4 00:59:17.715853 coreos-metadata[1075]: Sep 04 00:59:17.688 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 4 00:59:17.681753 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 4 00:59:17.734933 coreos-metadata[1076]: Sep 04 00:59:17.688 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 4 00:59:17.705909 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 4 00:59:17.723912 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 4 00:59:17.798466 initrd-setup-root[1105]: cut: /sysroot/etc/passwd: No such file or directory Sep 4 00:59:17.807748 initrd-setup-root[1112]: cut: /sysroot/etc/group: No such file or directory Sep 4 00:59:17.817787 initrd-setup-root[1119]: cut: /sysroot/etc/shadow: No such file or directory Sep 4 00:59:17.826771 initrd-setup-root[1126]: cut: /sysroot/etc/gshadow: No such file or directory Sep 4 00:59:17.865883 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 4 00:59:17.875518 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 4 00:59:17.884429 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
Sep 4 00:59:17.921691 kernel: BTRFS info (device sdb6): last unmount of filesystem 75efd3be-3398-4525-8f67-b36cc847539d Sep 4 00:59:17.924096 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 4 00:59:17.948364 ignition[1193]: INFO : Ignition 2.21.0 Sep 4 00:59:17.948364 ignition[1193]: INFO : Stage: mount Sep 4 00:59:17.959783 ignition[1193]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 00:59:17.959783 ignition[1193]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 4 00:59:17.959783 ignition[1193]: INFO : mount: mount passed Sep 4 00:59:17.959783 ignition[1193]: INFO : POST message to Packet Timeline Sep 4 00:59:17.959783 ignition[1193]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 4 00:59:17.955817 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 4 00:59:18.774450 coreos-metadata[1075]: Sep 04 00:59:18.774 INFO Fetch successful Sep 4 00:59:18.802803 coreos-metadata[1076]: Sep 04 00:59:18.802 INFO Fetch successful Sep 4 00:59:18.835510 ignition[1193]: INFO : GET result: OK Sep 4 00:59:18.851415 coreos-metadata[1075]: Sep 04 00:59:18.851 INFO wrote hostname ci-4372.1.0-n-d84d506c1c to /sysroot/etc/hostname Sep 4 00:59:18.852714 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 4 00:59:18.876050 systemd[1]: flatcar-static-network.service: Deactivated successfully. Sep 4 00:59:18.876093 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent. Sep 4 00:59:19.240423 ignition[1193]: INFO : Ignition finished successfully Sep 4 00:59:19.244306 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 4 00:59:19.261883 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 4 00:59:19.288094 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 4 00:59:19.334715 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sdb6 (8:22) scanned by mount (1217) Sep 4 00:59:19.352661 kernel: BTRFS info (device sdb6): first mount of filesystem 75efd3be-3398-4525-8f67-b36cc847539d Sep 4 00:59:19.352686 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Sep 4 00:59:19.368301 kernel: BTRFS info (device sdb6): enabling ssd optimizations Sep 4 00:59:19.368317 kernel: BTRFS info (device sdb6): turning on async discard Sep 4 00:59:19.374414 kernel: BTRFS info (device sdb6): enabling free space tree Sep 4 00:59:19.376207 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 4 00:59:19.406117 ignition[1234]: INFO : Ignition 2.21.0 Sep 4 00:59:19.406117 ignition[1234]: INFO : Stage: files Sep 4 00:59:19.419929 ignition[1234]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 00:59:19.419929 ignition[1234]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 4 00:59:19.419929 ignition[1234]: DEBUG : files: compiled without relabeling support, skipping Sep 4 00:59:19.419929 ignition[1234]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 4 00:59:19.419929 ignition[1234]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 4 00:59:19.419929 ignition[1234]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 4 00:59:19.419929 ignition[1234]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 4 00:59:19.419929 ignition[1234]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 4 00:59:19.419929 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 4 00:59:19.419929 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Sep 4 00:59:19.409613 unknown[1234]: wrote ssh authorized keys file for user: core Sep 4 00:59:19.542946 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 4 00:59:20.154581 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 4 00:59:20.170779 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Sep 4 00:59:20.833551 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 4 00:59:22.094113 ignition[1234]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 4 00:59:22.094113 ignition[1234]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 4 00:59:22.121961 ignition[1234]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 4 00:59:22.121961 ignition[1234]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 4 00:59:22.121961 ignition[1234]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 4 00:59:22.121961 ignition[1234]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 4 00:59:22.121961 ignition[1234]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 4 00:59:22.121961 ignition[1234]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 4 00:59:22.121961 ignition[1234]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 4 00:59:22.121961 ignition[1234]: INFO : files: files passed Sep 4 00:59:22.121961 ignition[1234]: INFO : POST message to Packet Timeline Sep 4 00:59:22.121961 ignition[1234]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 4 00:59:22.962297 ignition[1234]: INFO : GET result: OK Sep 4 00:59:23.361837 ignition[1234]: INFO : Ignition finished successfully Sep 4 00:59:23.366167 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 4 00:59:23.382997 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 4 00:59:23.390379 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 4 00:59:23.416114 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 4 00:59:23.416171 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 4 00:59:23.445019 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 4 00:59:23.464097 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 4 00:59:23.493835 initrd-setup-root-after-ignition[1271]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 00:59:23.493835 initrd-setup-root-after-ignition[1271]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 4 00:59:23.484664 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 4 00:59:23.541003 initrd-setup-root-after-ignition[1275]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 00:59:23.605766 systemd[1]: initrd-parse-etc.service: Deactivated successfully. 
Sep 4 00:59:23.605818 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 4 00:59:23.622962 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 4 00:59:23.633915 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 4 00:59:23.660073 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 4 00:59:23.661951 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 4 00:59:23.743456 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 4 00:59:23.757462 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 4 00:59:23.803993 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 4 00:59:23.814890 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 00:59:23.823986 systemd[1]: Stopped target timers.target - Timer Units. Sep 4 00:59:23.851070 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 4 00:59:23.851258 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 4 00:59:23.887162 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 4 00:59:23.896278 systemd[1]: Stopped target basic.target - Basic System. Sep 4 00:59:23.914284 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 4 00:59:23.931270 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 4 00:59:23.950258 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 4 00:59:23.969269 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 4 00:59:23.988287 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 4 00:59:24.006366 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 4 00:59:24.025419 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 4 00:59:24.044312 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 4 00:59:24.062248 systemd[1]: Stopped target swap.target - Swaps. Sep 4 00:59:24.079188 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 4 00:59:24.079588 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 4 00:59:24.113113 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 4 00:59:24.122314 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 00:59:24.141158 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 4 00:59:24.141576 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 00:59:24.161112 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 4 00:59:24.161510 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 4 00:59:24.191325 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 4 00:59:24.191804 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 4 00:59:24.209443 systemd[1]: Stopped target paths.target - Path Units. Sep 4 00:59:24.225155 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 4 00:59:24.228985 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 00:59:24.245271 systemd[1]: Stopped target slices.target - Slice Units. 
Sep 4 00:59:24.262265 systemd[1]: Stopped target sockets.target - Socket Units. Sep 4 00:59:24.279245 systemd[1]: iscsid.socket: Deactivated successfully. Sep 4 00:59:24.279523 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 4 00:59:24.297299 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 4 00:59:24.297587 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 4 00:59:24.318381 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 4 00:59:24.318805 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 4 00:59:24.335355 systemd[1]: ignition-files.service: Deactivated successfully. Sep 4 00:59:24.452884 ignition[1296]: INFO : Ignition 2.21.0 Sep 4 00:59:24.452884 ignition[1296]: INFO : Stage: umount Sep 4 00:59:24.452884 ignition[1296]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 00:59:24.452884 ignition[1296]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 4 00:59:24.452884 ignition[1296]: INFO : umount: umount passed Sep 4 00:59:24.452884 ignition[1296]: INFO : POST message to Packet Timeline Sep 4 00:59:24.452884 ignition[1296]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 4 00:59:24.335769 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 4 00:59:24.351259 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 4 00:59:24.351625 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 4 00:59:24.370718 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 4 00:59:24.382896 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 4 00:59:24.382970 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 00:59:24.411469 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 4 00:59:24.419869 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 4 00:59:24.420027 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 00:59:24.445903 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 4 00:59:24.445978 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 4 00:59:24.485042 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 4 00:59:24.485787 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 4 00:59:24.485859 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 4 00:59:24.498742 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 4 00:59:24.498999 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 4 00:59:25.332525 ignition[1296]: INFO : GET result: OK Sep 4 00:59:25.715099 ignition[1296]: INFO : Ignition finished successfully Sep 4 00:59:25.718879 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 4 00:59:25.719169 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 4 00:59:25.732789 systemd[1]: Stopped target network.target - Network. Sep 4 00:59:25.745989 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 4 00:59:25.746165 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 4 00:59:25.763005 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 4 00:59:25.763152 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 4 00:59:25.779085 systemd[1]: ignition-setup.service: Deactivated successfully. 
Sep 4 00:59:25.779269 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 4 00:59:25.797089 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 4 00:59:25.797258 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 4 00:59:25.815060 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 4 00:59:25.815250 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 4 00:59:25.833454 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 4 00:59:25.850127 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 4 00:59:25.868823 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 4 00:59:25.869101 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 4 00:59:25.891090 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 4 00:59:25.891650 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 4 00:59:25.891941 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 4 00:59:25.907579 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 4 00:59:25.909750 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 4 00:59:25.922129 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 4 00:59:25.922253 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 4 00:59:25.944304 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 4 00:59:25.957869 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 4 00:59:25.958113 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 4 00:59:25.967280 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 4 00:59:25.967430 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 4 00:59:25.994368 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 4 00:59:25.994503 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 4 00:59:26.012153 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 4 00:59:26.012334 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 00:59:26.031291 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 00:59:26.054253 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 4 00:59:26.054441 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 4 00:59:26.055516 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 4 00:59:26.055903 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 00:59:26.073200 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 4 00:59:26.073371 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 4 00:59:26.078990 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 4 00:59:26.079009 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 00:59:26.106939 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 4 00:59:26.106979 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 4 00:59:26.150887 systemd[1]: dracut-cmdline.service: Deactivated successfully. 
Sep 4 00:59:26.151034 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 4 00:59:26.187869 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 4 00:59:26.188026 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 4 00:59:26.227898 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 4 00:59:26.253833 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 4 00:59:26.254003 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 00:59:26.264355 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 4 00:59:26.264512 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 4 00:59:26.516841 systemd-journald[297]: Received SIGTERM from PID 1 (systemd). Sep 4 00:59:26.296333 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 00:59:26.296472 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 00:59:26.320574 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 4 00:59:26.320756 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 4 00:59:26.320885 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 4 00:59:26.322222 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 4 00:59:26.322533 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 4 00:59:26.387415 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 4 00:59:26.387724 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 4 00:59:26.398957 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 4 00:59:26.419175 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 4 00:59:26.467846 systemd[1]: Switching root. Sep 4 00:59:26.628906 systemd-journald[297]: Journal stopped Sep 4 00:59:28.383005 kernel: SELinux: policy capability network_peer_controls=1 Sep 4 00:59:28.383020 kernel: SELinux: policy capability open_perms=1 Sep 4 00:59:28.383028 kernel: SELinux: policy capability extended_socket_class=1 Sep 4 00:59:28.383034 kernel: SELinux: policy capability always_check_network=0 Sep 4 00:59:28.383039 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 4 00:59:28.383044 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 4 00:59:28.383050 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 4 00:59:28.383055 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 4 00:59:28.383060 kernel: SELinux: policy capability userspace_initial_context=0 Sep 4 00:59:28.383067 kernel: audit: type=1403 audit(1756947566.777:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 4 00:59:28.383073 systemd[1]: Successfully loaded SELinux policy in 85.870ms. Sep 4 00:59:28.383080 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.350ms. 
Sep 4 00:59:28.383087 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 4 00:59:28.383093 systemd[1]: Detected architecture x86-64. Sep 4 00:59:28.383101 systemd[1]: Detected first boot. Sep 4 00:59:28.383107 systemd[1]: Hostname set to . Sep 4 00:59:28.383113 systemd[1]: Initializing machine ID from random generator. Sep 4 00:59:28.383119 zram_generator::config[1349]: No configuration found. Sep 4 00:59:28.383126 systemd[1]: Populated /etc with preset unit settings. Sep 4 00:59:28.383132 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 4 00:59:28.383140 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 4 00:59:28.383146 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 4 00:59:28.383152 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 4 00:59:28.383158 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 4 00:59:28.383164 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 4 00:59:28.383171 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 4 00:59:28.383177 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 4 00:59:28.383184 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 4 00:59:28.383191 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 4 00:59:28.383197 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 4 00:59:28.383204 systemd[1]: Created slice user.slice - User and Session Slice. Sep 4 00:59:28.383210 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 00:59:28.383216 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 00:59:28.383224 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 4 00:59:28.383230 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 4 00:59:28.383238 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 4 00:59:28.383245 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 4 00:59:28.383251 systemd[1]: Expecting device dev-ttyS1.device - /dev/ttyS1... Sep 4 00:59:28.383257 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 00:59:28.383263 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 4 00:59:28.383271 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 4 00:59:28.383277 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 4 00:59:28.383284 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 4 00:59:28.383291 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 4 00:59:28.383298 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. 
Sep 4 00:59:28.383304 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 4 00:59:28.383311 systemd[1]: Reached target slices.target - Slice Units. Sep 4 00:59:28.383317 systemd[1]: Reached target swap.target - Swaps. Sep 4 00:59:28.383323 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 4 00:59:28.383330 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 4 00:59:28.383336 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 4 00:59:28.383344 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 4 00:59:28.383350 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 4 00:59:28.383357 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 00:59:28.383363 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 4 00:59:28.383370 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 4 00:59:28.383378 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 4 00:59:28.383384 systemd[1]: Mounting media.mount - External Media Directory... Sep 4 00:59:28.383391 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 00:59:28.383398 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 4 00:59:28.383404 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 4 00:59:28.383410 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 4 00:59:28.383417 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 4 00:59:28.383424 systemd[1]: Reached target machines.target - Containers. Sep 4 00:59:28.383431 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 4 00:59:28.383438 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 4 00:59:28.383445 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 4 00:59:28.383451 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 4 00:59:28.383458 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 4 00:59:28.383464 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 4 00:59:28.383471 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 4 00:59:28.383477 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 4 00:59:28.383484 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 4 00:59:28.383491 kernel: ACPI: bus type drm_connector registered Sep 4 00:59:28.383497 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 4 00:59:28.383505 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 4 00:59:28.383511 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 4 00:59:28.383518 kernel: loop: module loaded Sep 4 00:59:28.383524 kernel: fuse: init (API version 7.41) Sep 4 00:59:28.383530 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. 
Sep 4 00:59:28.383536 systemd[1]: Stopped systemd-fsck-usr.service. Sep 4 00:59:28.383544 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 00:59:28.383551 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 4 00:59:28.383557 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 4 00:59:28.383573 systemd-journald[1452]: Collecting audit messages is disabled. Sep 4 00:59:28.383589 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 4 00:59:28.383596 systemd-journald[1452]: Journal started Sep 4 00:59:28.383610 systemd-journald[1452]: Runtime Journal (/run/log/journal/167739f99d2b4d0e8c5bb786f62bc46c) is 8M, max 640.1M, 632.1M free. Sep 4 00:59:27.258786 systemd[1]: Queued start job for default target multi-user.target. Sep 4 00:59:27.274599 systemd[1]: Unnecessary job was removed for dev-sdb6.device - /dev/sdb6. Sep 4 00:59:27.274864 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 4 00:59:28.413727 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 4 00:59:28.434723 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 4 00:59:28.454706 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 4 00:59:28.474878 systemd[1]: verity-setup.service: Deactivated successfully. Sep 4 00:59:28.474908 systemd[1]: Stopped verity-setup.service. Sep 4 00:59:28.498731 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 00:59:28.506712 systemd[1]: Started systemd-journald.service - Journal Service. Sep 4 00:59:28.515158 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 4 00:59:28.523986 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 4 00:59:28.532967 systemd[1]: Mounted media.mount - External Media Directory. Sep 4 00:59:28.541957 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 4 00:59:28.550955 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 4 00:59:28.559949 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 4 00:59:28.569017 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 4 00:59:28.579033 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 00:59:28.589042 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 4 00:59:28.589197 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 4 00:59:28.599133 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 00:59:28.599336 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 00:59:28.610255 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 4 00:59:28.610533 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 4 00:59:28.620532 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 4 00:59:28.621004 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 4 00:59:28.632629 systemd[1]: modprobe@fuse.service: Deactivated successfully. 
Sep 4 00:59:28.633345 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 4 00:59:28.642770 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 4 00:59:28.643251 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 4 00:59:28.652756 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 4 00:59:28.663665 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 00:59:28.674668 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 4 00:59:28.685684 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 4 00:59:28.696708 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 00:59:28.731324 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 4 00:59:28.745330 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 4 00:59:28.769339 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 4 00:59:28.778924 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 4 00:59:28.778942 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 4 00:59:28.779515 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 4 00:59:28.800562 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 4 00:59:28.809259 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 4 00:59:28.820601 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 4 00:59:28.842936 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 4 00:59:28.852907 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 4 00:59:28.867979 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 4 00:59:28.870489 systemd-journald[1452]: Time spent on flushing to /var/log/journal/167739f99d2b4d0e8c5bb786f62bc46c is 12.066ms for 1392 entries. Sep 4 00:59:28.870489 systemd-journald[1452]: System Journal (/var/log/journal/167739f99d2b4d0e8c5bb786f62bc46c) is 8M, max 195.6M, 187.6M free. Sep 4 00:59:28.893325 systemd-journald[1452]: Received client request to flush runtime journal. Sep 4 00:59:28.884784 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 4 00:59:28.894248 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 4 00:59:28.903418 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 4 00:59:28.915381 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 4 00:59:28.927733 kernel: loop0: detected capacity change from 0 to 229808 Sep 4 00:59:28.930992 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 4 00:59:28.942313 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 4 00:59:28.948678 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 4 00:59:28.958952 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. 
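The Runtime Journal and System Journal figures above come from journald's default size accounting (a percentage of the backing filesystem, which is where the 640.1M and 195.6M caps originate). A minimal sketch of how those caps could be tuned, assuming only the standard journald.conf(5) options and a hypothetical drop-in name, not anything configured on this host:

    # /etc/systemd/journald.conf.d/10-size.conf  (hypothetical drop-in)
    [Journal]
    Storage=persistent      # keep the journal under /var/log/journal across reboots
    RuntimeMaxUse=256M      # cap the volatile journal in /run/log/journal
    SystemMaxUse=1G         # cap the persistent journal on disk

Restarting systemd-journald (or rebooting) applies the new limits; the flush to /var/log/journal seen above is what moves the early boot entries into the persistent store.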
Sep 4 00:59:28.968935 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 4 00:59:28.978898 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 4 00:59:28.990714 kernel: loop1: detected capacity change from 0 to 8 Sep 4 00:59:28.992924 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 4 00:59:29.003246 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 4 00:59:29.014466 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 4 00:59:29.026705 kernel: loop2: detected capacity change from 0 to 146240 Sep 4 00:59:29.040963 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 4 00:59:29.058661 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 4 00:59:29.059070 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 4 00:59:29.073873 systemd-tmpfiles[1503]: ACLs are not supported, ignoring. Sep 4 00:59:29.073882 systemd-tmpfiles[1503]: ACLs are not supported, ignoring. Sep 4 00:59:29.076175 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 4 00:59:29.094728 kernel: loop3: detected capacity change from 0 to 113872 Sep 4 00:59:29.141681 kernel: loop4: detected capacity change from 0 to 229808 Sep 4 00:59:29.163722 kernel: loop5: detected capacity change from 0 to 8 Sep 4 00:59:29.170676 kernel: loop6: detected capacity change from 0 to 146240 Sep 4 00:59:29.193713 kernel: loop7: detected capacity change from 0 to 113872 Sep 4 00:59:29.203681 (sd-merge)[1509]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'. Sep 4 00:59:29.203926 (sd-merge)[1509]: Merged extensions into '/usr'. Sep 4 00:59:29.206547 systemd[1]: Reload requested from client PID 1488 ('systemd-sysext') (unit systemd-sysext.service)... Sep 4 00:59:29.206554 systemd[1]: Reloading... Sep 4 00:59:29.234683 zram_generator::config[1535]: No configuration found. Sep 4 00:59:29.243619 ldconfig[1482]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 4 00:59:29.293489 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 4 00:59:29.354885 systemd[1]: Reloading finished in 148 ms. Sep 4 00:59:29.373521 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 4 00:59:29.382746 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 4 00:59:29.393725 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 4 00:59:29.428963 systemd[1]: Starting ensure-sysext.service... Sep 4 00:59:29.435356 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 4 00:59:29.446436 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 00:59:29.453580 systemd-tmpfiles[1593]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 4 00:59:29.453599 systemd-tmpfiles[1593]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 4 00:59:29.453762 systemd-tmpfiles[1593]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
Sep 4 00:59:29.453927 systemd-tmpfiles[1593]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 4 00:59:29.454442 systemd-tmpfiles[1593]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 4 00:59:29.454617 systemd-tmpfiles[1593]: ACLs are not supported, ignoring. Sep 4 00:59:29.454654 systemd-tmpfiles[1593]: ACLs are not supported, ignoring. Sep 4 00:59:29.456968 systemd-tmpfiles[1593]: Detected autofs mount point /boot during canonicalization of boot. Sep 4 00:59:29.456973 systemd-tmpfiles[1593]: Skipping /boot Sep 4 00:59:29.461156 systemd[1]: Reload requested from client PID 1592 ('systemctl') (unit ensure-sysext.service)... Sep 4 00:59:29.461164 systemd[1]: Reloading... Sep 4 00:59:29.463212 systemd-tmpfiles[1593]: Detected autofs mount point /boot during canonicalization of boot. Sep 4 00:59:29.463217 systemd-tmpfiles[1593]: Skipping /boot Sep 4 00:59:29.474982 systemd-udevd[1594]: Using default interface naming scheme 'v255'. Sep 4 00:59:29.486742 zram_generator::config[1621]: No configuration found. Sep 4 00:59:29.536992 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2 Sep 4 00:59:29.537062 kernel: ACPI: button: Sleep Button [SLPB] Sep 4 00:59:29.544748 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 4 00:59:29.553679 kernel: ACPI: button: Power Button [PWRF] Sep 4 00:59:29.563680 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface Sep 4 00:59:29.578804 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface Sep 4 00:59:29.580278 kernel: mousedev: PS/2 mouse device common for all mice Sep 4 00:59:29.580295 kernel: IPMI message handler: version 39.2 Sep 4 00:59:29.565132 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 4 00:59:29.595689 kernel: ipmi device interface Sep 4 00:59:29.605687 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set Sep 4 00:59:29.612683 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt Sep 4 00:59:29.645519 kernel: ipmi_si: IPMI System Interface driver Sep 4 00:59:29.645577 kernel: MACsec IEEE 802.1AE Sep 4 00:59:29.645595 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS Sep 4 00:59:29.656087 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0 Sep 4 00:59:29.656677 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine Sep 4 00:59:29.668765 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI Sep 4 00:59:29.677051 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0 Sep 4 00:59:29.677675 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI Sep 4 00:59:29.682252 systemd[1]: Condition check resulted in dev-ttyS1.device - /dev/ttyS1 being skipped. Sep 4 00:59:29.682336 systemd[1]: Reloading finished in 220 ms. Sep 4 00:59:29.692274 kernel: ipmi_si: Adding ACPI-specified kcs state machine Sep 4 00:59:29.702469 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Sep 4 00:59:29.715933 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. 
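The repeated docker.socket warning above ("ListenStream= references a path below legacy directory /var/run/") is informational: systemd rewrites the path to /run/docker.sock at load time. A drop-in that would update the unit itself and silence the warning, sketched with a hypothetical drop-in file name:

    # /etc/systemd/system/docker.socket.d/10-run-path.conf  (hypothetical)
    [Socket]
    ListenStream=
    ListenStream=/run/docker.sock

The empty ListenStream= line first clears the inherited value before adding the new path, which is the usual pattern for overriding list-type settings in systemd drop-ins.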
Sep 4 00:59:29.719675 kernel: iTCO_vendor_support: vendor-support=0 Sep 4 00:59:29.756709 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400) Sep 4 00:59:29.761730 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 00:59:29.764679 kernel: iTCO_wdt iTCO_wdt: initialized. heartbeat=30 sec (nowayout=0) Sep 4 00:59:29.785387 systemd[1]: Finished ensure-sysext.service. Sep 4 00:59:29.787677 kernel: intel_rapl_common: Found RAPL domain package Sep 4 00:59:29.787718 kernel: intel_rapl_common: Found RAPL domain core Sep 4 00:59:29.787730 kernel: intel_rapl_common: Found RAPL domain dram Sep 4 00:59:29.787740 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. Sep 4 00:59:29.842792 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20) Sep 4 00:59:29.875425 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM. Sep 4 00:59:29.887067 systemd[1]: Reached target tpm2.target - Trusted Platform Module. Sep 4 00:59:29.895763 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 00:59:29.896465 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 4 00:59:29.911206 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 4 00:59:29.924676 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Sep 4 00:59:29.931734 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 4 00:59:29.932676 kernel: ipmi_ssif: IPMI SSIF Interface driver Sep 4 00:59:29.932772 augenrules[1816]: No rules Sep 4 00:59:29.939903 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 4 00:59:29.957860 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 4 00:59:29.973823 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 4 00:59:29.998825 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 4 00:59:30.007778 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 4 00:59:30.008270 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 4 00:59:30.017706 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 00:59:30.018262 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 4 00:59:30.028562 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 4 00:59:30.029457 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 4 00:59:30.030346 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 4 00:59:30.044376 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 4 00:59:30.053331 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 4 00:59:30.070713 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 00:59:30.071280 systemd[1]: audit-rules.service: Deactivated successfully. Sep 4 00:59:30.074771 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 4 00:59:30.084943 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 4 00:59:30.085117 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 00:59:30.085244 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 00:59:30.085396 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 4 00:59:30.085484 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 4 00:59:30.085624 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 4 00:59:30.085718 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 4 00:59:30.085862 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 4 00:59:30.085947 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 4 00:59:30.086100 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 4 00:59:30.086259 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 4 00:59:30.089999 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 4 00:59:30.090060 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 4 00:59:30.090737 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 4 00:59:30.091558 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 4 00:59:30.091582 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 4 00:59:30.091799 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 4 00:59:30.097261 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 4 00:59:30.114651 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 4 00:59:30.155972 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 4 00:59:30.160150 systemd-resolved[1829]: Positive Trust Anchors: Sep 4 00:59:30.160156 systemd-resolved[1829]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 4 00:59:30.160178 systemd-resolved[1829]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 4 00:59:30.162744 systemd-resolved[1829]: Using system hostname 'ci-4372.1.0-n-d84d506c1c'. 
Sep 4 00:59:30.165117 systemd-networkd[1828]: lo: Link UP Sep 4 00:59:30.165119 systemd-networkd[1828]: lo: Gained carrier Sep 4 00:59:30.167684 systemd-networkd[1828]: bond0: netdev ready Sep 4 00:59:30.168642 systemd-networkd[1828]: Enumeration completed Sep 4 00:59:30.180043 systemd-networkd[1828]: enp1s0f0np0: Configuring with /etc/systemd/network/10-1c:34:da:42:bc:24.network. Sep 4 00:59:30.198042 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 4 00:59:30.206735 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 4 00:59:30.215877 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 00:59:30.226635 systemd[1]: Reached target network.target - Network. Sep 4 00:59:30.233740 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 4 00:59:30.243748 systemd[1]: Reached target sysinit.target - System Initialization. Sep 4 00:59:30.252791 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 4 00:59:30.262760 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 4 00:59:30.272746 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 4 00:59:30.282753 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 4 00:59:30.292744 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 4 00:59:30.292763 systemd[1]: Reached target paths.target - Path Units. Sep 4 00:59:30.299750 systemd[1]: Reached target time-set.target - System Time Set. Sep 4 00:59:30.308834 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 4 00:59:30.317794 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 4 00:59:30.327745 systemd[1]: Reached target timers.target - Timer Units. Sep 4 00:59:30.335332 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 4 00:59:30.344388 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 4 00:59:30.352925 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 4 00:59:30.363660 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 4 00:59:30.371947 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 4 00:59:30.382440 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 4 00:59:30.393312 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 4 00:59:30.404022 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 4 00:59:30.413303 systemd[1]: Reached target sockets.target - Socket Units. Sep 4 00:59:30.421798 systemd[1]: Reached target basic.target - Basic System. Sep 4 00:59:30.430814 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 4 00:59:30.430831 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 4 00:59:30.431394 systemd[1]: Starting containerd.service - containerd container runtime... Sep 4 00:59:30.448155 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... 
Sep 4 00:59:30.457272 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 4 00:59:30.465296 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 4 00:59:30.469898 coreos-metadata[1867]: Sep 04 00:59:30.469 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 4 00:59:30.470640 coreos-metadata[1867]: Sep 04 00:59:30.470 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata) Sep 4 00:59:30.475350 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 4 00:59:30.484320 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 4 00:59:30.486050 jq[1873]: false Sep 4 00:59:30.492787 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 4 00:59:30.493367 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 4 00:59:30.497494 extend-filesystems[1874]: Found /dev/sdb6 Sep 4 00:59:30.501816 extend-filesystems[1874]: Found /dev/sdb9 Sep 4 00:59:30.501816 extend-filesystems[1874]: Checking size of /dev/sdb9 Sep 4 00:59:30.526841 kernel: EXT4-fs (sdb9): resizing filesystem from 553472 to 116605649 blocks Sep 4 00:59:30.525483 oslogin_cache_refresh[1875]: Refreshing passwd entry cache Sep 4 00:59:30.502351 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 4 00:59:30.527089 extend-filesystems[1874]: Resized partition /dev/sdb9 Sep 4 00:59:30.526579 oslogin_cache_refresh[1875]: Failure getting users, quitting Sep 4 00:59:30.515300 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 4 00:59:30.547854 extend-filesystems[1886]: resize2fs 1.47.2 (1-Jan-2025) Sep 4 00:59:30.564770 google_oslogin_nss_cache[1875]: oslogin_cache_refresh[1875]: Refreshing passwd entry cache Sep 4 00:59:30.564770 google_oslogin_nss_cache[1875]: oslogin_cache_refresh[1875]: Failure getting users, quitting Sep 4 00:59:30.564770 google_oslogin_nss_cache[1875]: oslogin_cache_refresh[1875]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 4 00:59:30.564770 google_oslogin_nss_cache[1875]: oslogin_cache_refresh[1875]: Refreshing group entry cache Sep 4 00:59:30.564770 google_oslogin_nss_cache[1875]: oslogin_cache_refresh[1875]: Failure getting groups, quitting Sep 4 00:59:30.564770 google_oslogin_nss_cache[1875]: oslogin_cache_refresh[1875]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 4 00:59:30.526586 oslogin_cache_refresh[1875]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 4 00:59:30.541362 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 4 00:59:30.526605 oslogin_cache_refresh[1875]: Refreshing group entry cache Sep 4 00:59:30.548333 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 4 00:59:30.526911 oslogin_cache_refresh[1875]: Failure getting groups, quitting Sep 4 00:59:30.526917 oslogin_cache_refresh[1875]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 4 00:59:30.575056 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 4 00:59:30.584722 systemd[1]: Starting tcsd.service - TCG Core Services Daemon... Sep 4 00:59:30.592977 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. 
Sep 4 00:59:30.593308 systemd[1]: Starting update-engine.service - Update Engine... Sep 4 00:59:30.601200 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 4 00:59:30.607789 update_engine[1905]: I20250904 00:59:30.607721 1905 main.cc:92] Flatcar Update Engine starting Sep 4 00:59:30.610820 systemd-logind[1900]: Watching system buttons on /dev/input/event3 (Power Button) Sep 4 00:59:30.610831 systemd-logind[1900]: Watching system buttons on /dev/input/event2 (Sleep Button) Sep 4 00:59:30.610841 systemd-logind[1900]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Sep 4 00:59:30.611081 systemd-logind[1900]: New seat seat0. Sep 4 00:59:30.612063 systemd[1]: Started systemd-logind.service - User Login Management. Sep 4 00:59:30.612929 jq[1906]: true Sep 4 00:59:30.620944 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 4 00:59:30.630936 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 4 00:59:30.631079 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 4 00:59:30.631263 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 4 00:59:30.631399 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 4 00:59:30.639882 systemd[1]: motdgen.service: Deactivated successfully. Sep 4 00:59:30.640025 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 4 00:59:30.649261 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 4 00:59:30.649402 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 4 00:59:30.668482 jq[1910]: true Sep 4 00:59:30.668889 (ntainerd)[1911]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 4 00:59:30.678495 tar[1908]: linux-amd64/LICENSE Sep 4 00:59:30.678622 tar[1908]: linux-amd64/helm Sep 4 00:59:30.685851 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. Sep 4 00:59:30.685972 systemd[1]: Condition check resulted in tcsd.service - TCG Core Services Daemon being skipped. Sep 4 00:59:30.711468 dbus-daemon[1868]: [system] SELinux support is enabled Sep 4 00:59:30.711606 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 4 00:59:30.712306 bash[1937]: Updated "/home/core/.ssh/authorized_keys" Sep 4 00:59:30.713185 update_engine[1905]: I20250904 00:59:30.713121 1905 update_check_scheduler.cc:74] Next update check in 4m33s Sep 4 00:59:30.721301 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 4 00:59:30.733727 dbus-daemon[1868]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 4 00:59:30.734101 systemd[1]: Starting sshkeys.service... Sep 4 00:59:30.739747 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 4 00:59:30.739766 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 4 00:59:30.749746 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). 
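update-engine starts here with Flatcar's defaults, and the reboot strategy that locksmithd reports a little further down is normally driven by /etc/flatcar/update.conf. A sketch of that file with illustrative values only, not this machine's actual settings:

    # /etc/flatcar/update.conf  (values are examples)
    GROUP=stable
    REBOOT_STRATEGY=reboot    # other documented values: etcd-lock, best-effort, off

Changing REBOOT_STRATEGY and restarting locksmithd changes how the node reboots after update-engine stages a new /usr partition; it does not affect whether updates are downloaded.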
Sep 4 00:59:30.749761 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 4 00:59:30.762993 systemd[1]: Started update-engine.service - Update Engine. Sep 4 00:59:30.773448 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 4 00:59:30.780330 sshd_keygen[1903]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 4 00:59:30.785588 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 4 00:59:30.816030 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 4 00:59:30.823875 containerd[1911]: time="2025-09-04T00:59:30Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 4 00:59:30.824211 containerd[1911]: time="2025-09-04T00:59:30.824192339Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Sep 4 00:59:30.826645 coreos-metadata[1954]: Sep 04 00:59:30.826 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 4 00:59:30.827348 coreos-metadata[1954]: Sep 04 00:59:30.827 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata) Sep 4 00:59:30.829782 containerd[1911]: time="2025-09-04T00:59:30.829761539Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.829µs" Sep 4 00:59:30.829805 containerd[1911]: time="2025-09-04T00:59:30.829781440Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 4 00:59:30.829805 containerd[1911]: time="2025-09-04T00:59:30.829793137Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 4 00:59:30.829884 containerd[1911]: time="2025-09-04T00:59:30.829874933Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 4 00:59:30.829901 containerd[1911]: time="2025-09-04T00:59:30.829886708Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 4 00:59:30.829919 containerd[1911]: time="2025-09-04T00:59:30.829908298Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 4 00:59:30.829949 containerd[1911]: time="2025-09-04T00:59:30.829940886Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 4 00:59:30.829967 containerd[1911]: time="2025-09-04T00:59:30.829948761Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 4 00:59:30.830089 containerd[1911]: time="2025-09-04T00:59:30.830078847Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 4 00:59:30.830106 containerd[1911]: time="2025-09-04T00:59:30.830090076Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 4 00:59:30.830106 containerd[1911]: time="2025-09-04T00:59:30.830098642Z" level=info msg="skip loading plugin" error="devmapper not configured: skip 
plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 4 00:59:30.830106 containerd[1911]: time="2025-09-04T00:59:30.830103437Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 4 00:59:30.830148 containerd[1911]: time="2025-09-04T00:59:30.830140900Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 4 00:59:30.830423 containerd[1911]: time="2025-09-04T00:59:30.830405882Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 4 00:59:30.830451 containerd[1911]: time="2025-09-04T00:59:30.830441978Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 4 00:59:30.830474 containerd[1911]: time="2025-09-04T00:59:30.830453822Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 4 00:59:30.830495 containerd[1911]: time="2025-09-04T00:59:30.830470873Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 4 00:59:30.830963 containerd[1911]: time="2025-09-04T00:59:30.830946979Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 4 00:59:30.831010 containerd[1911]: time="2025-09-04T00:59:30.831001548Z" level=info msg="metadata content store policy set" policy=shared Sep 4 00:59:30.834800 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 4 00:59:30.841159 containerd[1911]: time="2025-09-04T00:59:30.841143049Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 4 00:59:30.841189 containerd[1911]: time="2025-09-04T00:59:30.841170018Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 4 00:59:30.841189 containerd[1911]: time="2025-09-04T00:59:30.841179208Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 4 00:59:30.841216 containerd[1911]: time="2025-09-04T00:59:30.841192731Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 4 00:59:30.841216 containerd[1911]: time="2025-09-04T00:59:30.841200461Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 4 00:59:30.841216 containerd[1911]: time="2025-09-04T00:59:30.841206732Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 4 00:59:30.841216 containerd[1911]: time="2025-09-04T00:59:30.841213357Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 4 00:59:30.841278 containerd[1911]: time="2025-09-04T00:59:30.841220377Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 4 00:59:30.841278 containerd[1911]: time="2025-09-04T00:59:30.841226828Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 4 00:59:30.841278 containerd[1911]: time="2025-09-04T00:59:30.841232549Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 4 00:59:30.841278 containerd[1911]: 
time="2025-09-04T00:59:30.841241221Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 4 00:59:30.841278 containerd[1911]: time="2025-09-04T00:59:30.841250698Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 4 00:59:30.841346 containerd[1911]: time="2025-09-04T00:59:30.841320892Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 4 00:59:30.841346 containerd[1911]: time="2025-09-04T00:59:30.841342190Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 4 00:59:30.841372 containerd[1911]: time="2025-09-04T00:59:30.841351319Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 4 00:59:30.841372 containerd[1911]: time="2025-09-04T00:59:30.841357440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 4 00:59:30.841372 containerd[1911]: time="2025-09-04T00:59:30.841363231Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 4 00:59:30.841412 containerd[1911]: time="2025-09-04T00:59:30.841374830Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 4 00:59:30.841412 containerd[1911]: time="2025-09-04T00:59:30.841382028Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 4 00:59:30.841412 containerd[1911]: time="2025-09-04T00:59:30.841387714Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 4 00:59:30.841412 containerd[1911]: time="2025-09-04T00:59:30.841394196Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 4 00:59:30.841412 containerd[1911]: time="2025-09-04T00:59:30.841400509Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 4 00:59:30.841481 containerd[1911]: time="2025-09-04T00:59:30.841409930Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 4 00:59:30.841481 containerd[1911]: time="2025-09-04T00:59:30.841454103Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 4 00:59:30.841481 containerd[1911]: time="2025-09-04T00:59:30.841462584Z" level=info msg="Start snapshots syncer" Sep 4 00:59:30.841481 containerd[1911]: time="2025-09-04T00:59:30.841475359Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 4 00:59:30.841638 containerd[1911]: time="2025-09-04T00:59:30.841619881Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 4 00:59:30.841709 containerd[1911]: time="2025-09-04T00:59:30.841648448Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 4 00:59:30.841709 containerd[1911]: time="2025-09-04T00:59:30.841695993Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 4 00:59:30.841755 containerd[1911]: time="2025-09-04T00:59:30.841746359Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 4 00:59:30.841776 containerd[1911]: time="2025-09-04T00:59:30.841759230Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 4 00:59:30.841776 containerd[1911]: time="2025-09-04T00:59:30.841766828Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 4 00:59:30.841776 containerd[1911]: time="2025-09-04T00:59:30.841774383Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 4 00:59:30.841818 containerd[1911]: time="2025-09-04T00:59:30.841781378Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 4 00:59:30.841818 containerd[1911]: time="2025-09-04T00:59:30.841787733Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 4 00:59:30.841818 containerd[1911]: time="2025-09-04T00:59:30.841794008Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 4 00:59:30.841818 containerd[1911]: time="2025-09-04T00:59:30.841806326Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 4 00:59:30.841818 containerd[1911]: 
time="2025-09-04T00:59:30.841812380Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 4 00:59:30.841882 containerd[1911]: time="2025-09-04T00:59:30.841817975Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 4 00:59:30.841882 containerd[1911]: time="2025-09-04T00:59:30.841835014Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 4 00:59:30.841882 containerd[1911]: time="2025-09-04T00:59:30.841846660Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 4 00:59:30.841882 containerd[1911]: time="2025-09-04T00:59:30.841852478Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 4 00:59:30.841882 containerd[1911]: time="2025-09-04T00:59:30.841857698Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 4 00:59:30.841882 containerd[1911]: time="2025-09-04T00:59:30.841865804Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 4 00:59:30.841882 containerd[1911]: time="2025-09-04T00:59:30.841870873Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 4 00:59:30.841882 containerd[1911]: time="2025-09-04T00:59:30.841876479Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 4 00:59:30.841987 containerd[1911]: time="2025-09-04T00:59:30.841886821Z" level=info msg="runtime interface created" Sep 4 00:59:30.841987 containerd[1911]: time="2025-09-04T00:59:30.841890256Z" level=info msg="created NRI interface" Sep 4 00:59:30.841987 containerd[1911]: time="2025-09-04T00:59:30.841894566Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 4 00:59:30.841987 containerd[1911]: time="2025-09-04T00:59:30.841902364Z" level=info msg="Connect containerd service" Sep 4 00:59:30.841987 containerd[1911]: time="2025-09-04T00:59:30.841917060Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 4 00:59:30.842456 containerd[1911]: time="2025-09-04T00:59:30.842436391Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 4 00:59:30.848868 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 4 00:59:30.855080 locksmithd[1957]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 4 00:59:30.870066 systemd[1]: issuegen.service: Deactivated successfully. Sep 4 00:59:30.870191 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 4 00:59:30.880079 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 4 00:59:30.906544 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 4 00:59:30.917900 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 4 00:59:30.928632 systemd[1]: Started serial-getty@ttyS1.service - Serial Getty on ttyS1. Sep 4 00:59:30.937915 systemd[1]: Reached target getty.target - Login Prompts. 
Sep 4 00:59:30.957005 containerd[1911]: time="2025-09-04T00:59:30.956980681Z" level=info msg="Start subscribing containerd event" Sep 4 00:59:30.957072 containerd[1911]: time="2025-09-04T00:59:30.957007841Z" level=info msg="Start recovering state" Sep 4 00:59:30.957072 containerd[1911]: time="2025-09-04T00:59:30.957055508Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 4 00:59:30.957072 containerd[1911]: time="2025-09-04T00:59:30.957068863Z" level=info msg="Start event monitor" Sep 4 00:59:30.957152 containerd[1911]: time="2025-09-04T00:59:30.957079922Z" level=info msg="Start cni network conf syncer for default" Sep 4 00:59:30.957152 containerd[1911]: time="2025-09-04T00:59:30.957086138Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 4 00:59:30.957152 containerd[1911]: time="2025-09-04T00:59:30.957089409Z" level=info msg="Start streaming server" Sep 4 00:59:30.957152 containerd[1911]: time="2025-09-04T00:59:30.957098501Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 4 00:59:30.957152 containerd[1911]: time="2025-09-04T00:59:30.957102868Z" level=info msg="runtime interface starting up..." Sep 4 00:59:30.957152 containerd[1911]: time="2025-09-04T00:59:30.957105680Z" level=info msg="starting plugins..." Sep 4 00:59:30.957152 containerd[1911]: time="2025-09-04T00:59:30.957114529Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 4 00:59:30.957305 containerd[1911]: time="2025-09-04T00:59:30.957184790Z" level=info msg="containerd successfully booted in 0.133519s" Sep 4 00:59:30.957233 systemd[1]: Started containerd.service - containerd container runtime. Sep 4 00:59:30.960048 tar[1908]: linux-amd64/README.md Sep 4 00:59:30.983742 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 4 00:59:31.002711 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Sep 4 00:59:31.014711 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link Sep 4 00:59:31.015200 systemd-networkd[1828]: enp1s0f1np1: Configuring with /etc/systemd/network/10-1c:34:da:42:bc:25.network. Sep 4 00:59:31.057708 kernel: EXT4-fs (sdb9): resized filesystem to 116605649 Sep 4 00:59:31.084985 extend-filesystems[1886]: Filesystem at /dev/sdb9 is mounted on /; on-line resizing required Sep 4 00:59:31.084985 extend-filesystems[1886]: old_desc_blocks = 1, new_desc_blocks = 56 Sep 4 00:59:31.084985 extend-filesystems[1886]: The filesystem on /dev/sdb9 is now 116605649 (4k) blocks long. Sep 4 00:59:31.113893 extend-filesystems[1874]: Resized filesystem in /dev/sdb9 Sep 4 00:59:31.085856 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 4 00:59:31.085994 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 4 00:59:31.210787 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Sep 4 00:59:31.221691 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link Sep 4 00:59:31.222299 systemd-networkd[1828]: bond0: Configuring with /etc/systemd/network/05-bond0.network. Sep 4 00:59:31.223818 systemd-networkd[1828]: enp1s0f0np0: Link UP Sep 4 00:59:31.224465 systemd-networkd[1828]: enp1s0f0np0: Gained carrier Sep 4 00:59:31.224600 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. 
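The bond bring-up above is driven by the networkd files the log names (05-bond0.network plus the per-MAC 10-1c:34:da:42:bc:24.network and 10-1c:34:da:42:bc:25.network units). A rough reconstruction of what such a set of files typically contains on Packet/Equinix Metal machines, offered as a sketch rather than the host's exact configuration; the .netdev file name is assumed, and the 802.3ad mode is inferred from the bond's LACP warning that follows:

    # /etc/systemd/network/05-bond0.netdev  (sketch, name assumed)
    [NetDev]
    Name=bond0
    Kind=bond
    [Bond]
    Mode=802.3ad
    LACPTransmitRate=fast

    # /etc/systemd/network/10-1c:34:da:42:bc:24.network  (sketch)
    [Match]
    MACAddress=1c:34:da:42:bc:24
    [Network]
    Bond=bond0

    # /etc/systemd/network/05-bond0.network  (sketch)
    [Match]
    Name=bond0
    [Network]
    DHCP=no
    # addresses and routes come from metadata-driven provisioning, omitted here

The per-MAC .network units only enslave the physical ports; addressing lives on bond0 itself, which is why the link comes up in stages (slaves first, then the bond gains carrier and IPv6LL).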
Sep 4 00:59:31.232714 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Sep 4 00:59:31.247070 systemd-networkd[1828]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-1c:34:da:42:bc:24.network. Sep 4 00:59:31.248081 systemd-networkd[1828]: enp1s0f1np1: Link UP Sep 4 00:59:31.248922 systemd-networkd[1828]: enp1s0f1np1: Gained carrier Sep 4 00:59:31.263370 systemd-networkd[1828]: bond0: Link UP Sep 4 00:59:31.264320 systemd-networkd[1828]: bond0: Gained carrier Sep 4 00:59:31.265098 systemd-timesyncd[1830]: Network configuration changed, trying to establish connection. Sep 4 00:59:31.266411 systemd-timesyncd[1830]: Network configuration changed, trying to establish connection. Sep 4 00:59:31.266632 systemd-timesyncd[1830]: Network configuration changed, trying to establish connection. Sep 4 00:59:31.266815 systemd-timesyncd[1830]: Network configuration changed, trying to establish connection. Sep 4 00:59:31.339031 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 10000 Mbps full duplex Sep 4 00:59:31.339050 kernel: bond0: active interface up! Sep 4 00:59:31.455711 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 10000 Mbps full duplex Sep 4 00:59:31.470785 coreos-metadata[1867]: Sep 04 00:59:31.470 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Sep 4 00:59:31.827533 coreos-metadata[1954]: Sep 04 00:59:31.827 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Sep 4 00:59:32.285950 systemd-timesyncd[1830]: Network configuration changed, trying to establish connection. Sep 4 00:59:32.477713 systemd-networkd[1828]: bond0: Gained IPv6LL Sep 4 00:59:32.477982 systemd-timesyncd[1830]: Network configuration changed, trying to establish connection. Sep 4 00:59:32.479057 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 4 00:59:32.489097 systemd[1]: Reached target network-online.target - Network is Online. Sep 4 00:59:32.498927 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 00:59:32.515062 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 4 00:59:32.533992 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 4 00:59:33.270871 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 00:59:33.281374 (kubelet)[2026]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 00:59:33.730979 kubelet[2026]: E0904 00:59:33.730900 2026 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 00:59:33.732129 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 00:59:33.732210 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 00:59:33.732387 systemd[1]: kubelet.service: Consumed 610ms CPU time, 271M memory peak. Sep 4 00:59:34.901522 kernel: mlx5_core 0000:01:00.0: lag map: port 1:1 port 2:2 Sep 4 00:59:34.901680 kernel: mlx5_core 0000:01:00.0: shared_fdb:0 mode:queue_affinity Sep 4 00:59:34.922053 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
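The kubelet failure above ("failed to load Kubelet config file /var/lib/kubelet/config.yaml ... no such file or directory") is the normal state before the node joins a cluster; kubeadm or whatever provisioner runs later writes that file. Purely as an illustration of the file the error refers to, a minimal KubeletConfiguration in the standard kubelet.config.k8s.io/v1beta1 shape, not this node's eventual config:

    # /var/lib/kubelet/config.yaml  (illustrative sketch)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    authentication:
      anonymous:
        enabled: false

Until the file exists, systemd keeps retrying kubelet.service on its restart timer, which is exactly the "Scheduled restart job" entry seen further down.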
Sep 4 00:59:34.934778 systemd[1]: Started sshd@0-147.75.202.229:22-147.75.109.163:40716.service - OpenSSH per-connection server daemon (147.75.109.163:40716). Sep 4 00:59:35.002357 sshd[2047]: Accepted publickey for core from 147.75.109.163 port 40716 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:59:35.003323 sshd-session[2047]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:59:35.010622 systemd-logind[1900]: New session 1 of user core. Sep 4 00:59:35.011533 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 4 00:59:35.020561 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 4 00:59:35.045699 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 4 00:59:35.057023 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 4 00:59:35.079935 (systemd)[2053]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 4 00:59:35.081449 systemd-logind[1900]: New session c1 of user core. Sep 4 00:59:35.178368 systemd[2053]: Queued start job for default target default.target. Sep 4 00:59:35.187217 systemd[2053]: Created slice app.slice - User Application Slice. Sep 4 00:59:35.187251 systemd[2053]: Reached target paths.target - Paths. Sep 4 00:59:35.187271 systemd[2053]: Reached target timers.target - Timers. Sep 4 00:59:35.187908 systemd[2053]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 4 00:59:35.193344 systemd[2053]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 4 00:59:35.193373 systemd[2053]: Reached target sockets.target - Sockets. Sep 4 00:59:35.193395 systemd[2053]: Reached target basic.target - Basic System. Sep 4 00:59:35.193416 systemd[2053]: Reached target default.target - Main User Target. Sep 4 00:59:35.193430 systemd[2053]: Startup finished in 108ms. Sep 4 00:59:35.193493 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 4 00:59:35.202791 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 4 00:59:35.269448 systemd[1]: Started sshd@1-147.75.202.229:22-147.75.109.163:40720.service - OpenSSH per-connection server daemon (147.75.109.163:40720). Sep 4 00:59:35.318119 sshd[2064]: Accepted publickey for core from 147.75.109.163 port 40720 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:59:35.318752 sshd-session[2064]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:59:35.321218 systemd-logind[1900]: New session 2 of user core. Sep 4 00:59:35.333830 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 4 00:59:35.388602 sshd[2066]: Connection closed by 147.75.109.163 port 40720 Sep 4 00:59:35.388732 sshd-session[2064]: pam_unix(sshd:session): session closed for user core Sep 4 00:59:35.400929 systemd[1]: sshd@1-147.75.202.229:22-147.75.109.163:40720.service: Deactivated successfully. Sep 4 00:59:35.401738 systemd[1]: session-2.scope: Deactivated successfully. Sep 4 00:59:35.402185 systemd-logind[1900]: Session 2 logged out. Waiting for processes to exit. Sep 4 00:59:35.403477 systemd[1]: Started sshd@2-147.75.202.229:22-147.75.109.163:40726.service - OpenSSH per-connection server daemon (147.75.109.163:40726). Sep 4 00:59:35.414245 systemd-logind[1900]: Removed session 2. 
Sep 4 00:59:35.443020 sshd[2072]: Accepted publickey for core from 147.75.109.163 port 40726 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:59:35.443600 sshd-session[2072]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:59:35.446270 systemd-logind[1900]: New session 3 of user core. Sep 4 00:59:35.462840 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 4 00:59:35.517605 sshd[2075]: Connection closed by 147.75.109.163 port 40726 Sep 4 00:59:35.517762 sshd-session[2072]: pam_unix(sshd:session): session closed for user core Sep 4 00:59:35.519426 systemd[1]: sshd@2-147.75.202.229:22-147.75.109.163:40726.service: Deactivated successfully. Sep 4 00:59:35.520295 systemd[1]: session-3.scope: Deactivated successfully. Sep 4 00:59:35.520785 systemd-logind[1900]: Session 3 logged out. Waiting for processes to exit. Sep 4 00:59:35.521356 systemd-logind[1900]: Removed session 3. Sep 4 00:59:36.038852 login[1991]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 4 00:59:36.039543 login[1992]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 4 00:59:36.042202 systemd-logind[1900]: New session 5 of user core. Sep 4 00:59:36.052933 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 4 00:59:36.055041 systemd-logind[1900]: New session 4 of user core. Sep 4 00:59:36.056447 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 4 00:59:37.496157 systemd-timesyncd[1830]: Network configuration changed, trying to establish connection. Sep 4 00:59:37.650924 coreos-metadata[1867]: Sep 04 00:59:37.650 INFO Fetch successful Sep 4 00:59:37.669429 coreos-metadata[1954]: Sep 04 00:59:37.669 INFO Fetch successful Sep 4 00:59:37.704544 unknown[1954]: wrote ssh authorized keys file for user: core Sep 4 00:59:37.728218 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 4 00:59:37.729404 systemd[1]: Starting packet-phone-home.service - Report Success to Packet... Sep 4 00:59:37.738888 update-ssh-keys[2108]: Updated "/home/core/.ssh/authorized_keys" Sep 4 00:59:37.739320 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 4 00:59:37.740026 systemd[1]: Finished sshkeys.service. Sep 4 00:59:38.180164 systemd[1]: Finished packet-phone-home.service - Report Success to Packet. Sep 4 00:59:38.181481 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 4 00:59:38.182027 systemd[1]: Startup finished in 5.253s (kernel) + 23.473s (initrd) + 11.489s (userspace) = 40.216s. Sep 4 00:59:43.818728 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 4 00:59:43.822357 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 00:59:44.103766 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
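The "Startup finished" line above splits the 40.216s boot into kernel, initrd and userspace phases. Summing the rounded components reproduces the total to within a millisecond (systemd adds the unrounded timestamps before printing):

kernel, initrd, userspace = 5.253, 23.473, 11.489   # seconds, from the log
print(f"{kernel + initrd + userspace:.3f}s")        # 40.215s vs. the reported 40.216s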
Sep 4 00:59:44.111287 (kubelet)[2125]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 00:59:44.148346 kubelet[2125]: E0904 00:59:44.148290 2125 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 00:59:44.151069 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 00:59:44.151170 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 00:59:44.151375 systemd[1]: kubelet.service: Consumed 183ms CPU time, 114.9M memory peak. Sep 4 00:59:45.544104 systemd[1]: Started sshd@3-147.75.202.229:22-147.75.109.163:38452.service - OpenSSH per-connection server daemon (147.75.109.163:38452). Sep 4 00:59:45.585503 sshd[2143]: Accepted publickey for core from 147.75.109.163 port 38452 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:59:45.586175 sshd-session[2143]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:59:45.589115 systemd-logind[1900]: New session 6 of user core. Sep 4 00:59:45.603193 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 4 00:59:45.663245 sshd[2145]: Connection closed by 147.75.109.163 port 38452 Sep 4 00:59:45.663398 sshd-session[2143]: pam_unix(sshd:session): session closed for user core Sep 4 00:59:45.673028 systemd[1]: sshd@3-147.75.202.229:22-147.75.109.163:38452.service: Deactivated successfully. Sep 4 00:59:45.673965 systemd[1]: session-6.scope: Deactivated successfully. Sep 4 00:59:45.674484 systemd-logind[1900]: Session 6 logged out. Waiting for processes to exit. Sep 4 00:59:45.675976 systemd[1]: Started sshd@4-147.75.202.229:22-147.75.109.163:38460.service - OpenSSH per-connection server daemon (147.75.109.163:38460). Sep 4 00:59:45.676504 systemd-logind[1900]: Removed session 6. Sep 4 00:59:45.707977 sshd[2151]: Accepted publickey for core from 147.75.109.163 port 38460 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:59:45.708635 sshd-session[2151]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:59:45.711603 systemd-logind[1900]: New session 7 of user core. Sep 4 00:59:45.727245 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 4 00:59:45.787295 sshd[2153]: Connection closed by 147.75.109.163 port 38460 Sep 4 00:59:45.787984 sshd-session[2151]: pam_unix(sshd:session): session closed for user core Sep 4 00:59:45.808564 systemd[1]: sshd@4-147.75.202.229:22-147.75.109.163:38460.service: Deactivated successfully. Sep 4 00:59:45.813254 systemd[1]: session-7.scope: Deactivated successfully. Sep 4 00:59:45.815847 systemd-logind[1900]: Session 7 logged out. Waiting for processes to exit. Sep 4 00:59:45.823001 systemd[1]: Started sshd@5-147.75.202.229:22-147.75.109.163:38474.service - OpenSSH per-connection server daemon (147.75.109.163:38474). Sep 4 00:59:45.824797 systemd-logind[1900]: Removed session 7. 
Sep 4 00:59:45.911270 sshd[2159]: Accepted publickey for core from 147.75.109.163 port 38474 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:59:45.914603 sshd-session[2159]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:59:45.927863 systemd-logind[1900]: New session 8 of user core. Sep 4 00:59:45.942098 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 4 00:59:46.001477 sshd[2161]: Connection closed by 147.75.109.163 port 38474 Sep 4 00:59:46.001602 sshd-session[2159]: pam_unix(sshd:session): session closed for user core Sep 4 00:59:46.018956 systemd[1]: sshd@5-147.75.202.229:22-147.75.109.163:38474.service: Deactivated successfully. Sep 4 00:59:46.019840 systemd[1]: session-8.scope: Deactivated successfully. Sep 4 00:59:46.020441 systemd-logind[1900]: Session 8 logged out. Waiting for processes to exit. Sep 4 00:59:46.021653 systemd[1]: Started sshd@6-147.75.202.229:22-147.75.109.163:38478.service - OpenSSH per-connection server daemon (147.75.109.163:38478). Sep 4 00:59:46.022323 systemd-logind[1900]: Removed session 8. Sep 4 00:59:46.060842 sshd[2167]: Accepted publickey for core from 147.75.109.163 port 38478 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:59:46.063015 sshd-session[2167]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:59:46.074657 systemd-logind[1900]: New session 9 of user core. Sep 4 00:59:46.088132 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 4 00:59:46.157312 sudo[2171]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 4 00:59:46.157454 sudo[2171]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 00:59:46.172296 sudo[2171]: pam_unix(sudo:session): session closed for user root Sep 4 00:59:46.173160 sshd[2170]: Connection closed by 147.75.109.163 port 38478 Sep 4 00:59:46.173318 sshd-session[2167]: pam_unix(sshd:session): session closed for user core Sep 4 00:59:46.184058 systemd[1]: sshd@6-147.75.202.229:22-147.75.109.163:38478.service: Deactivated successfully. Sep 4 00:59:46.185051 systemd[1]: session-9.scope: Deactivated successfully. Sep 4 00:59:46.185680 systemd-logind[1900]: Session 9 logged out. Waiting for processes to exit. Sep 4 00:59:46.187256 systemd[1]: Started sshd@7-147.75.202.229:22-147.75.109.163:38486.service - OpenSSH per-connection server daemon (147.75.109.163:38486). Sep 4 00:59:46.187723 systemd-logind[1900]: Removed session 9. Sep 4 00:59:46.228048 sshd[2177]: Accepted publickey for core from 147.75.109.163 port 38486 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:59:46.229093 sshd-session[2177]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:59:46.233144 systemd-logind[1900]: New session 10 of user core. Sep 4 00:59:46.245853 systemd[1]: Started session-10.scope - Session 10 of User core. 
Sep 4 00:59:46.304619 sudo[2182]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 4 00:59:46.304767 sudo[2182]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 00:59:46.307326 sudo[2182]: pam_unix(sudo:session): session closed for user root Sep 4 00:59:46.309918 sudo[2181]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 4 00:59:46.310056 sudo[2181]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 00:59:46.315606 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 4 00:59:46.351834 augenrules[2204]: No rules Sep 4 00:59:46.352627 systemd[1]: audit-rules.service: Deactivated successfully. Sep 4 00:59:46.352914 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 4 00:59:46.353920 sudo[2181]: pam_unix(sudo:session): session closed for user root Sep 4 00:59:46.355228 sshd[2180]: Connection closed by 147.75.109.163 port 38486 Sep 4 00:59:46.355576 sshd-session[2177]: pam_unix(sshd:session): session closed for user core Sep 4 00:59:46.371645 systemd[1]: sshd@7-147.75.202.229:22-147.75.109.163:38486.service: Deactivated successfully. Sep 4 00:59:46.375438 systemd[1]: session-10.scope: Deactivated successfully. Sep 4 00:59:46.377784 systemd-logind[1900]: Session 10 logged out. Waiting for processes to exit. Sep 4 00:59:46.384114 systemd[1]: Started sshd@8-147.75.202.229:22-147.75.109.163:38500.service - OpenSSH per-connection server daemon (147.75.109.163:38500). Sep 4 00:59:46.385881 systemd-logind[1900]: Removed session 10. Sep 4 00:59:46.472088 sshd[2213]: Accepted publickey for core from 147.75.109.163 port 38500 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:59:46.473172 sshd-session[2213]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:59:46.477642 systemd-logind[1900]: New session 11 of user core. Sep 4 00:59:46.491929 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 4 00:59:46.554939 sudo[2216]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 4 00:59:46.555729 sudo[2216]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 00:59:46.938187 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 4 00:59:46.950058 (dockerd)[2243]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 4 00:59:47.129380 dockerd[2243]: time="2025-09-04T00:59:47.129325650Z" level=info msg="Starting up" Sep 4 00:59:47.129676 dockerd[2243]: time="2025-09-04T00:59:47.129662056Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 4 00:59:47.156066 dockerd[2243]: time="2025-09-04T00:59:47.156019954Z" level=info msg="Loading containers: start." Sep 4 00:59:47.167677 kernel: Initializing XFRM netlink socket Sep 4 00:59:47.298815 systemd-timesyncd[1830]: Network configuration changed, trying to establish connection. Sep 4 00:59:47.317615 systemd-networkd[1828]: docker0: Link UP Sep 4 00:59:47.319290 dockerd[2243]: time="2025-09-04T00:59:47.319245908Z" level=info msg="Loading containers: done." 
Sep 4 00:59:47.325873 dockerd[2243]: time="2025-09-04T00:59:47.325828723Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 4 00:59:47.325873 dockerd[2243]: time="2025-09-04T00:59:47.325865882Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Sep 4 00:59:47.325960 dockerd[2243]: time="2025-09-04T00:59:47.325918488Z" level=info msg="Initializing buildkit" Sep 4 00:59:47.335941 dockerd[2243]: time="2025-09-04T00:59:47.335928545Z" level=info msg="Completed buildkit initialization" Sep 4 00:59:47.339224 dockerd[2243]: time="2025-09-04T00:59:47.339167526Z" level=info msg="Daemon has completed initialization" Sep 4 00:59:47.339224 dockerd[2243]: time="2025-09-04T00:59:47.339198551Z" level=info msg="API listen on /run/docker.sock" Sep 4 00:59:47.339287 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 4 00:59:48.368738 systemd-resolved[1829]: Clock change detected. Flushing caches. Sep 4 00:59:48.368833 systemd-timesyncd[1830]: Contacted time server [2604:a880:1:20::17:5001]:123 (2.flatcar.pool.ntp.org). Sep 4 00:59:48.368857 systemd-timesyncd[1830]: Initial clock synchronization to Thu 2025-09-04 00:59:48.368714 UTC. Sep 4 00:59:49.135936 containerd[1911]: time="2025-09-04T00:59:49.135799816Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\"" Sep 4 00:59:49.806402 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1718057973.mount: Deactivated successfully. Sep 4 00:59:50.648232 containerd[1911]: time="2025-09-04T00:59:50.648181848Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:50.648441 containerd[1911]: time="2025-09-04T00:59:50.648358140Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.4: active requests=0, bytes read=30078664" Sep 4 00:59:50.648762 containerd[1911]: time="2025-09-04T00:59:50.648720569Z" level=info msg="ImageCreate event name:\"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:50.650430 containerd[1911]: time="2025-09-04T00:59:50.650389836Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:50.650922 containerd[1911]: time="2025-09-04T00:59:50.650880559Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.4\" with image id \"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\", size \"30075464\" in 1.515006521s" Sep 4 00:59:50.650922 containerd[1911]: time="2025-09-04T00:59:50.650897945Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\" returns image reference \"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\"" Sep 4 00:59:50.651356 containerd[1911]: time="2025-09-04T00:59:50.651297153Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\"" Sep 4 00:59:51.782095 containerd[1911]: time="2025-09-04T00:59:51.782046917Z" 
level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:51.782318 containerd[1911]: time="2025-09-04T00:59:51.782188237Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.4: active requests=0, bytes read=26018066" Sep 4 00:59:51.782539 containerd[1911]: time="2025-09-04T00:59:51.782526853Z" level=info msg="ImageCreate event name:\"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:51.783787 containerd[1911]: time="2025-09-04T00:59:51.783775712Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:51.784291 containerd[1911]: time="2025-09-04T00:59:51.784278578Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.4\" with image id \"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\", size \"27646961\" in 1.132966449s" Sep 4 00:59:51.784330 containerd[1911]: time="2025-09-04T00:59:51.784293275Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\" returns image reference \"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\"" Sep 4 00:59:51.784542 containerd[1911]: time="2025-09-04T00:59:51.784530492Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\"" Sep 4 00:59:52.837089 containerd[1911]: time="2025-09-04T00:59:52.837036333Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:52.837300 containerd[1911]: time="2025-09-04T00:59:52.837234418Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.4: active requests=0, bytes read=20153911" Sep 4 00:59:52.837572 containerd[1911]: time="2025-09-04T00:59:52.837533298Z" level=info msg="ImageCreate event name:\"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:52.839318 containerd[1911]: time="2025-09-04T00:59:52.839278211Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:52.839709 containerd[1911]: time="2025-09-04T00:59:52.839687734Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.4\" with image id \"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\", size \"21782824\" in 1.055141044s" Sep 4 00:59:52.839709 containerd[1911]: time="2025-09-04T00:59:52.839702808Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\" returns image reference \"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\"" Sep 4 00:59:52.840027 containerd[1911]: time="2025-09-04T00:59:52.840014379Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.33.4\"" Sep 4 00:59:53.745525 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2491841460.mount: Deactivated successfully. Sep 4 00:59:53.948194 containerd[1911]: time="2025-09-04T00:59:53.948168859Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:53.948409 containerd[1911]: time="2025-09-04T00:59:53.948352400Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.4: active requests=0, bytes read=31899626" Sep 4 00:59:53.948709 containerd[1911]: time="2025-09-04T00:59:53.948698461Z" level=info msg="ImageCreate event name:\"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:53.949456 containerd[1911]: time="2025-09-04T00:59:53.949446637Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:53.949786 containerd[1911]: time="2025-09-04T00:59:53.949773620Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.4\" with image id \"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\", repo tag \"registry.k8s.io/kube-proxy:v1.33.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\", size \"31898645\" in 1.109742123s" Sep 4 00:59:53.949821 containerd[1911]: time="2025-09-04T00:59:53.949789260Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\" returns image reference \"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\"" Sep 4 00:59:53.950046 containerd[1911]: time="2025-09-04T00:59:53.950036282Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Sep 4 00:59:54.476553 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4193194244.mount: Deactivated successfully. 
Sep 4 00:59:55.047684 containerd[1911]: time="2025-09-04T00:59:55.047629316Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:55.047902 containerd[1911]: time="2025-09-04T00:59:55.047842696Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238" Sep 4 00:59:55.048202 containerd[1911]: time="2025-09-04T00:59:55.048159963Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:55.049553 containerd[1911]: time="2025-09-04T00:59:55.049512435Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:55.050098 containerd[1911]: time="2025-09-04T00:59:55.050082943Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.100031491s" Sep 4 00:59:55.050137 containerd[1911]: time="2025-09-04T00:59:55.050100267Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Sep 4 00:59:55.050363 containerd[1911]: time="2025-09-04T00:59:55.050352227Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 4 00:59:55.204207 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 4 00:59:55.205464 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 00:59:55.468282 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 00:59:55.470396 (kubelet)[2606]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 00:59:55.493589 kubelet[2606]: E0904 00:59:55.493563 2606 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 00:59:55.495275 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 00:59:55.495381 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 00:59:55.495593 systemd[1]: kubelet.service: Consumed 113ms CPU time, 112.5M memory peak. Sep 4 00:59:55.668806 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2604000091.mount: Deactivated successfully. 
Sep 4 00:59:55.670191 containerd[1911]: time="2025-09-04T00:59:55.670175629Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 4 00:59:55.670353 containerd[1911]: time="2025-09-04T00:59:55.670340810Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 4 00:59:55.670840 containerd[1911]: time="2025-09-04T00:59:55.670814612Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 4 00:59:55.671742 containerd[1911]: time="2025-09-04T00:59:55.671722483Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 4 00:59:55.672213 containerd[1911]: time="2025-09-04T00:59:55.672203097Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 621.835008ms" Sep 4 00:59:55.672231 containerd[1911]: time="2025-09-04T00:59:55.672217996Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 4 00:59:55.672544 containerd[1911]: time="2025-09-04T00:59:55.672491579Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Sep 4 00:59:56.218041 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount376779838.mount: Deactivated successfully. 
Sep 4 00:59:57.267607 containerd[1911]: time="2025-09-04T00:59:57.267580853Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:57.267818 containerd[1911]: time="2025-09-04T00:59:57.267758391Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58377871" Sep 4 00:59:57.268137 containerd[1911]: time="2025-09-04T00:59:57.268097565Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:57.269553 containerd[1911]: time="2025-09-04T00:59:57.269512832Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:59:57.270480 containerd[1911]: time="2025-09-04T00:59:57.270466081Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 1.597961009s" Sep 4 00:59:57.270510 containerd[1911]: time="2025-09-04T00:59:57.270483231Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Sep 4 00:59:59.808585 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 00:59:59.808689 systemd[1]: kubelet.service: Consumed 113ms CPU time, 112.5M memory peak. Sep 4 00:59:59.810024 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 00:59:59.824469 systemd[1]: Reload requested from client PID 2729 ('systemctl') (unit session-11.scope)... Sep 4 00:59:59.824475 systemd[1]: Reloading... Sep 4 00:59:59.860832 zram_generator::config[2773]: No configuration found. Sep 4 00:59:59.917794 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 4 01:00:00.006184 systemd[1]: Reloading finished in 181 ms. Sep 4 01:00:00.057973 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 4 01:00:00.058019 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 4 01:00:00.058149 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 01:00:00.059357 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 01:00:00.347910 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 01:00:00.350069 (kubelet)[2839]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 4 01:00:00.370063 kubelet[2839]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 01:00:00.370063 kubelet[2839]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Sep 4 01:00:00.370063 kubelet[2839]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 01:00:00.370295 kubelet[2839]: I0904 01:00:00.370085 2839 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 4 01:00:00.784088 kubelet[2839]: I0904 01:00:00.784046 2839 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 4 01:00:00.784088 kubelet[2839]: I0904 01:00:00.784057 2839 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 4 01:00:00.784207 kubelet[2839]: I0904 01:00:00.784156 2839 server.go:956] "Client rotation is on, will bootstrap in background" Sep 4 01:00:00.817747 kubelet[2839]: I0904 01:00:00.817717 2839 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 4 01:00:00.818024 kubelet[2839]: E0904 01:00:00.817994 2839 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://147.75.202.229:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 147.75.202.229:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 4 01:00:00.822910 kubelet[2839]: I0904 01:00:00.822897 2839 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 4 01:00:00.831526 kubelet[2839]: I0904 01:00:00.831490 2839 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 4 01:00:00.831635 kubelet[2839]: I0904 01:00:00.831591 2839 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 4 01:00:00.831702 kubelet[2839]: I0904 01:00:00.831608 2839 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372.1.0-n-d84d506c1c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 4 01:00:00.831772 kubelet[2839]: I0904 01:00:00.831706 2839 topology_manager.go:138] "Creating topology manager with none policy" Sep 4 01:00:00.831772 kubelet[2839]: I0904 01:00:00.831711 2839 container_manager_linux.go:303] "Creating device plugin manager" Sep 4 01:00:00.831805 kubelet[2839]: I0904 01:00:00.831786 2839 state_mem.go:36] "Initialized new in-memory state store" Sep 4 01:00:00.834345 kubelet[2839]: I0904 01:00:00.834336 2839 kubelet.go:480] "Attempting to sync node with API server" Sep 4 01:00:00.834392 kubelet[2839]: I0904 01:00:00.834348 2839 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 4 01:00:00.834392 kubelet[2839]: I0904 01:00:00.834363 2839 kubelet.go:386] "Adding apiserver pod source" Sep 4 01:00:00.834426 kubelet[2839]: I0904 01:00:00.834396 2839 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 4 01:00:00.838333 kubelet[2839]: I0904 01:00:00.838322 2839 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Sep 4 01:00:00.838651 kubelet[2839]: I0904 01:00:00.838645 2839 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 4 01:00:00.839241 kubelet[2839]: W0904 01:00:00.839234 2839 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Sep 4 01:00:00.841570 kubelet[2839]: I0904 01:00:00.841557 2839 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 4 01:00:00.841624 kubelet[2839]: I0904 01:00:00.841617 2839 server.go:1289] "Started kubelet" Sep 4 01:00:00.841652 kubelet[2839]: E0904 01:00:00.841635 2839 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://147.75.202.229:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372.1.0-n-d84d506c1c&limit=500&resourceVersion=0\": dial tcp 147.75.202.229:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 4 01:00:00.841689 kubelet[2839]: I0904 01:00:00.841670 2839 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 4 01:00:00.841906 kubelet[2839]: I0904 01:00:00.841826 2839 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 4 01:00:00.842024 kubelet[2839]: E0904 01:00:00.842004 2839 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://147.75.202.229:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 147.75.202.229:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 4 01:00:00.842306 kubelet[2839]: I0904 01:00:00.842287 2839 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 4 01:00:00.842887 kubelet[2839]: E0904 01:00:00.842878 2839 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 4 01:00:00.842919 kubelet[2839]: I0904 01:00:00.842909 2839 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 4 01:00:00.842961 kubelet[2839]: I0904 01:00:00.842954 2839 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 4 01:00:00.842988 kubelet[2839]: E0904 01:00:00.842974 2839 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-d84d506c1c\" not found" Sep 4 01:00:00.843017 kubelet[2839]: I0904 01:00:00.842981 2839 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 4 01:00:00.843017 kubelet[2839]: I0904 01:00:00.842989 2839 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 4 01:00:00.845258 kubelet[2839]: I0904 01:00:00.845251 2839 server.go:317] "Adding debug handlers to kubelet server" Sep 4 01:00:00.845258 kubelet[2839]: I0904 01:00:00.845254 2839 reconciler.go:26] "Reconciler: start to sync state" Sep 4 01:00:00.845396 kubelet[2839]: E0904 01:00:00.845383 2839 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://147.75.202.229:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 147.75.202.229:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 4 01:00:00.845441 kubelet[2839]: E0904 01:00:00.845420 2839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.75.202.229:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-n-d84d506c1c?timeout=10s\": dial tcp 147.75.202.229:6443: connect: connection refused" interval="200ms" Sep 4 01:00:00.845492 kubelet[2839]: I0904 
01:00:00.845478 2839 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 4 01:00:00.845871 kubelet[2839]: I0904 01:00:00.845862 2839 factory.go:223] Registration of the containerd container factory successfully Sep 4 01:00:00.845871 kubelet[2839]: I0904 01:00:00.845869 2839 factory.go:223] Registration of the systemd container factory successfully Sep 4 01:00:00.847046 kubelet[2839]: E0904 01:00:00.846147 2839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://147.75.202.229:6443/api/v1/namespaces/default/events\": dial tcp 147.75.202.229:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372.1.0-n-d84d506c1c.1861ee88c52306d5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372.1.0-n-d84d506c1c,UID:ci-4372.1.0-n-d84d506c1c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372.1.0-n-d84d506c1c,},FirstTimestamp:2025-09-04 01:00:00.841574101 +0000 UTC m=+0.489512140,LastTimestamp:2025-09-04 01:00:00.841574101 +0000 UTC m=+0.489512140,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372.1.0-n-d84d506c1c,}" Sep 4 01:00:00.851169 kubelet[2839]: I0904 01:00:00.851136 2839 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 4 01:00:00.851169 kubelet[2839]: I0904 01:00:00.851142 2839 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 4 01:00:00.851169 kubelet[2839]: I0904 01:00:00.851169 2839 state_mem.go:36] "Initialized new in-memory state store" Sep 4 01:00:00.852469 kubelet[2839]: I0904 01:00:00.852432 2839 policy_none.go:49] "None policy: Start" Sep 4 01:00:00.852469 kubelet[2839]: I0904 01:00:00.852439 2839 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 4 01:00:00.852469 kubelet[2839]: I0904 01:00:00.852444 2839 state_mem.go:35] "Initializing new in-memory state store" Sep 4 01:00:00.854417 kubelet[2839]: I0904 01:00:00.854401 2839 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 4 01:00:00.854949 kubelet[2839]: I0904 01:00:00.854939 2839 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 4 01:00:00.854949 kubelet[2839]: I0904 01:00:00.854951 2839 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 4 01:00:00.855002 kubelet[2839]: I0904 01:00:00.854961 2839 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 4 01:00:00.855002 kubelet[2839]: I0904 01:00:00.854970 2839 kubelet.go:2436] "Starting kubelet main sync loop" Sep 4 01:00:00.854989 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Sep 4 01:00:00.855182 kubelet[2839]: E0904 01:00:00.854997 2839 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 4 01:00:00.856833 kubelet[2839]: E0904 01:00:00.856789 2839 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://147.75.202.229:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 147.75.202.229:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 4 01:00:00.867450 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 4 01:00:00.869888 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 4 01:00:00.883498 kubelet[2839]: E0904 01:00:00.883455 2839 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 4 01:00:00.883611 kubelet[2839]: I0904 01:00:00.883569 2839 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 4 01:00:00.883611 kubelet[2839]: I0904 01:00:00.883577 2839 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 4 01:00:00.883694 kubelet[2839]: I0904 01:00:00.883686 2839 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 4 01:00:00.884156 kubelet[2839]: E0904 01:00:00.884114 2839 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 4 01:00:00.884156 kubelet[2839]: E0904 01:00:00.884142 2839 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4372.1.0-n-d84d506c1c\" not found" Sep 4 01:00:00.965798 systemd[1]: Created slice kubepods-burstable-pod8f2f9bcd218aa96c3b33aaf6b645a9ff.slice - libcontainer container kubepods-burstable-pod8f2f9bcd218aa96c3b33aaf6b645a9ff.slice. Sep 4 01:00:00.985218 kubelet[2839]: I0904 01:00:00.985170 2839 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:00.985449 kubelet[2839]: E0904 01:00:00.985405 2839 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://147.75.202.229:6443/api/v1/nodes\": dial tcp 147.75.202.229:6443: connect: connection refused" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:00.985922 kubelet[2839]: E0904 01:00:00.985882 2839 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-d84d506c1c\" not found" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:00.988571 systemd[1]: Created slice kubepods-burstable-podc35b01e0d8394919006dd99e480a2620.slice - libcontainer container kubepods-burstable-podc35b01e0d8394919006dd99e480a2620.slice. Sep 4 01:00:00.989998 kubelet[2839]: E0904 01:00:00.989957 2839 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-d84d506c1c\" not found" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.005152 systemd[1]: Created slice kubepods-burstable-podee14763a409cb98df895e074db1cb54e.slice - libcontainer container kubepods-burstable-podee14763a409cb98df895e074db1cb54e.slice. 
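The repeated "dial tcp 147.75.202.229:6443: connect: connection refused" errors above come from the kubelet talking to an API server it has not started yet; they stop once the kube-apiserver static pod created below is running. A minimal probe of the same endpoint, assuming nothing beyond the host:port printed in the errors:

import socket

def apiserver_up(host: str = "147.75.202.229", port: int = 6443) -> bool:
    """True once something is listening on the API server port."""
    try:
        with socket.create_connection((host, port), timeout=2):
            return True
    except OSError:   # connection refused / timeout, as in the log entries
        return False

print(apiserver_up())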
Sep 4 01:00:01.007560 kubelet[2839]: E0904 01:00:01.007511 2839 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-d84d506c1c\" not found" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.046897 kubelet[2839]: I0904 01:00:01.046652 2839 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8f2f9bcd218aa96c3b33aaf6b645a9ff-ca-certs\") pod \"kube-apiserver-ci-4372.1.0-n-d84d506c1c\" (UID: \"8f2f9bcd218aa96c3b33aaf6b645a9ff\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.046897 kubelet[2839]: I0904 01:00:01.046732 2839 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8f2f9bcd218aa96c3b33aaf6b645a9ff-k8s-certs\") pod \"kube-apiserver-ci-4372.1.0-n-d84d506c1c\" (UID: \"8f2f9bcd218aa96c3b33aaf6b645a9ff\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.046897 kubelet[2839]: I0904 01:00:01.046832 2839 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8f2f9bcd218aa96c3b33aaf6b645a9ff-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372.1.0-n-d84d506c1c\" (UID: \"8f2f9bcd218aa96c3b33aaf6b645a9ff\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.046897 kubelet[2839]: I0904 01:00:01.046888 2839 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c35b01e0d8394919006dd99e480a2620-flexvolume-dir\") pod \"kube-controller-manager-ci-4372.1.0-n-d84d506c1c\" (UID: \"c35b01e0d8394919006dd99e480a2620\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.047435 kubelet[2839]: I0904 01:00:01.046932 2839 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c35b01e0d8394919006dd99e480a2620-k8s-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-d84d506c1c\" (UID: \"c35b01e0d8394919006dd99e480a2620\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.047435 kubelet[2839]: I0904 01:00:01.046978 2839 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c35b01e0d8394919006dd99e480a2620-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372.1.0-n-d84d506c1c\" (UID: \"c35b01e0d8394919006dd99e480a2620\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.047435 kubelet[2839]: I0904 01:00:01.047019 2839 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c35b01e0d8394919006dd99e480a2620-ca-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-d84d506c1c\" (UID: \"c35b01e0d8394919006dd99e480a2620\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.047435 kubelet[2839]: E0904 01:00:01.047027 2839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.75.202.229:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-n-d84d506c1c?timeout=10s\": dial tcp 147.75.202.229:6443: 
connect: connection refused" interval="400ms" Sep 4 01:00:01.047435 kubelet[2839]: I0904 01:00:01.047059 2839 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c35b01e0d8394919006dd99e480a2620-kubeconfig\") pod \"kube-controller-manager-ci-4372.1.0-n-d84d506c1c\" (UID: \"c35b01e0d8394919006dd99e480a2620\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.047862 kubelet[2839]: I0904 01:00:01.047154 2839 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ee14763a409cb98df895e074db1cb54e-kubeconfig\") pod \"kube-scheduler-ci-4372.1.0-n-d84d506c1c\" (UID: \"ee14763a409cb98df895e074db1cb54e\") " pod="kube-system/kube-scheduler-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.190331 kubelet[2839]: I0904 01:00:01.190228 2839 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.191039 kubelet[2839]: E0904 01:00:01.190962 2839 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://147.75.202.229:6443/api/v1/nodes\": dial tcp 147.75.202.229:6443: connect: connection refused" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.288599 containerd[1911]: time="2025-09-04T01:00:01.288505752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372.1.0-n-d84d506c1c,Uid:8f2f9bcd218aa96c3b33aaf6b645a9ff,Namespace:kube-system,Attempt:0,}" Sep 4 01:00:01.291745 containerd[1911]: time="2025-09-04T01:00:01.291621239Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372.1.0-n-d84d506c1c,Uid:c35b01e0d8394919006dd99e480a2620,Namespace:kube-system,Attempt:0,}" Sep 4 01:00:01.308253 containerd[1911]: time="2025-09-04T01:00:01.308200931Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372.1.0-n-d84d506c1c,Uid:ee14763a409cb98df895e074db1cb54e,Namespace:kube-system,Attempt:0,}" Sep 4 01:00:01.311187 containerd[1911]: time="2025-09-04T01:00:01.311138896Z" level=info msg="connecting to shim 977a46ab8ae3105cece44a30ce6acf50f038779ca82f9ba03bc3355163044893" address="unix:///run/containerd/s/da9c849671c69700e8cf1809757a12648b0419fdebb5a413e801a094ff8c260d" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:00:01.311257 containerd[1911]: time="2025-09-04T01:00:01.311153825Z" level=info msg="connecting to shim bcad22f8efc78abf97ef2d214321b0f03d544b810bf9ac7810cea2d2cfa1d9ad" address="unix:///run/containerd/s/a6f9232516d22af0e7a1bcfe69ef5191f42a00eb3a605f3421f196110f6dc6f5" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:00:01.315093 containerd[1911]: time="2025-09-04T01:00:01.315058110Z" level=info msg="connecting to shim 124c9a4b4313f2977a1a14671bb041828811dd2b629597e4143081872ffabd07" address="unix:///run/containerd/s/00296640bcd7f13f5d5d676d2ee05812684aa811227e2206aa8186e8f3d7ad20" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:00:01.334113 systemd[1]: Started cri-containerd-977a46ab8ae3105cece44a30ce6acf50f038779ca82f9ba03bc3355163044893.scope - libcontainer container 977a46ab8ae3105cece44a30ce6acf50f038779ca82f9ba03bc3355163044893. Sep 4 01:00:01.335059 systemd[1]: Started cri-containerd-bcad22f8efc78abf97ef2d214321b0f03d544b810bf9ac7810cea2d2cfa1d9ad.scope - libcontainer container bcad22f8efc78abf97ef2d214321b0f03d544b810bf9ac7810cea2d2cfa1d9ad. 
Sep 4 01:00:01.336970 systemd[1]: Started cri-containerd-124c9a4b4313f2977a1a14671bb041828811dd2b629597e4143081872ffabd07.scope - libcontainer container 124c9a4b4313f2977a1a14671bb041828811dd2b629597e4143081872ffabd07. Sep 4 01:00:01.360066 containerd[1911]: time="2025-09-04T01:00:01.360044379Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372.1.0-n-d84d506c1c,Uid:c35b01e0d8394919006dd99e480a2620,Namespace:kube-system,Attempt:0,} returns sandbox id \"977a46ab8ae3105cece44a30ce6acf50f038779ca82f9ba03bc3355163044893\"" Sep 4 01:00:01.361276 containerd[1911]: time="2025-09-04T01:00:01.361257920Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372.1.0-n-d84d506c1c,Uid:8f2f9bcd218aa96c3b33aaf6b645a9ff,Namespace:kube-system,Attempt:0,} returns sandbox id \"bcad22f8efc78abf97ef2d214321b0f03d544b810bf9ac7810cea2d2cfa1d9ad\"" Sep 4 01:00:01.362118 containerd[1911]: time="2025-09-04T01:00:01.362103543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372.1.0-n-d84d506c1c,Uid:ee14763a409cb98df895e074db1cb54e,Namespace:kube-system,Attempt:0,} returns sandbox id \"124c9a4b4313f2977a1a14671bb041828811dd2b629597e4143081872ffabd07\"" Sep 4 01:00:01.362323 containerd[1911]: time="2025-09-04T01:00:01.362313165Z" level=info msg="CreateContainer within sandbox \"977a46ab8ae3105cece44a30ce6acf50f038779ca82f9ba03bc3355163044893\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 4 01:00:01.362688 containerd[1911]: time="2025-09-04T01:00:01.362678442Z" level=info msg="CreateContainer within sandbox \"bcad22f8efc78abf97ef2d214321b0f03d544b810bf9ac7810cea2d2cfa1d9ad\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 4 01:00:01.363361 containerd[1911]: time="2025-09-04T01:00:01.363347730Z" level=info msg="CreateContainer within sandbox \"124c9a4b4313f2977a1a14671bb041828811dd2b629597e4143081872ffabd07\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 4 01:00:01.366327 containerd[1911]: time="2025-09-04T01:00:01.366314523Z" level=info msg="Container 6e976a964566dbeaa39462aafa8dd86d662d6958da82dd8f7a06b8522eb8e261: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:00:01.367236 containerd[1911]: time="2025-09-04T01:00:01.367225440Z" level=info msg="Container a7af231f031b06da2c72c3751ac9a557c7b060d344d44ef37f6a68d1fb7211ef: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:00:01.368044 containerd[1911]: time="2025-09-04T01:00:01.368012361Z" level=info msg="Container 2a1ba4ad3f670aafe26812d83f95f759f1836a8c53b161b437de22079be6a003: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:00:01.369716 containerd[1911]: time="2025-09-04T01:00:01.369705446Z" level=info msg="CreateContainer within sandbox \"977a46ab8ae3105cece44a30ce6acf50f038779ca82f9ba03bc3355163044893\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"6e976a964566dbeaa39462aafa8dd86d662d6958da82dd8f7a06b8522eb8e261\"" Sep 4 01:00:01.370219 containerd[1911]: time="2025-09-04T01:00:01.370204855Z" level=info msg="StartContainer for \"6e976a964566dbeaa39462aafa8dd86d662d6958da82dd8f7a06b8522eb8e261\"" Sep 4 01:00:01.370754 containerd[1911]: time="2025-09-04T01:00:01.370738023Z" level=info msg="CreateContainer within sandbox \"bcad22f8efc78abf97ef2d214321b0f03d544b810bf9ac7810cea2d2cfa1d9ad\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"a7af231f031b06da2c72c3751ac9a557c7b060d344d44ef37f6a68d1fb7211ef\"" Sep 4 
01:00:01.370857 containerd[1911]: time="2025-09-04T01:00:01.370845377Z" level=info msg="connecting to shim 6e976a964566dbeaa39462aafa8dd86d662d6958da82dd8f7a06b8522eb8e261" address="unix:///run/containerd/s/da9c849671c69700e8cf1809757a12648b0419fdebb5a413e801a094ff8c260d" protocol=ttrpc version=3 Sep 4 01:00:01.370909 containerd[1911]: time="2025-09-04T01:00:01.370898163Z" level=info msg="StartContainer for \"a7af231f031b06da2c72c3751ac9a557c7b060d344d44ef37f6a68d1fb7211ef\"" Sep 4 01:00:01.371399 containerd[1911]: time="2025-09-04T01:00:01.371387779Z" level=info msg="connecting to shim a7af231f031b06da2c72c3751ac9a557c7b060d344d44ef37f6a68d1fb7211ef" address="unix:///run/containerd/s/a6f9232516d22af0e7a1bcfe69ef5191f42a00eb3a605f3421f196110f6dc6f5" protocol=ttrpc version=3 Sep 4 01:00:01.371833 containerd[1911]: time="2025-09-04T01:00:01.371821747Z" level=info msg="CreateContainer within sandbox \"124c9a4b4313f2977a1a14671bb041828811dd2b629597e4143081872ffabd07\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"2a1ba4ad3f670aafe26812d83f95f759f1836a8c53b161b437de22079be6a003\"" Sep 4 01:00:01.372023 containerd[1911]: time="2025-09-04T01:00:01.372011654Z" level=info msg="StartContainer for \"2a1ba4ad3f670aafe26812d83f95f759f1836a8c53b161b437de22079be6a003\"" Sep 4 01:00:01.372512 containerd[1911]: time="2025-09-04T01:00:01.372470239Z" level=info msg="connecting to shim 2a1ba4ad3f670aafe26812d83f95f759f1836a8c53b161b437de22079be6a003" address="unix:///run/containerd/s/00296640bcd7f13f5d5d676d2ee05812684aa811227e2206aa8186e8f3d7ad20" protocol=ttrpc version=3 Sep 4 01:00:01.396015 systemd[1]: Started cri-containerd-2a1ba4ad3f670aafe26812d83f95f759f1836a8c53b161b437de22079be6a003.scope - libcontainer container 2a1ba4ad3f670aafe26812d83f95f759f1836a8c53b161b437de22079be6a003. Sep 4 01:00:01.396618 systemd[1]: Started cri-containerd-6e976a964566dbeaa39462aafa8dd86d662d6958da82dd8f7a06b8522eb8e261.scope - libcontainer container 6e976a964566dbeaa39462aafa8dd86d662d6958da82dd8f7a06b8522eb8e261. Sep 4 01:00:01.397252 systemd[1]: Started cri-containerd-a7af231f031b06da2c72c3751ac9a557c7b060d344d44ef37f6a68d1fb7211ef.scope - libcontainer container a7af231f031b06da2c72c3751ac9a557c7b060d344d44ef37f6a68d1fb7211ef. 
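The "connecting to shim ... address=unix:///run/containerd/s/..." entries above are per-sandbox ttrpc sockets that containerd manages internally; clients normally talk to the main containerd socket instead. A sketch of listing the CRI-managed containers with the containerd Go client follows. The socket path is the conventional default and the import path is the v1 client module (containerd 2.x also ships a client under a /v2 path), both assumptions rather than values taken from this log.

package main

import (
	"context"
	"fmt"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Assumed default containerd socket.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		panic(err)
	}
	defer client.Close()

	// CRI-created containers (kube-apiserver, controller-manager, scheduler above) live in the k8s.io namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	containers, err := client.Containers(ctx)
	if err != nil {
		panic(err)
	}
	for _, c := range containers {
		fmt.Println(c.ID())
	}
}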
Sep 4 01:00:01.424658 containerd[1911]: time="2025-09-04T01:00:01.424632461Z" level=info msg="StartContainer for \"2a1ba4ad3f670aafe26812d83f95f759f1836a8c53b161b437de22079be6a003\" returns successfully" Sep 4 01:00:01.424785 containerd[1911]: time="2025-09-04T01:00:01.424691485Z" level=info msg="StartContainer for \"6e976a964566dbeaa39462aafa8dd86d662d6958da82dd8f7a06b8522eb8e261\" returns successfully" Sep 4 01:00:01.425389 containerd[1911]: time="2025-09-04T01:00:01.425375874Z" level=info msg="StartContainer for \"a7af231f031b06da2c72c3751ac9a557c7b060d344d44ef37f6a68d1fb7211ef\" returns successfully" Sep 4 01:00:01.592149 kubelet[2839]: I0904 01:00:01.592072 2839 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.858237 kubelet[2839]: E0904 01:00:01.858183 2839 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-d84d506c1c\" not found" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.858699 kubelet[2839]: E0904 01:00:01.858691 2839 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-d84d506c1c\" not found" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:01.859388 kubelet[2839]: E0904 01:00:01.859381 2839 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372.1.0-n-d84d506c1c\" not found" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:02.035296 kubelet[2839]: E0904 01:00:02.035266 2839 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4372.1.0-n-d84d506c1c\" not found" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:02.137656 kubelet[2839]: I0904 01:00:02.137575 2839 kubelet_node_status.go:78] "Successfully registered node" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:02.137656 kubelet[2839]: E0904 01:00:02.137613 2839 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4372.1.0-n-d84d506c1c\": node \"ci-4372.1.0-n-d84d506c1c\" not found" Sep 4 01:00:02.146273 kubelet[2839]: E0904 01:00:02.146247 2839 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-d84d506c1c\" not found" Sep 4 01:00:02.246644 kubelet[2839]: E0904 01:00:02.246625 2839 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-d84d506c1c\" not found" Sep 4 01:00:02.347854 kubelet[2839]: E0904 01:00:02.347730 2839 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-d84d506c1c\" not found" Sep 4 01:00:02.444189 kubelet[2839]: I0904 01:00:02.443991 2839 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:02.466492 kubelet[2839]: E0904 01:00:02.466396 2839 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4372.1.0-n-d84d506c1c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:02.466492 kubelet[2839]: I0904 01:00:02.466447 2839 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:02.469824 kubelet[2839]: E0904 01:00:02.469738 2839 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4372.1.0-n-d84d506c1c\" is forbidden: no PriorityClass 
with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:02.469824 kubelet[2839]: I0904 01:00:02.469806 2839 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:02.473330 kubelet[2839]: E0904 01:00:02.473240 2839 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372.1.0-n-d84d506c1c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:02.839154 kubelet[2839]: I0904 01:00:02.839062 2839 apiserver.go:52] "Watching apiserver" Sep 4 01:00:02.843535 kubelet[2839]: I0904 01:00:02.843454 2839 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 4 01:00:02.861398 kubelet[2839]: I0904 01:00:02.861306 2839 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:02.861628 kubelet[2839]: I0904 01:00:02.861523 2839 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:02.865189 kubelet[2839]: E0904 01:00:02.865100 2839 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4372.1.0-n-d84d506c1c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:02.865189 kubelet[2839]: E0904 01:00:02.865134 2839 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372.1.0-n-d84d506c1c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:04.298639 systemd[1]: Reload requested from client PID 3164 ('systemctl') (unit session-11.scope)... Sep 4 01:00:04.298646 systemd[1]: Reloading... Sep 4 01:00:04.342884 zram_generator::config[3209]: No configuration found. Sep 4 01:00:04.403622 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 4 01:00:04.500208 systemd[1]: Reloading finished in 201 ms. Sep 4 01:00:04.518057 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 01:00:04.527378 systemd[1]: kubelet.service: Deactivated successfully. Sep 4 01:00:04.527510 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 01:00:04.527536 systemd[1]: kubelet.service: Consumed 874ms CPU time, 136.9M memory peak. Sep 4 01:00:04.528531 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 01:00:04.873038 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 01:00:04.896592 (kubelet)[3273]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 4 01:00:04.943619 kubelet[3273]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 01:00:04.943619 kubelet[3273]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
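The repeated "Failed creating a mirror pod ... no PriorityClass with name system-node-critical was found" entries above clear up once the API server's default priority classes exist. A small client-go check for that object is sketched below; the kubeconfig path is an assumption.

package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	pc, err := cs.SchedulingV1().PriorityClasses().Get(context.TODO(), "system-node-critical", metav1.GetOptions{})
	if apierrors.IsNotFound(err) {
		// Same condition the kubelet hits while creating mirror pods for the static control-plane pods.
		fmt.Println("system-node-critical not created yet")
		return
	}
	if err != nil {
		panic(err)
	}
	fmt.Println("priority class present, value:", pc.Value)
}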
Sep 4 01:00:04.943619 kubelet[3273]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 01:00:04.943930 kubelet[3273]: I0904 01:00:04.943660 3273 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 4 01:00:04.949354 kubelet[3273]: I0904 01:00:04.949316 3273 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 4 01:00:04.949354 kubelet[3273]: I0904 01:00:04.949332 3273 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 4 01:00:04.949517 kubelet[3273]: I0904 01:00:04.949488 3273 server.go:956] "Client rotation is on, will bootstrap in background" Sep 4 01:00:04.950602 kubelet[3273]: I0904 01:00:04.950570 3273 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 4 01:00:04.952447 kubelet[3273]: I0904 01:00:04.952403 3273 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 4 01:00:04.955000 kubelet[3273]: I0904 01:00:04.954985 3273 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 4 01:00:04.965324 kubelet[3273]: I0904 01:00:04.965310 3273 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 4 01:00:04.965470 kubelet[3273]: I0904 01:00:04.965453 3273 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 4 01:00:04.965590 kubelet[3273]: I0904 01:00:04.965473 3273 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372.1.0-n-d84d506c1c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 4 01:00:04.965665 kubelet[3273]: I0904 01:00:04.965599 3273 topology_manager.go:138] "Creating topology manager with none policy" Sep 4 01:00:04.965665 
kubelet[3273]: I0904 01:00:04.965607 3273 container_manager_linux.go:303] "Creating device plugin manager" Sep 4 01:00:04.965665 kubelet[3273]: I0904 01:00:04.965644 3273 state_mem.go:36] "Initialized new in-memory state store" Sep 4 01:00:04.965865 kubelet[3273]: I0904 01:00:04.965818 3273 kubelet.go:480] "Attempting to sync node with API server" Sep 4 01:00:04.965865 kubelet[3273]: I0904 01:00:04.965829 3273 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 4 01:00:04.965865 kubelet[3273]: I0904 01:00:04.965846 3273 kubelet.go:386] "Adding apiserver pod source" Sep 4 01:00:04.965865 kubelet[3273]: I0904 01:00:04.965854 3273 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 4 01:00:04.966475 kubelet[3273]: I0904 01:00:04.966459 3273 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Sep 4 01:00:04.966946 kubelet[3273]: I0904 01:00:04.966906 3273 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 4 01:00:04.968668 kubelet[3273]: I0904 01:00:04.968630 3273 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 4 01:00:04.968668 kubelet[3273]: I0904 01:00:04.968664 3273 server.go:1289] "Started kubelet" Sep 4 01:00:04.968765 kubelet[3273]: I0904 01:00:04.968716 3273 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 4 01:00:04.968796 kubelet[3273]: I0904 01:00:04.968731 3273 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 4 01:00:04.969070 kubelet[3273]: I0904 01:00:04.969053 3273 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 4 01:00:04.971181 kubelet[3273]: I0904 01:00:04.971156 3273 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 4 01:00:04.971557 kubelet[3273]: I0904 01:00:04.971315 3273 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 4 01:00:04.971557 kubelet[3273]: E0904 01:00:04.971331 3273 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-d84d506c1c\" not found" Sep 4 01:00:04.971557 kubelet[3273]: I0904 01:00:04.971340 3273 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 4 01:00:04.971557 kubelet[3273]: I0904 01:00:04.971365 3273 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 4 01:00:04.971765 kubelet[3273]: I0904 01:00:04.971592 3273 reconciler.go:26] "Reconciler: start to sync state" Sep 4 01:00:04.972371 kubelet[3273]: E0904 01:00:04.972347 3273 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 4 01:00:04.973449 kubelet[3273]: I0904 01:00:04.973432 3273 factory.go:223] Registration of the systemd container factory successfully Sep 4 01:00:04.973527 kubelet[3273]: I0904 01:00:04.973490 3273 server.go:317] "Adding debug handlers to kubelet server" Sep 4 01:00:04.973527 kubelet[3273]: I0904 01:00:04.973508 3273 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 4 01:00:04.974556 kubelet[3273]: I0904 01:00:04.974537 3273 factory.go:223] Registration of the containerd container factory successfully Sep 4 01:00:04.976827 kubelet[3273]: I0904 01:00:04.976800 3273 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 4 01:00:04.981483 kubelet[3273]: I0904 01:00:04.981466 3273 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 4 01:00:04.981483 kubelet[3273]: I0904 01:00:04.981483 3273 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 4 01:00:04.981634 kubelet[3273]: I0904 01:00:04.981497 3273 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 4 01:00:04.981634 kubelet[3273]: I0904 01:00:04.981503 3273 kubelet.go:2436] "Starting kubelet main sync loop" Sep 4 01:00:04.981634 kubelet[3273]: E0904 01:00:04.981534 3273 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 4 01:00:04.992536 kubelet[3273]: I0904 01:00:04.992522 3273 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 4 01:00:04.992536 kubelet[3273]: I0904 01:00:04.992531 3273 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 4 01:00:04.992536 kubelet[3273]: I0904 01:00:04.992542 3273 state_mem.go:36] "Initialized new in-memory state store" Sep 4 01:00:04.992665 kubelet[3273]: I0904 01:00:04.992620 3273 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 4 01:00:04.992665 kubelet[3273]: I0904 01:00:04.992626 3273 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 4 01:00:04.992665 kubelet[3273]: I0904 01:00:04.992640 3273 policy_none.go:49] "None policy: Start" Sep 4 01:00:04.992665 kubelet[3273]: I0904 01:00:04.992645 3273 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 4 01:00:04.992665 kubelet[3273]: I0904 01:00:04.992651 3273 state_mem.go:35] "Initializing new in-memory state store" Sep 4 01:00:04.992742 kubelet[3273]: I0904 01:00:04.992703 3273 state_mem.go:75] "Updated machine memory state" Sep 4 01:00:04.994742 kubelet[3273]: E0904 01:00:04.994731 3273 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 4 01:00:04.994852 kubelet[3273]: I0904 01:00:04.994844 3273 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 4 01:00:04.994887 kubelet[3273]: I0904 01:00:04.994851 3273 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 4 01:00:04.994944 kubelet[3273]: I0904 01:00:04.994936 3273 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 4 01:00:04.995242 kubelet[3273]: E0904 01:00:04.995233 3273 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 4 01:00:05.085717 kubelet[3273]: I0904 01:00:05.085065 3273 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.085717 kubelet[3273]: I0904 01:00:05.085192 3273 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.086418 kubelet[3273]: I0904 01:00:05.086006 3273 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.092983 kubelet[3273]: I0904 01:00:05.092905 3273 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 4 01:00:05.093962 kubelet[3273]: I0904 01:00:05.093910 3273 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 4 01:00:05.094314 kubelet[3273]: I0904 01:00:05.094271 3273 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 4 01:00:05.099989 kubelet[3273]: I0904 01:00:05.099897 3273 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.108894 kubelet[3273]: I0904 01:00:05.108809 3273 kubelet_node_status.go:124] "Node was previously registered" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.109081 kubelet[3273]: I0904 01:00:05.108960 3273 kubelet_node_status.go:78] "Successfully registered node" node="ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.172494 kubelet[3273]: I0904 01:00:05.172252 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8f2f9bcd218aa96c3b33aaf6b645a9ff-ca-certs\") pod \"kube-apiserver-ci-4372.1.0-n-d84d506c1c\" (UID: \"8f2f9bcd218aa96c3b33aaf6b645a9ff\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.172494 kubelet[3273]: I0904 01:00:05.172349 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8f2f9bcd218aa96c3b33aaf6b645a9ff-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372.1.0-n-d84d506c1c\" (UID: \"8f2f9bcd218aa96c3b33aaf6b645a9ff\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.172494 kubelet[3273]: I0904 01:00:05.172421 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c35b01e0d8394919006dd99e480a2620-flexvolume-dir\") pod \"kube-controller-manager-ci-4372.1.0-n-d84d506c1c\" (UID: \"c35b01e0d8394919006dd99e480a2620\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.172969 kubelet[3273]: I0904 01:00:05.172542 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c35b01e0d8394919006dd99e480a2620-kubeconfig\") pod \"kube-controller-manager-ci-4372.1.0-n-d84d506c1c\" (UID: \"c35b01e0d8394919006dd99e480a2620\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.172969 kubelet[3273]: I0904 01:00:05.172662 3273 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8f2f9bcd218aa96c3b33aaf6b645a9ff-k8s-certs\") pod \"kube-apiserver-ci-4372.1.0-n-d84d506c1c\" (UID: \"8f2f9bcd218aa96c3b33aaf6b645a9ff\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.172969 kubelet[3273]: I0904 01:00:05.172779 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c35b01e0d8394919006dd99e480a2620-ca-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-d84d506c1c\" (UID: \"c35b01e0d8394919006dd99e480a2620\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.172969 kubelet[3273]: I0904 01:00:05.172840 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c35b01e0d8394919006dd99e480a2620-k8s-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-d84d506c1c\" (UID: \"c35b01e0d8394919006dd99e480a2620\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.172969 kubelet[3273]: I0904 01:00:05.172892 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c35b01e0d8394919006dd99e480a2620-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372.1.0-n-d84d506c1c\" (UID: \"c35b01e0d8394919006dd99e480a2620\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.173572 kubelet[3273]: I0904 01:00:05.172957 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ee14763a409cb98df895e074db1cb54e-kubeconfig\") pod \"kube-scheduler-ci-4372.1.0-n-d84d506c1c\" (UID: \"ee14763a409cb98df895e074db1cb54e\") " pod="kube-system/kube-scheduler-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.966268 kubelet[3273]: I0904 01:00:05.966221 3273 apiserver.go:52] "Watching apiserver" Sep 4 01:00:05.972072 kubelet[3273]: I0904 01:00:05.972058 3273 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 4 01:00:05.987406 kubelet[3273]: I0904 01:00:05.987389 3273 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.987494 kubelet[3273]: I0904 01:00:05.987427 3273 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.989984 kubelet[3273]: I0904 01:00:05.989969 3273 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 4 01:00:05.990053 kubelet[3273]: E0904 01:00:05.989999 3273 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372.1.0-n-d84d506c1c\" already exists" pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.990150 kubelet[3273]: I0904 01:00:05.990141 3273 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 4 01:00:05.990187 kubelet[3273]: E0904 01:00:05.990159 3273 kubelet.go:3311] "Failed creating a mirror pod" err="pods 
\"kube-controller-manager-ci-4372.1.0-n-d84d506c1c\" already exists" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" Sep 4 01:00:05.998807 kubelet[3273]: I0904 01:00:05.998724 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-d84d506c1c" podStartSLOduration=0.998714088 podStartE2EDuration="998.714088ms" podCreationTimestamp="2025-09-04 01:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 01:00:05.99866091 +0000 UTC m=+1.092534811" watchObservedRunningTime="2025-09-04 01:00:05.998714088 +0000 UTC m=+1.092587991" Sep 4 01:00:06.002864 kubelet[3273]: I0904 01:00:06.002809 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4372.1.0-n-d84d506c1c" podStartSLOduration=1.002800071 podStartE2EDuration="1.002800071s" podCreationTimestamp="2025-09-04 01:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 01:00:06.002761544 +0000 UTC m=+1.096635446" watchObservedRunningTime="2025-09-04 01:00:06.002800071 +0000 UTC m=+1.096673973" Sep 4 01:00:06.015490 kubelet[3273]: I0904 01:00:06.015456 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4372.1.0-n-d84d506c1c" podStartSLOduration=1.015444869 podStartE2EDuration="1.015444869s" podCreationTimestamp="2025-09-04 01:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 01:00:06.00650496 +0000 UTC m=+1.100378861" watchObservedRunningTime="2025-09-04 01:00:06.015444869 +0000 UTC m=+1.109318767" Sep 4 01:00:11.304680 kubelet[3273]: I0904 01:00:11.304602 3273 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 4 01:00:11.305843 containerd[1911]: time="2025-09-04T01:00:11.305383275Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 4 01:00:11.306699 kubelet[3273]: I0904 01:00:11.305898 3273 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 4 01:00:12.068216 systemd[1]: Created slice kubepods-besteffort-podaa0808ee_93d7_4e8d_8292_33a6509999c3.slice - libcontainer container kubepods-besteffort-podaa0808ee_93d7_4e8d_8292_33a6509999c3.slice. 
Sep 4 01:00:12.121484 kubelet[3273]: I0904 01:00:12.121400 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/aa0808ee-93d7-4e8d-8292-33a6509999c3-kube-proxy\") pod \"kube-proxy-6cfml\" (UID: \"aa0808ee-93d7-4e8d-8292-33a6509999c3\") " pod="kube-system/kube-proxy-6cfml" Sep 4 01:00:12.121484 kubelet[3273]: I0904 01:00:12.121470 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/aa0808ee-93d7-4e8d-8292-33a6509999c3-xtables-lock\") pod \"kube-proxy-6cfml\" (UID: \"aa0808ee-93d7-4e8d-8292-33a6509999c3\") " pod="kube-system/kube-proxy-6cfml" Sep 4 01:00:12.121823 kubelet[3273]: I0904 01:00:12.121504 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aa0808ee-93d7-4e8d-8292-33a6509999c3-lib-modules\") pod \"kube-proxy-6cfml\" (UID: \"aa0808ee-93d7-4e8d-8292-33a6509999c3\") " pod="kube-system/kube-proxy-6cfml" Sep 4 01:00:12.121823 kubelet[3273]: I0904 01:00:12.121537 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5tpv\" (UniqueName: \"kubernetes.io/projected/aa0808ee-93d7-4e8d-8292-33a6509999c3-kube-api-access-b5tpv\") pod \"kube-proxy-6cfml\" (UID: \"aa0808ee-93d7-4e8d-8292-33a6509999c3\") " pod="kube-system/kube-proxy-6cfml" Sep 4 01:00:12.380282 containerd[1911]: time="2025-09-04T01:00:12.380047333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6cfml,Uid:aa0808ee-93d7-4e8d-8292-33a6509999c3,Namespace:kube-system,Attempt:0,}" Sep 4 01:00:12.388360 containerd[1911]: time="2025-09-04T01:00:12.388329989Z" level=info msg="connecting to shim e2303f27b7d013f81451486182e67b845e80c83e6cbfce2f91600062893afd99" address="unix:///run/containerd/s/5514764eeaccfd19775423b0b636d85bd42897b2d72622584d210b6abe86aaa6" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:00:12.408272 systemd[1]: Started cri-containerd-e2303f27b7d013f81451486182e67b845e80c83e6cbfce2f91600062893afd99.scope - libcontainer container e2303f27b7d013f81451486182e67b845e80c83e6cbfce2f91600062893afd99. Sep 4 01:00:12.423685 systemd[1]: Created slice kubepods-besteffort-pod71f9ebf5_8380_4673_8c65_7de8d7f3b34a.slice - libcontainer container kubepods-besteffort-pod71f9ebf5_8380_4673_8c65_7de8d7f3b34a.slice. 
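Among the kube-proxy volumes attached above is a ConfigMap-backed volume named "kube-proxy"; fetching that ConfigMap shows the configuration the DaemonSet pod actually mounts. A sketch follows, assuming the object name matches the volume name and using an assumed kubeconfig path.

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	cm, err := cs.CoreV1().ConfigMaps("kube-system").Get(context.TODO(), "kube-proxy", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	// Typically carries config.conf and kubeconfig.conf keys.
	for name := range cm.Data {
		fmt.Println("config key:", name)
	}
}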
Sep 4 01:00:12.425198 kubelet[3273]: I0904 01:00:12.425105 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88z2m\" (UniqueName: \"kubernetes.io/projected/71f9ebf5-8380-4673-8c65-7de8d7f3b34a-kube-api-access-88z2m\") pod \"tigera-operator-755d956888-qrncf\" (UID: \"71f9ebf5-8380-4673-8c65-7de8d7f3b34a\") " pod="tigera-operator/tigera-operator-755d956888-qrncf" Sep 4 01:00:12.426176 kubelet[3273]: I0904 01:00:12.425225 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/71f9ebf5-8380-4673-8c65-7de8d7f3b34a-var-lib-calico\") pod \"tigera-operator-755d956888-qrncf\" (UID: \"71f9ebf5-8380-4673-8c65-7de8d7f3b34a\") " pod="tigera-operator/tigera-operator-755d956888-qrncf" Sep 4 01:00:12.471022 containerd[1911]: time="2025-09-04T01:00:12.470973196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6cfml,Uid:aa0808ee-93d7-4e8d-8292-33a6509999c3,Namespace:kube-system,Attempt:0,} returns sandbox id \"e2303f27b7d013f81451486182e67b845e80c83e6cbfce2f91600062893afd99\"" Sep 4 01:00:12.472695 containerd[1911]: time="2025-09-04T01:00:12.472680439Z" level=info msg="CreateContainer within sandbox \"e2303f27b7d013f81451486182e67b845e80c83e6cbfce2f91600062893afd99\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 4 01:00:12.476398 containerd[1911]: time="2025-09-04T01:00:12.476382922Z" level=info msg="Container fa89991706e2e3607c51ca350a31c41dd91eff2931ab19d3c1bdeed73c847bdd: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:00:12.479284 containerd[1911]: time="2025-09-04T01:00:12.479243142Z" level=info msg="CreateContainer within sandbox \"e2303f27b7d013f81451486182e67b845e80c83e6cbfce2f91600062893afd99\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"fa89991706e2e3607c51ca350a31c41dd91eff2931ab19d3c1bdeed73c847bdd\"" Sep 4 01:00:12.479494 containerd[1911]: time="2025-09-04T01:00:12.479472397Z" level=info msg="StartContainer for \"fa89991706e2e3607c51ca350a31c41dd91eff2931ab19d3c1bdeed73c847bdd\"" Sep 4 01:00:12.480193 containerd[1911]: time="2025-09-04T01:00:12.480153313Z" level=info msg="connecting to shim fa89991706e2e3607c51ca350a31c41dd91eff2931ab19d3c1bdeed73c847bdd" address="unix:///run/containerd/s/5514764eeaccfd19775423b0b636d85bd42897b2d72622584d210b6abe86aaa6" protocol=ttrpc version=3 Sep 4 01:00:12.500928 systemd[1]: Started cri-containerd-fa89991706e2e3607c51ca350a31c41dd91eff2931ab19d3c1bdeed73c847bdd.scope - libcontainer container fa89991706e2e3607c51ca350a31c41dd91eff2931ab19d3c1bdeed73c847bdd. 
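Once kube-proxy's container starts (next entries), the pods placed on this node can be listed with a spec.nodeName field selector, the same kind of filter the kubelet's own pod watch uses. A sketch is below; node name is from the log, the kubeconfig path is an assumption.

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	pods, err := cs.CoreV1().Pods("").List(context.TODO(), metav1.ListOptions{
		FieldSelector: "spec.nodeName=ci-4372.1.0-n-d84d506c1c",
	})
	if err != nil {
		panic(err)
	}
	for _, p := range pods.Items {
		fmt.Printf("%s/%s %s\n", p.Namespace, p.Name, p.Status.Phase)
	}
}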
Sep 4 01:00:12.523452 containerd[1911]: time="2025-09-04T01:00:12.523426691Z" level=info msg="StartContainer for \"fa89991706e2e3607c51ca350a31c41dd91eff2931ab19d3c1bdeed73c847bdd\" returns successfully" Sep 4 01:00:12.731526 containerd[1911]: time="2025-09-04T01:00:12.731431651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-qrncf,Uid:71f9ebf5-8380-4673-8c65-7de8d7f3b34a,Namespace:tigera-operator,Attempt:0,}" Sep 4 01:00:12.739247 containerd[1911]: time="2025-09-04T01:00:12.739193674Z" level=info msg="connecting to shim 65a4845e07f69cced882400b8dc26dd99450f6daa6b4feb6019586c0ccb8b26f" address="unix:///run/containerd/s/476300a2867d8432fae44a51ffeae838bc6e1e94427a0e93bcaab5bb61d3009a" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:00:12.761890 systemd[1]: Started cri-containerd-65a4845e07f69cced882400b8dc26dd99450f6daa6b4feb6019586c0ccb8b26f.scope - libcontainer container 65a4845e07f69cced882400b8dc26dd99450f6daa6b4feb6019586c0ccb8b26f. Sep 4 01:00:12.794447 containerd[1911]: time="2025-09-04T01:00:12.794427069Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-qrncf,Uid:71f9ebf5-8380-4673-8c65-7de8d7f3b34a,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"65a4845e07f69cced882400b8dc26dd99450f6daa6b4feb6019586c0ccb8b26f\"" Sep 4 01:00:12.795082 containerd[1911]: time="2025-09-04T01:00:12.795069106Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 4 01:00:14.626231 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount67974806.mount: Deactivated successfully. Sep 4 01:00:15.194305 containerd[1911]: time="2025-09-04T01:00:15.194252836Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:00:15.194506 containerd[1911]: time="2025-09-04T01:00:15.194465894Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 4 01:00:15.194805 containerd[1911]: time="2025-09-04T01:00:15.194775133Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:00:15.195660 containerd[1911]: time="2025-09-04T01:00:15.195619027Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:00:15.196059 containerd[1911]: time="2025-09-04T01:00:15.196017374Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 2.400932273s" Sep 4 01:00:15.196059 containerd[1911]: time="2025-09-04T01:00:15.196032340Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 4 01:00:15.197418 containerd[1911]: time="2025-09-04T01:00:15.197405648Z" level=info msg="CreateContainer within sandbox \"65a4845e07f69cced882400b8dc26dd99450f6daa6b4feb6019586c0ccb8b26f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 4 01:00:15.200049 containerd[1911]: time="2025-09-04T01:00:15.200035553Z" level=info 
msg="Container ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:00:15.201985 containerd[1911]: time="2025-09-04T01:00:15.201969629Z" level=info msg="CreateContainer within sandbox \"65a4845e07f69cced882400b8dc26dd99450f6daa6b4feb6019586c0ccb8b26f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a\"" Sep 4 01:00:15.202244 containerd[1911]: time="2025-09-04T01:00:15.202212008Z" level=info msg="StartContainer for \"ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a\"" Sep 4 01:00:15.202609 containerd[1911]: time="2025-09-04T01:00:15.202573331Z" level=info msg="connecting to shim ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a" address="unix:///run/containerd/s/476300a2867d8432fae44a51ffeae838bc6e1e94427a0e93bcaab5bb61d3009a" protocol=ttrpc version=3 Sep 4 01:00:15.219956 systemd[1]: Started cri-containerd-ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a.scope - libcontainer container ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a. Sep 4 01:00:15.280361 containerd[1911]: time="2025-09-04T01:00:15.280319068Z" level=info msg="StartContainer for \"ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a\" returns successfully" Sep 4 01:00:16.037379 kubelet[3273]: I0904 01:00:16.037206 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-6cfml" podStartSLOduration=4.037168597 podStartE2EDuration="4.037168597s" podCreationTimestamp="2025-09-04 01:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 01:00:13.025379796 +0000 UTC m=+8.119253765" watchObservedRunningTime="2025-09-04 01:00:16.037168597 +0000 UTC m=+11.131042549" Sep 4 01:00:16.038327 kubelet[3273]: I0904 01:00:16.037521 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-qrncf" podStartSLOduration=1.6360132630000002 podStartE2EDuration="4.037497279s" podCreationTimestamp="2025-09-04 01:00:12 +0000 UTC" firstStartedPulling="2025-09-04 01:00:12.794919468 +0000 UTC m=+7.888793365" lastFinishedPulling="2025-09-04 01:00:15.19640348 +0000 UTC m=+10.290277381" observedRunningTime="2025-09-04 01:00:16.037437376 +0000 UTC m=+11.131311351" watchObservedRunningTime="2025-09-04 01:00:16.037497279 +0000 UTC m=+11.131371234" Sep 4 01:00:16.657344 systemd[1]: cri-containerd-ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a.scope: Deactivated successfully. 
Sep 4 01:00:16.658642 containerd[1911]: time="2025-09-04T01:00:16.658472435Z" level=info msg="received exit event container_id:\"ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a\" id:\"ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a\" pid:3648 exit_status:1 exited_at:{seconds:1756947616 nanos:658094490}" Sep 4 01:00:16.658642 containerd[1911]: time="2025-09-04T01:00:16.658621879Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a\" id:\"ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a\" pid:3648 exit_status:1 exited_at:{seconds:1756947616 nanos:658094490}" Sep 4 01:00:16.672787 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a-rootfs.mount: Deactivated successfully. Sep 4 01:00:16.852882 update_engine[1905]: I20250904 01:00:16.852823 1905 update_attempter.cc:509] Updating boot flags... Sep 4 01:00:18.028465 kubelet[3273]: I0904 01:00:18.028357 3273 scope.go:117] "RemoveContainer" containerID="ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a" Sep 4 01:00:18.031920 containerd[1911]: time="2025-09-04T01:00:18.031829357Z" level=info msg="CreateContainer within sandbox \"65a4845e07f69cced882400b8dc26dd99450f6daa6b4feb6019586c0ccb8b26f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 4 01:00:18.037322 containerd[1911]: time="2025-09-04T01:00:18.037278260Z" level=info msg="Container dcf607762aed8f64bab027be0004f4df972c9913f8a95d8b3de731158b594670: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:00:18.040091 containerd[1911]: time="2025-09-04T01:00:18.040067845Z" level=info msg="CreateContainer within sandbox \"65a4845e07f69cced882400b8dc26dd99450f6daa6b4feb6019586c0ccb8b26f\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"dcf607762aed8f64bab027be0004f4df972c9913f8a95d8b3de731158b594670\"" Sep 4 01:00:18.040396 containerd[1911]: time="2025-09-04T01:00:18.040355997Z" level=info msg="StartContainer for \"dcf607762aed8f64bab027be0004f4df972c9913f8a95d8b3de731158b594670\"" Sep 4 01:00:18.040920 containerd[1911]: time="2025-09-04T01:00:18.040909807Z" level=info msg="connecting to shim dcf607762aed8f64bab027be0004f4df972c9913f8a95d8b3de731158b594670" address="unix:///run/containerd/s/476300a2867d8432fae44a51ffeae838bc6e1e94427a0e93bcaab5bb61d3009a" protocol=ttrpc version=3 Sep 4 01:00:18.061942 systemd[1]: Started cri-containerd-dcf607762aed8f64bab027be0004f4df972c9913f8a95d8b3de731158b594670.scope - libcontainer container dcf607762aed8f64bab027be0004f4df972c9913f8a95d8b3de731158b594670. Sep 4 01:00:18.075573 containerd[1911]: time="2025-09-04T01:00:18.075553974Z" level=info msg="StartContainer for \"dcf607762aed8f64bab027be0004f4df972c9913f8a95d8b3de731158b594670\" returns successfully" Sep 4 01:00:19.584760 sudo[2216]: pam_unix(sudo:session): session closed for user root Sep 4 01:00:19.585462 sshd[2215]: Connection closed by 147.75.109.163 port 38500 Sep 4 01:00:19.585613 sshd-session[2213]: pam_unix(sshd:session): session closed for user core Sep 4 01:00:19.587163 systemd[1]: sshd@8-147.75.202.229:22-147.75.109.163:38500.service: Deactivated successfully. Sep 4 01:00:19.588122 systemd[1]: session-11.scope: Deactivated successfully. Sep 4 01:00:19.588217 systemd[1]: session-11.scope: Consumed 4.283s CPU time, 236.6M memory peak. Sep 4 01:00:19.589212 systemd-logind[1900]: Session 11 logged out. 
Waiting for processes to exit. Sep 4 01:00:19.589893 systemd-logind[1900]: Removed session 11. Sep 4 01:00:23.190707 systemd[1]: Created slice kubepods-besteffort-podbdef5de7_06c2_4880_b6ea_8be85117f7b5.slice - libcontainer container kubepods-besteffort-podbdef5de7_06c2_4880_b6ea_8be85117f7b5.slice. Sep 4 01:00:23.201858 kubelet[3273]: I0904 01:00:23.201820 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p6pg\" (UniqueName: \"kubernetes.io/projected/bdef5de7-06c2-4880-b6ea-8be85117f7b5-kube-api-access-2p6pg\") pod \"calico-typha-7d78fdd59-h92dm\" (UID: \"bdef5de7-06c2-4880-b6ea-8be85117f7b5\") " pod="calico-system/calico-typha-7d78fdd59-h92dm" Sep 4 01:00:23.201858 kubelet[3273]: I0904 01:00:23.201861 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdef5de7-06c2-4880-b6ea-8be85117f7b5-tigera-ca-bundle\") pod \"calico-typha-7d78fdd59-h92dm\" (UID: \"bdef5de7-06c2-4880-b6ea-8be85117f7b5\") " pod="calico-system/calico-typha-7d78fdd59-h92dm" Sep 4 01:00:23.202369 kubelet[3273]: I0904 01:00:23.201882 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/bdef5de7-06c2-4880-b6ea-8be85117f7b5-typha-certs\") pod \"calico-typha-7d78fdd59-h92dm\" (UID: \"bdef5de7-06c2-4880-b6ea-8be85117f7b5\") " pod="calico-system/calico-typha-7d78fdd59-h92dm" Sep 4 01:00:23.494927 containerd[1911]: time="2025-09-04T01:00:23.494841481Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7d78fdd59-h92dm,Uid:bdef5de7-06c2-4880-b6ea-8be85117f7b5,Namespace:calico-system,Attempt:0,}" Sep 4 01:00:23.502323 containerd[1911]: time="2025-09-04T01:00:23.502271330Z" level=info msg="connecting to shim e09cbb07f8bd99c1bb5f5c370a0c0508e081f905d040b135da933f4c3d0a3516" address="unix:///run/containerd/s/8dbc4dd7872ef22d4b4357d74165cf7892b6e4e5d031dbaf7f723f887b68d94d" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:00:23.531920 systemd[1]: Started cri-containerd-e09cbb07f8bd99c1bb5f5c370a0c0508e081f905d040b135da933f4c3d0a3516.scope - libcontainer container e09cbb07f8bd99c1bb5f5c370a0c0508e081f905d040b135da933f4c3d0a3516. Sep 4 01:00:23.556745 containerd[1911]: time="2025-09-04T01:00:23.556723764Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7d78fdd59-h92dm,Uid:bdef5de7-06c2-4880-b6ea-8be85117f7b5,Namespace:calico-system,Attempt:0,} returns sandbox id \"e09cbb07f8bd99c1bb5f5c370a0c0508e081f905d040b135da933f4c3d0a3516\"" Sep 4 01:00:23.557364 containerd[1911]: time="2025-09-04T01:00:23.557321035Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 4 01:00:23.580564 systemd[1]: Created slice kubepods-besteffort-pod6f233ec0_99aa_4bb6_a1b0_7adbe186fe31.slice - libcontainer container kubepods-besteffort-pod6f233ec0_99aa_4bb6_a1b0_7adbe186fe31.slice. 
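The calico-typha volumes attached above reference a Secret-backed volume ("typha-certs") and a ConfigMap-backed volume ("tigera-ca-bundle"); if either object is missing, the kubelet cannot mount the pod's volumes and the pod stays in ContainerCreating. A quick existence check is sketched below, assuming the object names in the calico-system namespace match the volume names and using an assumed kubeconfig path.

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	ctx := context.TODO()
	if _, err := cs.CoreV1().Secrets("calico-system").Get(ctx, "typha-certs", metav1.GetOptions{}); err != nil {
		fmt.Println("typha-certs:", err)
	} else {
		fmt.Println("typha-certs present")
	}
	if _, err := cs.CoreV1().ConfigMaps("calico-system").Get(ctx, "tigera-ca-bundle", metav1.GetOptions{}); err != nil {
		fmt.Println("tigera-ca-bundle:", err)
	} else {
		fmt.Println("tigera-ca-bundle present")
	}
}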
Sep 4 01:00:23.605181 kubelet[3273]: I0904 01:00:23.605074 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6f233ec0-99aa-4bb6-a1b0-7adbe186fe31-flexvol-driver-host\") pod \"calico-node-5wqj7\" (UID: \"6f233ec0-99aa-4bb6-a1b0-7adbe186fe31\") " pod="calico-system/calico-node-5wqj7" Sep 4 01:00:23.605181 kubelet[3273]: I0904 01:00:23.605174 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/6f233ec0-99aa-4bb6-a1b0-7adbe186fe31-cni-log-dir\") pod \"calico-node-5wqj7\" (UID: \"6f233ec0-99aa-4bb6-a1b0-7adbe186fe31\") " pod="calico-system/calico-node-5wqj7" Sep 4 01:00:23.605525 kubelet[3273]: I0904 01:00:23.605231 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f233ec0-99aa-4bb6-a1b0-7adbe186fe31-lib-modules\") pod \"calico-node-5wqj7\" (UID: \"6f233ec0-99aa-4bb6-a1b0-7adbe186fe31\") " pod="calico-system/calico-node-5wqj7" Sep 4 01:00:23.605525 kubelet[3273]: I0904 01:00:23.605284 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6f233ec0-99aa-4bb6-a1b0-7adbe186fe31-node-certs\") pod \"calico-node-5wqj7\" (UID: \"6f233ec0-99aa-4bb6-a1b0-7adbe186fe31\") " pod="calico-system/calico-node-5wqj7" Sep 4 01:00:23.605525 kubelet[3273]: I0904 01:00:23.605410 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdrkl\" (UniqueName: \"kubernetes.io/projected/6f233ec0-99aa-4bb6-a1b0-7adbe186fe31-kube-api-access-hdrkl\") pod \"calico-node-5wqj7\" (UID: \"6f233ec0-99aa-4bb6-a1b0-7adbe186fe31\") " pod="calico-system/calico-node-5wqj7" Sep 4 01:00:23.605525 kubelet[3273]: I0904 01:00:23.605505 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6f233ec0-99aa-4bb6-a1b0-7adbe186fe31-var-lib-calico\") pod \"calico-node-5wqj7\" (UID: \"6f233ec0-99aa-4bb6-a1b0-7adbe186fe31\") " pod="calico-system/calico-node-5wqj7" Sep 4 01:00:23.605958 kubelet[3273]: I0904 01:00:23.605570 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6f233ec0-99aa-4bb6-a1b0-7adbe186fe31-cni-bin-dir\") pod \"calico-node-5wqj7\" (UID: \"6f233ec0-99aa-4bb6-a1b0-7adbe186fe31\") " pod="calico-system/calico-node-5wqj7" Sep 4 01:00:23.605958 kubelet[3273]: I0904 01:00:23.605618 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6f233ec0-99aa-4bb6-a1b0-7adbe186fe31-cni-net-dir\") pod \"calico-node-5wqj7\" (UID: \"6f233ec0-99aa-4bb6-a1b0-7adbe186fe31\") " pod="calico-system/calico-node-5wqj7" Sep 4 01:00:23.605958 kubelet[3273]: I0904 01:00:23.605668 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6f233ec0-99aa-4bb6-a1b0-7adbe186fe31-policysync\") pod \"calico-node-5wqj7\" (UID: \"6f233ec0-99aa-4bb6-a1b0-7adbe186fe31\") " pod="calico-system/calico-node-5wqj7" Sep 4 01:00:23.605958 kubelet[3273]: I0904 01:00:23.605719 3273 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6f233ec0-99aa-4bb6-a1b0-7adbe186fe31-var-run-calico\") pod \"calico-node-5wqj7\" (UID: \"6f233ec0-99aa-4bb6-a1b0-7adbe186fe31\") " pod="calico-system/calico-node-5wqj7" Sep 4 01:00:23.605958 kubelet[3273]: I0904 01:00:23.605796 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f233ec0-99aa-4bb6-a1b0-7adbe186fe31-tigera-ca-bundle\") pod \"calico-node-5wqj7\" (UID: \"6f233ec0-99aa-4bb6-a1b0-7adbe186fe31\") " pod="calico-system/calico-node-5wqj7" Sep 4 01:00:23.606513 kubelet[3273]: I0904 01:00:23.605885 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6f233ec0-99aa-4bb6-a1b0-7adbe186fe31-xtables-lock\") pod \"calico-node-5wqj7\" (UID: \"6f233ec0-99aa-4bb6-a1b0-7adbe186fe31\") " pod="calico-system/calico-node-5wqj7" Sep 4 01:00:23.707362 kubelet[3273]: E0904 01:00:23.707338 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.707362 kubelet[3273]: W0904 01:00:23.707355 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.707518 kubelet[3273]: E0904 01:00:23.707374 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.707686 kubelet[3273]: E0904 01:00:23.707675 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.707686 kubelet[3273]: W0904 01:00:23.707686 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.707787 kubelet[3273]: E0904 01:00:23.707696 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.709573 kubelet[3273]: E0904 01:00:23.709536 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.709573 kubelet[3273]: W0904 01:00:23.709545 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.709573 kubelet[3273]: E0904 01:00:23.709554 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:23.713593 kubelet[3273]: E0904 01:00:23.713578 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.713593 kubelet[3273]: W0904 01:00:23.713591 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.713685 kubelet[3273]: E0904 01:00:23.713606 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.857441 kubelet[3273]: E0904 01:00:23.857279 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:23.882488 containerd[1911]: time="2025-09-04T01:00:23.882419117Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5wqj7,Uid:6f233ec0-99aa-4bb6-a1b0-7adbe186fe31,Namespace:calico-system,Attempt:0,}" Sep 4 01:00:23.889726 containerd[1911]: time="2025-09-04T01:00:23.889703056Z" level=info msg="connecting to shim 79c3a666fd74a4976859e26bdbcf799786db5d95440f7549e8e567f849176563" address="unix:///run/containerd/s/90cda3c5fef613ada18aa40b19f0d20894165878a4172eef483fe0625d47fb2d" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:00:23.889888 kubelet[3273]: E0904 01:00:23.889875 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.889888 kubelet[3273]: W0904 01:00:23.889887 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.889955 kubelet[3273]: E0904 01:00:23.889899 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.890062 kubelet[3273]: E0904 01:00:23.890008 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.890062 kubelet[3273]: W0904 01:00:23.890013 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.890062 kubelet[3273]: E0904 01:00:23.890033 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:23.890179 kubelet[3273]: E0904 01:00:23.890119 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.890179 kubelet[3273]: W0904 01:00:23.890138 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.890179 kubelet[3273]: E0904 01:00:23.890143 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.890300 kubelet[3273]: E0904 01:00:23.890252 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.890300 kubelet[3273]: W0904 01:00:23.890273 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.890300 kubelet[3273]: E0904 01:00:23.890277 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.890423 kubelet[3273]: E0904 01:00:23.890371 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.890423 kubelet[3273]: W0904 01:00:23.890376 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.890423 kubelet[3273]: E0904 01:00:23.890395 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.890536 kubelet[3273]: E0904 01:00:23.890506 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.890536 kubelet[3273]: W0904 01:00:23.890511 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.890536 kubelet[3273]: E0904 01:00:23.890516 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.890588 kubelet[3273]: E0904 01:00:23.890577 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.890588 kubelet[3273]: W0904 01:00:23.890582 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.890588 kubelet[3273]: E0904 01:00:23.890586 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:23.890653 kubelet[3273]: E0904 01:00:23.890648 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.890653 kubelet[3273]: W0904 01:00:23.890652 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.890687 kubelet[3273]: E0904 01:00:23.890656 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.890725 kubelet[3273]: E0904 01:00:23.890720 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.890725 kubelet[3273]: W0904 01:00:23.890724 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.890767 kubelet[3273]: E0904 01:00:23.890730 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.890822 kubelet[3273]: E0904 01:00:23.890817 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.890822 kubelet[3273]: W0904 01:00:23.890822 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.890858 kubelet[3273]: E0904 01:00:23.890826 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.890894 kubelet[3273]: E0904 01:00:23.890889 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.890894 kubelet[3273]: W0904 01:00:23.890893 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.890928 kubelet[3273]: E0904 01:00:23.890898 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.890966 kubelet[3273]: E0904 01:00:23.890961 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.890966 kubelet[3273]: W0904 01:00:23.890965 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.891001 kubelet[3273]: E0904 01:00:23.890969 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:23.891049 kubelet[3273]: E0904 01:00:23.891044 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.891067 kubelet[3273]: W0904 01:00:23.891049 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.891067 kubelet[3273]: E0904 01:00:23.891053 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.891125 kubelet[3273]: E0904 01:00:23.891120 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.891125 kubelet[3273]: W0904 01:00:23.891125 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.891211 kubelet[3273]: E0904 01:00:23.891129 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.891277 kubelet[3273]: E0904 01:00:23.891272 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.891277 kubelet[3273]: W0904 01:00:23.891276 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.891310 kubelet[3273]: E0904 01:00:23.891280 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.891348 kubelet[3273]: E0904 01:00:23.891343 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.891348 kubelet[3273]: W0904 01:00:23.891347 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.891384 kubelet[3273]: E0904 01:00:23.891351 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.891429 kubelet[3273]: E0904 01:00:23.891425 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.891429 kubelet[3273]: W0904 01:00:23.891429 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.891464 kubelet[3273]: E0904 01:00:23.891433 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:23.891504 kubelet[3273]: E0904 01:00:23.891499 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.891504 kubelet[3273]: W0904 01:00:23.891503 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.891538 kubelet[3273]: E0904 01:00:23.891507 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.891578 kubelet[3273]: E0904 01:00:23.891572 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.891598 kubelet[3273]: W0904 01:00:23.891578 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.891598 kubelet[3273]: E0904 01:00:23.891582 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.891648 kubelet[3273]: E0904 01:00:23.891643 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.891648 kubelet[3273]: W0904 01:00:23.891648 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.891683 kubelet[3273]: E0904 01:00:23.891652 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.909202 kubelet[3273]: E0904 01:00:23.909165 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.909202 kubelet[3273]: W0904 01:00:23.909177 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.909202 kubelet[3273]: E0904 01:00:23.909188 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:23.909202 kubelet[3273]: I0904 01:00:23.909204 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/427476d1-8991-4f95-a430-df7e5fb9ed65-varrun\") pod \"csi-node-driver-89qxc\" (UID: \"427476d1-8991-4f95-a430-df7e5fb9ed65\") " pod="calico-system/csi-node-driver-89qxc" Sep 4 01:00:23.909364 kubelet[3273]: E0904 01:00:23.909329 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.909364 kubelet[3273]: W0904 01:00:23.909336 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.909364 kubelet[3273]: E0904 01:00:23.909342 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.909364 kubelet[3273]: I0904 01:00:23.909353 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/427476d1-8991-4f95-a430-df7e5fb9ed65-registration-dir\") pod \"csi-node-driver-89qxc\" (UID: \"427476d1-8991-4f95-a430-df7e5fb9ed65\") " pod="calico-system/csi-node-driver-89qxc" Sep 4 01:00:23.909513 kubelet[3273]: E0904 01:00:23.909478 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.909513 kubelet[3273]: W0904 01:00:23.909487 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.909513 kubelet[3273]: E0904 01:00:23.909493 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.909599 kubelet[3273]: E0904 01:00:23.909594 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.909619 kubelet[3273]: W0904 01:00:23.909599 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.909619 kubelet[3273]: E0904 01:00:23.909604 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.909696 kubelet[3273]: E0904 01:00:23.909690 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.909717 kubelet[3273]: W0904 01:00:23.909696 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.909717 kubelet[3273]: E0904 01:00:23.909700 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:23.909717 kubelet[3273]: I0904 01:00:23.909711 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/427476d1-8991-4f95-a430-df7e5fb9ed65-kubelet-dir\") pod \"csi-node-driver-89qxc\" (UID: \"427476d1-8991-4f95-a430-df7e5fb9ed65\") " pod="calico-system/csi-node-driver-89qxc" Sep 4 01:00:23.909878 kubelet[3273]: E0904 01:00:23.909869 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.909900 kubelet[3273]: W0904 01:00:23.909879 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.909900 kubelet[3273]: E0904 01:00:23.909887 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.910010 kubelet[3273]: E0904 01:00:23.910003 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.910030 kubelet[3273]: W0904 01:00:23.910010 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.910030 kubelet[3273]: E0904 01:00:23.910017 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.910145 kubelet[3273]: E0904 01:00:23.910138 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.910166 kubelet[3273]: W0904 01:00:23.910146 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.910166 kubelet[3273]: E0904 01:00:23.910153 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.910204 kubelet[3273]: I0904 01:00:23.910168 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/427476d1-8991-4f95-a430-df7e5fb9ed65-socket-dir\") pod \"csi-node-driver-89qxc\" (UID: \"427476d1-8991-4f95-a430-df7e5fb9ed65\") " pod="calico-system/csi-node-driver-89qxc" Sep 4 01:00:23.910249 kubelet[3273]: E0904 01:00:23.910243 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.910271 kubelet[3273]: W0904 01:00:23.910249 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.910271 kubelet[3273]: E0904 01:00:23.910255 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:23.910328 kubelet[3273]: E0904 01:00:23.910323 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.910350 kubelet[3273]: W0904 01:00:23.910328 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.910350 kubelet[3273]: E0904 01:00:23.910334 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.910411 kubelet[3273]: E0904 01:00:23.910405 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.910411 kubelet[3273]: W0904 01:00:23.910410 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.910445 kubelet[3273]: E0904 01:00:23.910415 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.910445 kubelet[3273]: I0904 01:00:23.910426 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2dnl\" (UniqueName: \"kubernetes.io/projected/427476d1-8991-4f95-a430-df7e5fb9ed65-kube-api-access-r2dnl\") pod \"csi-node-driver-89qxc\" (UID: \"427476d1-8991-4f95-a430-df7e5fb9ed65\") " pod="calico-system/csi-node-driver-89qxc" Sep 4 01:00:23.910553 kubelet[3273]: E0904 01:00:23.910545 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.910573 kubelet[3273]: W0904 01:00:23.910553 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.910573 kubelet[3273]: E0904 01:00:23.910560 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.910640 kubelet[3273]: E0904 01:00:23.910635 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.910640 kubelet[3273]: W0904 01:00:23.910640 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.910674 kubelet[3273]: E0904 01:00:23.910645 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:23.910729 kubelet[3273]: E0904 01:00:23.910724 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.910754 kubelet[3273]: W0904 01:00:23.910729 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.910754 kubelet[3273]: E0904 01:00:23.910734 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.910811 kubelet[3273]: E0904 01:00:23.910806 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:23.910811 kubelet[3273]: W0904 01:00:23.910811 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:23.910849 kubelet[3273]: E0904 01:00:23.910815 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:23.911857 systemd[1]: Started cri-containerd-79c3a666fd74a4976859e26bdbcf799786db5d95440f7549e8e567f849176563.scope - libcontainer container 79c3a666fd74a4976859e26bdbcf799786db5d95440f7549e8e567f849176563. Sep 4 01:00:23.924674 containerd[1911]: time="2025-09-04T01:00:23.924650469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5wqj7,Uid:6f233ec0-99aa-4bb6-a1b0-7adbe186fe31,Namespace:calico-system,Attempt:0,} returns sandbox id \"79c3a666fd74a4976859e26bdbcf799786db5d95440f7549e8e567f849176563\"" Sep 4 01:00:24.012158 kubelet[3273]: E0904 01:00:24.012054 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.012158 kubelet[3273]: W0904 01:00:24.012103 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.012158 kubelet[3273]: E0904 01:00:24.012143 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.012882 kubelet[3273]: E0904 01:00:24.012801 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.012882 kubelet[3273]: W0904 01:00:24.012845 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.013178 kubelet[3273]: E0904 01:00:24.012905 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:24.013783 kubelet[3273]: E0904 01:00:24.013673 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.013783 kubelet[3273]: W0904 01:00:24.013724 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.014077 kubelet[3273]: E0904 01:00:24.013809 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.014397 kubelet[3273]: E0904 01:00:24.014316 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.014397 kubelet[3273]: W0904 01:00:24.014357 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.014397 kubelet[3273]: E0904 01:00:24.014398 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.015024 kubelet[3273]: E0904 01:00:24.014925 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.015024 kubelet[3273]: W0904 01:00:24.014964 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.015024 kubelet[3273]: E0904 01:00:24.015004 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.015621 kubelet[3273]: E0904 01:00:24.015560 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.015621 kubelet[3273]: W0904 01:00:24.015599 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.015864 kubelet[3273]: E0904 01:00:24.015640 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.016199 kubelet[3273]: E0904 01:00:24.016168 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.016323 kubelet[3273]: W0904 01:00:24.016196 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.016323 kubelet[3273]: E0904 01:00:24.016224 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:24.016739 kubelet[3273]: E0904 01:00:24.016705 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.016888 kubelet[3273]: W0904 01:00:24.016745 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.016888 kubelet[3273]: E0904 01:00:24.016809 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.017263 kubelet[3273]: E0904 01:00:24.017234 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.017263 kubelet[3273]: W0904 01:00:24.017260 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.017481 kubelet[3273]: E0904 01:00:24.017292 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.017740 kubelet[3273]: E0904 01:00:24.017709 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.017740 kubelet[3273]: W0904 01:00:24.017734 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.018010 kubelet[3273]: E0904 01:00:24.017787 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.018236 kubelet[3273]: E0904 01:00:24.018198 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.018236 kubelet[3273]: W0904 01:00:24.018225 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.018519 kubelet[3273]: E0904 01:00:24.018249 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.018664 kubelet[3273]: E0904 01:00:24.018637 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.018816 kubelet[3273]: W0904 01:00:24.018669 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.018816 kubelet[3273]: E0904 01:00:24.018707 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:24.019318 kubelet[3273]: E0904 01:00:24.019271 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.019504 kubelet[3273]: W0904 01:00:24.019316 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.019504 kubelet[3273]: E0904 01:00:24.019357 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.019880 kubelet[3273]: E0904 01:00:24.019820 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.019880 kubelet[3273]: W0904 01:00:24.019857 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.020086 kubelet[3273]: E0904 01:00:24.019900 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.020513 kubelet[3273]: E0904 01:00:24.020429 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.020513 kubelet[3273]: W0904 01:00:24.020470 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.020513 kubelet[3273]: E0904 01:00:24.020507 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.021104 kubelet[3273]: E0904 01:00:24.021032 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.021104 kubelet[3273]: W0904 01:00:24.021070 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.021320 kubelet[3273]: E0904 01:00:24.021109 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.021638 kubelet[3273]: E0904 01:00:24.021606 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.021791 kubelet[3273]: W0904 01:00:24.021642 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.021791 kubelet[3273]: E0904 01:00:24.021680 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:24.022231 kubelet[3273]: E0904 01:00:24.022199 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.022345 kubelet[3273]: W0904 01:00:24.022237 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.022345 kubelet[3273]: E0904 01:00:24.022276 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.022857 kubelet[3273]: E0904 01:00:24.022801 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.022857 kubelet[3273]: W0904 01:00:24.022840 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.023085 kubelet[3273]: E0904 01:00:24.022878 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.023500 kubelet[3273]: E0904 01:00:24.023447 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.023500 kubelet[3273]: W0904 01:00:24.023486 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.023712 kubelet[3273]: E0904 01:00:24.023527 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.024103 kubelet[3273]: E0904 01:00:24.024044 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.024103 kubelet[3273]: W0904 01:00:24.024078 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.024333 kubelet[3273]: E0904 01:00:24.024118 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.024747 kubelet[3273]: E0904 01:00:24.024709 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.024892 kubelet[3273]: W0904 01:00:24.024746 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.024892 kubelet[3273]: E0904 01:00:24.024824 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:24.025512 kubelet[3273]: E0904 01:00:24.025445 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.025512 kubelet[3273]: W0904 01:00:24.025484 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.025833 kubelet[3273]: E0904 01:00:24.025528 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.026166 kubelet[3273]: E0904 01:00:24.026077 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.026166 kubelet[3273]: W0904 01:00:24.026120 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.026448 kubelet[3273]: E0904 01:00:24.026170 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.026935 kubelet[3273]: E0904 01:00:24.026872 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.026935 kubelet[3273]: W0904 01:00:24.026911 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.027179 kubelet[3273]: E0904 01:00:24.026956 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:24.046152 kubelet[3273]: E0904 01:00:24.046064 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:24.046152 kubelet[3273]: W0904 01:00:24.046115 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:24.046454 kubelet[3273]: E0904 01:00:24.046170 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:24.982847 kubelet[3273]: E0904 01:00:24.982728 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:26.982853 kubelet[3273]: E0904 01:00:26.982705 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:28.982867 kubelet[3273]: E0904 01:00:28.982729 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:30.982983 kubelet[3273]: E0904 01:00:30.982883 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:32.983239 kubelet[3273]: E0904 01:00:32.983107 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:34.983464 kubelet[3273]: E0904 01:00:34.983352 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:35.577073 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2163796482.mount: Deactivated successfully. 
Sep 4 01:00:36.270025 containerd[1911]: time="2025-09-04T01:00:36.269976464Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:00:36.270228 containerd[1911]: time="2025-09-04T01:00:36.270182786Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 4 01:00:36.270883 containerd[1911]: time="2025-09-04T01:00:36.270862704Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:00:36.272483 containerd[1911]: time="2025-09-04T01:00:36.272443035Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:00:36.272694 containerd[1911]: time="2025-09-04T01:00:36.272659257Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 12.715323404s" Sep 4 01:00:36.272694 containerd[1911]: time="2025-09-04T01:00:36.272673770Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 4 01:00:36.273102 containerd[1911]: time="2025-09-04T01:00:36.273059098Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 4 01:00:36.276621 containerd[1911]: time="2025-09-04T01:00:36.276604643Z" level=info msg="CreateContainer within sandbox \"e09cbb07f8bd99c1bb5f5c370a0c0508e081f905d040b135da933f4c3d0a3516\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 4 01:00:36.279299 containerd[1911]: time="2025-09-04T01:00:36.279264229Z" level=info msg="Container 7b4d202c262996b462e948d30988874d619e6062b1727886c90027dadfbf5b72: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:00:36.281920 containerd[1911]: time="2025-09-04T01:00:36.281906562Z" level=info msg="CreateContainer within sandbox \"e09cbb07f8bd99c1bb5f5c370a0c0508e081f905d040b135da933f4c3d0a3516\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"7b4d202c262996b462e948d30988874d619e6062b1727886c90027dadfbf5b72\"" Sep 4 01:00:36.282142 containerd[1911]: time="2025-09-04T01:00:36.282084638Z" level=info msg="StartContainer for \"7b4d202c262996b462e948d30988874d619e6062b1727886c90027dadfbf5b72\"" Sep 4 01:00:36.282594 containerd[1911]: time="2025-09-04T01:00:36.282581956Z" level=info msg="connecting to shim 7b4d202c262996b462e948d30988874d619e6062b1727886c90027dadfbf5b72" address="unix:///run/containerd/s/8dbc4dd7872ef22d4b4357d74165cf7892b6e4e5d031dbaf7f723f887b68d94d" protocol=ttrpc version=3 Sep 4 01:00:36.301061 systemd[1]: Started cri-containerd-7b4d202c262996b462e948d30988874d619e6062b1727886c90027dadfbf5b72.scope - libcontainer container 7b4d202c262996b462e948d30988874d619e6062b1727886c90027dadfbf5b72. 
Sep 4 01:00:36.335879 containerd[1911]: time="2025-09-04T01:00:36.335854685Z" level=info msg="StartContainer for \"7b4d202c262996b462e948d30988874d619e6062b1727886c90027dadfbf5b72\" returns successfully" Sep 4 01:00:36.982831 kubelet[3273]: E0904 01:00:36.982686 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:37.108546 kubelet[3273]: I0904 01:00:37.108426 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7d78fdd59-h92dm" podStartSLOduration=1.392592837 podStartE2EDuration="14.108390792s" podCreationTimestamp="2025-09-04 01:00:23 +0000 UTC" firstStartedPulling="2025-09-04 01:00:23.557215227 +0000 UTC m=+18.651089127" lastFinishedPulling="2025-09-04 01:00:36.273013184 +0000 UTC m=+31.366887082" observedRunningTime="2025-09-04 01:00:37.108323379 +0000 UTC m=+32.202197343" watchObservedRunningTime="2025-09-04 01:00:37.108390792 +0000 UTC m=+32.202264765" Sep 4 01:00:37.185001 kubelet[3273]: E0904 01:00:37.184933 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.185001 kubelet[3273]: W0904 01:00:37.184996 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.185492 kubelet[3273]: E0904 01:00:37.185055 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.185678 kubelet[3273]: E0904 01:00:37.185495 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.185678 kubelet[3273]: W0904 01:00:37.185530 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.185678 kubelet[3273]: E0904 01:00:37.185573 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.186242 kubelet[3273]: E0904 01:00:37.186081 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.186242 kubelet[3273]: W0904 01:00:37.186119 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.186242 kubelet[3273]: E0904 01:00:37.186161 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:37.186855 kubelet[3273]: E0904 01:00:37.186811 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.186855 kubelet[3273]: W0904 01:00:37.186846 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.187225 kubelet[3273]: E0904 01:00:37.186887 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.187433 kubelet[3273]: E0904 01:00:37.187380 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.187636 kubelet[3273]: W0904 01:00:37.187433 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.187636 kubelet[3273]: E0904 01:00:37.187476 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.188025 kubelet[3273]: E0904 01:00:37.187984 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.188025 kubelet[3273]: W0904 01:00:37.188014 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.188366 kubelet[3273]: E0904 01:00:37.188055 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.188566 kubelet[3273]: E0904 01:00:37.188527 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.188566 kubelet[3273]: W0904 01:00:37.188562 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.188922 kubelet[3273]: E0904 01:00:37.188597 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.189154 kubelet[3273]: E0904 01:00:37.189117 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.189154 kubelet[3273]: W0904 01:00:37.189149 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.189489 kubelet[3273]: E0904 01:00:37.189184 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:37.189725 kubelet[3273]: E0904 01:00:37.189688 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.189725 kubelet[3273]: W0904 01:00:37.189718 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.190069 kubelet[3273]: E0904 01:00:37.189768 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.190288 kubelet[3273]: E0904 01:00:37.190252 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.190288 kubelet[3273]: W0904 01:00:37.190284 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.190626 kubelet[3273]: E0904 01:00:37.190321 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.190868 kubelet[3273]: E0904 01:00:37.190820 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.190868 kubelet[3273]: W0904 01:00:37.190865 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.191194 kubelet[3273]: E0904 01:00:37.190901 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.191432 kubelet[3273]: E0904 01:00:37.191395 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.191432 kubelet[3273]: W0904 01:00:37.191428 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.191800 kubelet[3273]: E0904 01:00:37.191466 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.192041 kubelet[3273]: E0904 01:00:37.192003 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.192041 kubelet[3273]: W0904 01:00:37.192037 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.192353 kubelet[3273]: E0904 01:00:37.192075 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:37.192594 kubelet[3273]: E0904 01:00:37.192557 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.192594 kubelet[3273]: W0904 01:00:37.192590 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.193034 kubelet[3273]: E0904 01:00:37.192627 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.193203 kubelet[3273]: E0904 01:00:37.193172 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.193386 kubelet[3273]: W0904 01:00:37.193206 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.193386 kubelet[3273]: E0904 01:00:37.193242 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.229954 kubelet[3273]: E0904 01:00:37.229884 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.229954 kubelet[3273]: W0904 01:00:37.229936 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.230449 kubelet[3273]: E0904 01:00:37.229992 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.230631 kubelet[3273]: E0904 01:00:37.230573 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.230631 kubelet[3273]: W0904 01:00:37.230612 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.231012 kubelet[3273]: E0904 01:00:37.230651 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.231279 kubelet[3273]: E0904 01:00:37.231231 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.231279 kubelet[3273]: W0904 01:00:37.231264 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.231604 kubelet[3273]: E0904 01:00:37.231300 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:37.232033 kubelet[3273]: E0904 01:00:37.231978 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.232033 kubelet[3273]: W0904 01:00:37.232025 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.232266 kubelet[3273]: E0904 01:00:37.232060 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.232555 kubelet[3273]: E0904 01:00:37.232520 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.232555 kubelet[3273]: W0904 01:00:37.232548 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.232926 kubelet[3273]: E0904 01:00:37.232585 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.233139 kubelet[3273]: E0904 01:00:37.233094 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.233139 kubelet[3273]: W0904 01:00:37.233131 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.233378 kubelet[3273]: E0904 01:00:37.233163 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.233671 kubelet[3273]: E0904 01:00:37.233600 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.233671 kubelet[3273]: W0904 01:00:37.233626 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.233671 kubelet[3273]: E0904 01:00:37.233651 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.234117 kubelet[3273]: E0904 01:00:37.234067 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.234117 kubelet[3273]: W0904 01:00:37.234090 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.234117 kubelet[3273]: E0904 01:00:37.234114 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:37.234504 kubelet[3273]: E0904 01:00:37.234461 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.234504 kubelet[3273]: W0904 01:00:37.234486 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.234798 kubelet[3273]: E0904 01:00:37.234509 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.234923 kubelet[3273]: E0904 01:00:37.234863 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.234923 kubelet[3273]: W0904 01:00:37.234884 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.234923 kubelet[3273]: E0904 01:00:37.234908 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.235369 kubelet[3273]: E0904 01:00:37.235340 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.235369 kubelet[3273]: W0904 01:00:37.235365 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.235605 kubelet[3273]: E0904 01:00:37.235389 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.235931 kubelet[3273]: E0904 01:00:37.235900 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.236083 kubelet[3273]: W0904 01:00:37.235932 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.236083 kubelet[3273]: E0904 01:00:37.235958 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.236429 kubelet[3273]: E0904 01:00:37.236366 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.236429 kubelet[3273]: W0904 01:00:37.236389 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.236429 kubelet[3273]: E0904 01:00:37.236412 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:37.236965 kubelet[3273]: E0904 01:00:37.236732 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.236965 kubelet[3273]: W0904 01:00:37.236772 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.236965 kubelet[3273]: E0904 01:00:37.236806 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.237430 kubelet[3273]: E0904 01:00:37.237306 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.237430 kubelet[3273]: W0904 01:00:37.237329 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.237430 kubelet[3273]: E0904 01:00:37.237352 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.238096 kubelet[3273]: E0904 01:00:37.237920 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.238096 kubelet[3273]: W0904 01:00:37.237967 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.238096 kubelet[3273]: E0904 01:00:37.238018 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.238688 kubelet[3273]: E0904 01:00:37.238641 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.238688 kubelet[3273]: W0904 01:00:37.238685 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.239083 kubelet[3273]: E0904 01:00:37.238727 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:37.239832 kubelet[3273]: E0904 01:00:37.239785 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:37.239832 kubelet[3273]: W0904 01:00:37.239825 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:37.240100 kubelet[3273]: E0904 01:00:37.239865 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:38.099632 kubelet[3273]: E0904 01:00:38.099557 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.099632 kubelet[3273]: W0904 01:00:38.099610 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.100994 kubelet[3273]: E0904 01:00:38.099666 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.100994 kubelet[3273]: E0904 01:00:38.100258 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.100994 kubelet[3273]: W0904 01:00:38.100297 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.100994 kubelet[3273]: E0904 01:00:38.100343 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.100994 kubelet[3273]: E0904 01:00:38.100880 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.100994 kubelet[3273]: W0904 01:00:38.100915 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.100994 kubelet[3273]: E0904 01:00:38.100952 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.102045 kubelet[3273]: E0904 01:00:38.101479 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.102045 kubelet[3273]: W0904 01:00:38.101514 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.102045 kubelet[3273]: E0904 01:00:38.101551 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.102486 kubelet[3273]: E0904 01:00:38.102098 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.102486 kubelet[3273]: W0904 01:00:38.102129 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.102486 kubelet[3273]: E0904 01:00:38.102169 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:38.102974 kubelet[3273]: E0904 01:00:38.102660 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.102974 kubelet[3273]: W0904 01:00:38.102696 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.102974 kubelet[3273]: E0904 01:00:38.102735 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.103391 kubelet[3273]: E0904 01:00:38.103248 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.103391 kubelet[3273]: W0904 01:00:38.103282 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.103391 kubelet[3273]: E0904 01:00:38.103327 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.103911 kubelet[3273]: E0904 01:00:38.103865 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.103911 kubelet[3273]: W0904 01:00:38.103901 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.104259 kubelet[3273]: E0904 01:00:38.103937 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.104501 kubelet[3273]: E0904 01:00:38.104457 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.104501 kubelet[3273]: W0904 01:00:38.104490 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.104858 kubelet[3273]: E0904 01:00:38.104529 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.104954 kubelet[3273]: E0904 01:00:38.104946 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.104954 kubelet[3273]: W0904 01:00:38.104953 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.105009 kubelet[3273]: E0904 01:00:38.104960 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:38.105096 kubelet[3273]: E0904 01:00:38.105089 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.105096 kubelet[3273]: W0904 01:00:38.105095 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.105153 kubelet[3273]: E0904 01:00:38.105102 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.105242 kubelet[3273]: E0904 01:00:38.105235 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.105242 kubelet[3273]: W0904 01:00:38.105240 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.105298 kubelet[3273]: E0904 01:00:38.105247 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.105360 kubelet[3273]: E0904 01:00:38.105354 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.105360 kubelet[3273]: W0904 01:00:38.105359 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.105417 kubelet[3273]: E0904 01:00:38.105365 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.105454 kubelet[3273]: E0904 01:00:38.105446 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.105454 kubelet[3273]: W0904 01:00:38.105451 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.105506 kubelet[3273]: E0904 01:00:38.105458 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.105541 kubelet[3273]: E0904 01:00:38.105533 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.105574 kubelet[3273]: W0904 01:00:38.105540 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.105574 kubelet[3273]: E0904 01:00:38.105547 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:38.137797 kubelet[3273]: E0904 01:00:38.137731 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.137797 kubelet[3273]: W0904 01:00:38.137746 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.137797 kubelet[3273]: E0904 01:00:38.137780 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.138020 kubelet[3273]: E0904 01:00:38.137981 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.138020 kubelet[3273]: W0904 01:00:38.137992 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.138020 kubelet[3273]: E0904 01:00:38.138002 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.138233 kubelet[3273]: E0904 01:00:38.138193 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.138233 kubelet[3273]: W0904 01:00:38.138204 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.138233 kubelet[3273]: E0904 01:00:38.138215 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.138427 kubelet[3273]: E0904 01:00:38.138389 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.138427 kubelet[3273]: W0904 01:00:38.138398 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.138427 kubelet[3273]: E0904 01:00:38.138407 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.138543 kubelet[3273]: E0904 01:00:38.138538 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.138571 kubelet[3273]: W0904 01:00:38.138545 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.138571 kubelet[3273]: E0904 01:00:38.138553 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:38.138671 kubelet[3273]: E0904 01:00:38.138661 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.138671 kubelet[3273]: W0904 01:00:38.138669 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.138738 kubelet[3273]: E0904 01:00:38.138676 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.138817 kubelet[3273]: E0904 01:00:38.138809 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.138817 kubelet[3273]: W0904 01:00:38.138817 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.138871 kubelet[3273]: E0904 01:00:38.138825 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.139082 kubelet[3273]: E0904 01:00:38.139069 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.139121 kubelet[3273]: W0904 01:00:38.139084 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.139121 kubelet[3273]: E0904 01:00:38.139095 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.139242 kubelet[3273]: E0904 01:00:38.139233 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.139242 kubelet[3273]: W0904 01:00:38.139241 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.139295 kubelet[3273]: E0904 01:00:38.139249 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.139396 kubelet[3273]: E0904 01:00:38.139389 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.139429 kubelet[3273]: W0904 01:00:38.139396 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.139429 kubelet[3273]: E0904 01:00:38.139404 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:38.139523 kubelet[3273]: E0904 01:00:38.139515 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.139523 kubelet[3273]: W0904 01:00:38.139522 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.139581 kubelet[3273]: E0904 01:00:38.139529 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.139650 kubelet[3273]: E0904 01:00:38.139643 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.139685 kubelet[3273]: W0904 01:00:38.139650 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.139685 kubelet[3273]: E0904 01:00:38.139658 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.139795 kubelet[3273]: E0904 01:00:38.139787 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.139795 kubelet[3273]: W0904 01:00:38.139794 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.139856 kubelet[3273]: E0904 01:00:38.139802 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.140087 kubelet[3273]: E0904 01:00:38.140074 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.140122 kubelet[3273]: W0904 01:00:38.140088 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.140122 kubelet[3273]: E0904 01:00:38.140099 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.140243 kubelet[3273]: E0904 01:00:38.140213 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.140243 kubelet[3273]: W0904 01:00:38.140221 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.140243 kubelet[3273]: E0904 01:00:38.140229 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:38.140404 kubelet[3273]: E0904 01:00:38.140396 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.140438 kubelet[3273]: W0904 01:00:38.140404 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.140438 kubelet[3273]: E0904 01:00:38.140412 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.140593 kubelet[3273]: E0904 01:00:38.140585 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.140625 kubelet[3273]: W0904 01:00:38.140593 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.140625 kubelet[3273]: E0904 01:00:38.140600 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.140728 kubelet[3273]: E0904 01:00:38.140720 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:38.140765 kubelet[3273]: W0904 01:00:38.140728 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:38.140765 kubelet[3273]: E0904 01:00:38.140735 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:38.983105 kubelet[3273]: E0904 01:00:38.982968 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:39.111209 kubelet[3273]: E0904 01:00:39.111128 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.111209 kubelet[3273]: W0904 01:00:39.111189 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.112447 kubelet[3273]: E0904 01:00:39.111245 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:39.112447 kubelet[3273]: E0904 01:00:39.111849 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.112447 kubelet[3273]: W0904 01:00:39.111887 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.112447 kubelet[3273]: E0904 01:00:39.111937 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.112447 kubelet[3273]: E0904 01:00:39.112448 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.113280 kubelet[3273]: W0904 01:00:39.112479 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.113280 kubelet[3273]: E0904 01:00:39.112517 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.113280 kubelet[3273]: E0904 01:00:39.113189 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.113280 kubelet[3273]: W0904 01:00:39.113223 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.113280 kubelet[3273]: E0904 01:00:39.113262 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.114153 kubelet[3273]: E0904 01:00:39.113801 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.114153 kubelet[3273]: W0904 01:00:39.113834 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.114153 kubelet[3273]: E0904 01:00:39.113868 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.114601 kubelet[3273]: E0904 01:00:39.114382 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.114601 kubelet[3273]: W0904 01:00:39.114416 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.114601 kubelet[3273]: E0904 01:00:39.114453 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:39.115027 kubelet[3273]: E0904 01:00:39.114977 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.115027 kubelet[3273]: W0904 01:00:39.115012 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.115335 kubelet[3273]: E0904 01:00:39.115051 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.115605 kubelet[3273]: E0904 01:00:39.115561 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.115605 kubelet[3273]: W0904 01:00:39.115592 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.115962 kubelet[3273]: E0904 01:00:39.115631 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.116200 kubelet[3273]: E0904 01:00:39.116156 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.116200 kubelet[3273]: W0904 01:00:39.116189 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.116527 kubelet[3273]: E0904 01:00:39.116228 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.116780 kubelet[3273]: E0904 01:00:39.116711 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.116780 kubelet[3273]: W0904 01:00:39.116747 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.117106 kubelet[3273]: E0904 01:00:39.116807 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.117338 kubelet[3273]: E0904 01:00:39.117295 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.117338 kubelet[3273]: W0904 01:00:39.117326 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.117656 kubelet[3273]: E0904 01:00:39.117361 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:39.117889 kubelet[3273]: E0904 01:00:39.117852 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.117889 kubelet[3273]: W0904 01:00:39.117886 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.118217 kubelet[3273]: E0904 01:00:39.117923 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.118455 kubelet[3273]: E0904 01:00:39.118418 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.118455 kubelet[3273]: W0904 01:00:39.118450 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.118805 kubelet[3273]: E0904 01:00:39.118493 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.119057 kubelet[3273]: E0904 01:00:39.119020 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.119057 kubelet[3273]: W0904 01:00:39.119052 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.119367 kubelet[3273]: E0904 01:00:39.119091 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.119625 kubelet[3273]: E0904 01:00:39.119586 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.119625 kubelet[3273]: W0904 01:00:39.119617 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.119968 kubelet[3273]: E0904 01:00:39.119653 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.148771 kubelet[3273]: E0904 01:00:39.148667 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.148771 kubelet[3273]: W0904 01:00:39.148709 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.148771 kubelet[3273]: E0904 01:00:39.148746 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:39.149370 kubelet[3273]: E0904 01:00:39.149292 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.149370 kubelet[3273]: W0904 01:00:39.149319 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.149370 kubelet[3273]: E0904 01:00:39.149346 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.149880 kubelet[3273]: E0904 01:00:39.149803 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.149880 kubelet[3273]: W0904 01:00:39.149826 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.149880 kubelet[3273]: E0904 01:00:39.149850 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.150469 kubelet[3273]: E0904 01:00:39.150403 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.150469 kubelet[3273]: W0904 01:00:39.150445 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.150469 kubelet[3273]: E0904 01:00:39.150479 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.151078 kubelet[3273]: E0904 01:00:39.151040 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.151078 kubelet[3273]: W0904 01:00:39.151071 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.151317 kubelet[3273]: E0904 01:00:39.151103 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.151627 kubelet[3273]: E0904 01:00:39.151586 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.151627 kubelet[3273]: W0904 01:00:39.151614 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.151903 kubelet[3273]: E0904 01:00:39.151642 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:39.152143 kubelet[3273]: E0904 01:00:39.152109 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.152143 kubelet[3273]: W0904 01:00:39.152133 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.152382 kubelet[3273]: E0904 01:00:39.152157 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.152580 kubelet[3273]: E0904 01:00:39.152554 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.152580 kubelet[3273]: W0904 01:00:39.152577 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.152873 kubelet[3273]: E0904 01:00:39.152602 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.153107 kubelet[3273]: E0904 01:00:39.153073 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.153212 kubelet[3273]: W0904 01:00:39.153110 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.153212 kubelet[3273]: E0904 01:00:39.153149 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.153620 kubelet[3273]: E0904 01:00:39.153592 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.153620 kubelet[3273]: W0904 01:00:39.153619 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.153907 kubelet[3273]: E0904 01:00:39.153646 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.154146 kubelet[3273]: E0904 01:00:39.154114 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.154146 kubelet[3273]: W0904 01:00:39.154142 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.154393 kubelet[3273]: E0904 01:00:39.154168 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:39.154532 kubelet[3273]: E0904 01:00:39.154504 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.154532 kubelet[3273]: W0904 01:00:39.154529 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.154861 kubelet[3273]: E0904 01:00:39.154552 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.155060 kubelet[3273]: E0904 01:00:39.155029 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.155060 kubelet[3273]: W0904 01:00:39.155056 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.155336 kubelet[3273]: E0904 01:00:39.155083 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.155778 kubelet[3273]: E0904 01:00:39.155711 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.156075 kubelet[3273]: W0904 01:00:39.155773 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.156075 kubelet[3273]: E0904 01:00:39.155824 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.156502 kubelet[3273]: E0904 01:00:39.156462 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.156502 kubelet[3273]: W0904 01:00:39.156496 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.156846 kubelet[3273]: E0904 01:00:39.156539 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.157096 kubelet[3273]: E0904 01:00:39.157056 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.157096 kubelet[3273]: W0904 01:00:39.157090 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.157423 kubelet[3273]: E0904 01:00:39.157129 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 01:00:39.157650 kubelet[3273]: E0904 01:00:39.157613 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.157650 kubelet[3273]: W0904 01:00:39.157647 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.157988 kubelet[3273]: E0904 01:00:39.157685 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:39.158228 kubelet[3273]: E0904 01:00:39.158189 3273 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 01:00:39.158228 kubelet[3273]: W0904 01:00:39.158221 3273 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 01:00:39.158467 kubelet[3273]: E0904 01:00:39.158259 3273 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 01:00:40.983305 kubelet[3273]: E0904 01:00:40.983172 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:42.982651 kubelet[3273]: E0904 01:00:42.982552 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:44.982848 kubelet[3273]: E0904 01:00:44.982733 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:45.420090 containerd[1911]: time="2025-09-04T01:00:45.420003894Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:00:45.420324 containerd[1911]: time="2025-09-04T01:00:45.420171825Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 4 01:00:45.420620 containerd[1911]: time="2025-09-04T01:00:45.420584200Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:00:45.421449 containerd[1911]: time="2025-09-04T01:00:45.421435461Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:00:45.422085 containerd[1911]: 
time="2025-09-04T01:00:45.422072457Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 9.148999385s" Sep 4 01:00:45.422121 containerd[1911]: time="2025-09-04T01:00:45.422087792Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 4 01:00:45.423744 containerd[1911]: time="2025-09-04T01:00:45.423731382Z" level=info msg="CreateContainer within sandbox \"79c3a666fd74a4976859e26bdbcf799786db5d95440f7549e8e567f849176563\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 4 01:00:45.426805 containerd[1911]: time="2025-09-04T01:00:45.426785604Z" level=info msg="Container 421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:00:45.429928 containerd[1911]: time="2025-09-04T01:00:45.429915730Z" level=info msg="CreateContainer within sandbox \"79c3a666fd74a4976859e26bdbcf799786db5d95440f7549e8e567f849176563\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314\"" Sep 4 01:00:45.430182 containerd[1911]: time="2025-09-04T01:00:45.430134771Z" level=info msg="StartContainer for \"421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314\"" Sep 4 01:00:45.430874 containerd[1911]: time="2025-09-04T01:00:45.430827776Z" level=info msg="connecting to shim 421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314" address="unix:///run/containerd/s/90cda3c5fef613ada18aa40b19f0d20894165878a4172eef483fe0625d47fb2d" protocol=ttrpc version=3 Sep 4 01:00:45.451249 systemd[1]: Started cri-containerd-421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314.scope - libcontainer container 421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314. Sep 4 01:00:45.510473 containerd[1911]: time="2025-09-04T01:00:45.510436116Z" level=info msg="StartContainer for \"421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314\" returns successfully" Sep 4 01:00:45.517390 systemd[1]: cri-containerd-421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314.scope: Deactivated successfully. Sep 4 01:00:45.518990 containerd[1911]: time="2025-09-04T01:00:45.518955914Z" level=info msg="received exit event container_id:\"421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314\" id:\"421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314\" pid:4231 exited_at:{seconds:1756947645 nanos:518718900}" Sep 4 01:00:45.519068 containerd[1911]: time="2025-09-04T01:00:45.518998979Z" level=info msg="TaskExit event in podsandbox handler container_id:\"421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314\" id:\"421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314\" pid:4231 exited_at:{seconds:1756947645 nanos:518718900}" Sep 4 01:00:45.537171 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314-rootfs.mount: Deactivated successfully. 
Sep 4 01:00:46.118064 containerd[1911]: time="2025-09-04T01:00:46.117981676Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 4 01:00:46.982370 kubelet[3273]: E0904 01:00:46.982238 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:48.982676 kubelet[3273]: E0904 01:00:48.982539 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:50.987542 kubelet[3273]: E0904 01:00:50.987427 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:52.983237 kubelet[3273]: E0904 01:00:52.983171 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:53.713944 containerd[1911]: time="2025-09-04T01:00:53.713887105Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:00:53.714153 containerd[1911]: time="2025-09-04T01:00:53.714082569Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 4 01:00:53.714418 containerd[1911]: time="2025-09-04T01:00:53.714378545Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:00:53.715245 containerd[1911]: time="2025-09-04T01:00:53.715206811Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:00:53.715586 containerd[1911]: time="2025-09-04T01:00:53.715543698Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 7.597490443s" Sep 4 01:00:53.715586 containerd[1911]: time="2025-09-04T01:00:53.715559770Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 4 01:00:53.716967 containerd[1911]: time="2025-09-04T01:00:53.716927984Z" level=info msg="CreateContainer within sandbox \"79c3a666fd74a4976859e26bdbcf799786db5d95440f7549e8e567f849176563\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 4 
01:00:53.720262 containerd[1911]: time="2025-09-04T01:00:53.720223014Z" level=info msg="Container 66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:00:53.723903 containerd[1911]: time="2025-09-04T01:00:53.723861958Z" level=info msg="CreateContainer within sandbox \"79c3a666fd74a4976859e26bdbcf799786db5d95440f7549e8e567f849176563\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989\"" Sep 4 01:00:53.724060 containerd[1911]: time="2025-09-04T01:00:53.724048172Z" level=info msg="StartContainer for \"66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989\"" Sep 4 01:00:53.724838 containerd[1911]: time="2025-09-04T01:00:53.724791058Z" level=info msg="connecting to shim 66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989" address="unix:///run/containerd/s/90cda3c5fef613ada18aa40b19f0d20894165878a4172eef483fe0625d47fb2d" protocol=ttrpc version=3 Sep 4 01:00:53.749029 systemd[1]: Started cri-containerd-66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989.scope - libcontainer container 66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989. Sep 4 01:00:53.770325 containerd[1911]: time="2025-09-04T01:00:53.770303459Z" level=info msg="StartContainer for \"66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989\" returns successfully" Sep 4 01:00:54.334002 systemd[1]: cri-containerd-66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989.scope: Deactivated successfully. Sep 4 01:00:54.334161 systemd[1]: cri-containerd-66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989.scope: Consumed 353ms CPU time, 196.1M memory peak, 171.3M written to disk. Sep 4 01:00:54.334638 containerd[1911]: time="2025-09-04T01:00:54.334619156Z" level=info msg="received exit event container_id:\"66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989\" id:\"66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989\" pid:4292 exited_at:{seconds:1756947654 nanos:334522046}" Sep 4 01:00:54.334681 containerd[1911]: time="2025-09-04T01:00:54.334671054Z" level=info msg="TaskExit event in podsandbox handler container_id:\"66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989\" id:\"66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989\" pid:4292 exited_at:{seconds:1756947654 nanos:334522046}" Sep 4 01:00:54.344435 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989-rootfs.mount: Deactivated successfully. Sep 4 01:00:54.379552 kubelet[3273]: I0904 01:00:54.379495 3273 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 4 01:00:54.438541 systemd[1]: Created slice kubepods-besteffort-poda5cd7763_0c95_480d_bd8f_2abfda1f515e.slice - libcontainer container kubepods-besteffort-poda5cd7763_0c95_480d_bd8f_2abfda1f515e.slice. Sep 4 01:00:54.473090 systemd[1]: Created slice kubepods-burstable-pod8381ffae_bf3c_40d3_9391_b9f0872fdb03.slice - libcontainer container kubepods-burstable-pod8381ffae_bf3c_40d3_9391_b9f0872fdb03.slice. Sep 4 01:00:54.489904 systemd[1]: Created slice kubepods-besteffort-pod729b509f_0f97_4ec9_afbb_7f6a547a08ac.slice - libcontainer container kubepods-besteffort-pod729b509f_0f97_4ec9_afbb_7f6a547a08ac.slice. 
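The "cni plugin not initialized" condition that keeps csi-node-driver-89qxc pending clears only once a CNI network configuration is present in the directory containerd's CRI plugin loads from (conventionally /etc/cni/net.d; the actual path is whatever the containerd config sets), and the install-cni container that just ran is what writes that config and the Calico binaries. A small, hedged sketch of the kind of readiness check involved:

    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    func main() {
    	// Assumed conventional location of the CNI network configs; containerd
    	// reports NetworkReady=false until it can load one from here.
    	confDir := "/etc/cni/net.d"

    	entries, err := os.ReadDir(confDir)
    	if err != nil || len(entries) == 0 {
    		fmt.Println("cni plugin not initialized: no network config in", confDir)
    		return
    	}
    	for _, e := range entries {
    		fmt.Println("found CNI config:", filepath.Join(confDir, e.Name()))
    	}
    }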
Sep 4 01:00:54.517764 systemd[1]: Created slice kubepods-burstable-pod1719e2f2_7b21_4a85_b131_359f01b6a3ae.slice - libcontainer container kubepods-burstable-pod1719e2f2_7b21_4a85_b131_359f01b6a3ae.slice. Sep 4 01:00:54.555728 systemd[1]: Created slice kubepods-besteffort-pod8b334fed_4398_41d5_a0d5_cb7e7c6b3608.slice - libcontainer container kubepods-besteffort-pod8b334fed_4398_41d5_a0d5_cb7e7c6b3608.slice. Sep 4 01:00:54.577240 kubelet[3273]: I0904 01:00:54.577116 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j85vv\" (UniqueName: \"kubernetes.io/projected/8381ffae-bf3c-40d3-9391-b9f0872fdb03-kube-api-access-j85vv\") pod \"coredns-674b8bbfcf-xc5w2\" (UID: \"8381ffae-bf3c-40d3-9391-b9f0872fdb03\") " pod="kube-system/coredns-674b8bbfcf-xc5w2" Sep 4 01:00:54.577240 kubelet[3273]: I0904 01:00:54.577210 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjztw\" (UniqueName: \"kubernetes.io/projected/a5cd7763-0c95-480d-bd8f-2abfda1f515e-kube-api-access-qjztw\") pod \"calico-kube-controllers-ff5f6d785-tbvw4\" (UID: \"a5cd7763-0c95-480d-bd8f-2abfda1f515e\") " pod="calico-system/calico-kube-controllers-ff5f6d785-tbvw4" Sep 4 01:00:54.600654 kubelet[3273]: I0904 01:00:54.577272 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8b5p\" (UniqueName: \"kubernetes.io/projected/729b509f-0f97-4ec9-afbb-7f6a547a08ac-kube-api-access-m8b5p\") pod \"calico-apiserver-6b6dc69f69-skh28\" (UID: \"729b509f-0f97-4ec9-afbb-7f6a547a08ac\") " pod="calico-apiserver/calico-apiserver-6b6dc69f69-skh28" Sep 4 01:00:54.600654 kubelet[3273]: I0904 01:00:54.577330 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v54td\" (UniqueName: \"kubernetes.io/projected/1719e2f2-7b21-4a85-b131-359f01b6a3ae-kube-api-access-v54td\") pod \"coredns-674b8bbfcf-zkzwt\" (UID: \"1719e2f2-7b21-4a85-b131-359f01b6a3ae\") " pod="kube-system/coredns-674b8bbfcf-zkzwt" Sep 4 01:00:54.600654 kubelet[3273]: I0904 01:00:54.577444 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5cd7763-0c95-480d-bd8f-2abfda1f515e-tigera-ca-bundle\") pod \"calico-kube-controllers-ff5f6d785-tbvw4\" (UID: \"a5cd7763-0c95-480d-bd8f-2abfda1f515e\") " pod="calico-system/calico-kube-controllers-ff5f6d785-tbvw4" Sep 4 01:00:54.600654 kubelet[3273]: I0904 01:00:54.577493 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1719e2f2-7b21-4a85-b131-359f01b6a3ae-config-volume\") pod \"coredns-674b8bbfcf-zkzwt\" (UID: \"1719e2f2-7b21-4a85-b131-359f01b6a3ae\") " pod="kube-system/coredns-674b8bbfcf-zkzwt" Sep 4 01:00:54.600654 kubelet[3273]: I0904 01:00:54.577552 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8381ffae-bf3c-40d3-9391-b9f0872fdb03-config-volume\") pod \"coredns-674b8bbfcf-xc5w2\" (UID: \"8381ffae-bf3c-40d3-9391-b9f0872fdb03\") " pod="kube-system/coredns-674b8bbfcf-xc5w2" Sep 4 01:00:54.600885 kubelet[3273]: I0904 01:00:54.577627 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/729b509f-0f97-4ec9-afbb-7f6a547a08ac-calico-apiserver-certs\") pod \"calico-apiserver-6b6dc69f69-skh28\" (UID: \"729b509f-0f97-4ec9-afbb-7f6a547a08ac\") " pod="calico-apiserver/calico-apiserver-6b6dc69f69-skh28" Sep 4 01:00:54.606981 systemd[1]: Created slice kubepods-besteffort-pod7fed3011_6531_4d2d_a58f_628d8600da48.slice - libcontainer container kubepods-besteffort-pod7fed3011_6531_4d2d_a58f_628d8600da48.slice. Sep 4 01:00:54.678647 kubelet[3273]: I0904 01:00:54.678561 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fed3011-6531-4d2d-a58f-628d8600da48-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-t2rrl\" (UID: \"7fed3011-6531-4d2d-a58f-628d8600da48\") " pod="calico-system/goldmane-54d579b49d-t2rrl" Sep 4 01:00:54.679047 kubelet[3273]: I0904 01:00:54.678666 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fed3011-6531-4d2d-a58f-628d8600da48-config\") pod \"goldmane-54d579b49d-t2rrl\" (UID: \"7fed3011-6531-4d2d-a58f-628d8600da48\") " pod="calico-system/goldmane-54d579b49d-t2rrl" Sep 4 01:00:54.679047 kubelet[3273]: I0904 01:00:54.678845 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8st8f\" (UniqueName: \"kubernetes.io/projected/7fed3011-6531-4d2d-a58f-628d8600da48-kube-api-access-8st8f\") pod \"goldmane-54d579b49d-t2rrl\" (UID: \"7fed3011-6531-4d2d-a58f-628d8600da48\") " pod="calico-system/goldmane-54d579b49d-t2rrl" Sep 4 01:00:54.679047 kubelet[3273]: I0904 01:00:54.678976 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8b334fed-4398-41d5-a0d5-cb7e7c6b3608-calico-apiserver-certs\") pod \"calico-apiserver-6b6dc69f69-7r7n6\" (UID: \"8b334fed-4398-41d5-a0d5-cb7e7c6b3608\") " pod="calico-apiserver/calico-apiserver-6b6dc69f69-7r7n6" Sep 4 01:00:54.679137 kubelet[3273]: I0904 01:00:54.679059 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h89vz\" (UniqueName: \"kubernetes.io/projected/8b334fed-4398-41d5-a0d5-cb7e7c6b3608-kube-api-access-h89vz\") pod \"calico-apiserver-6b6dc69f69-7r7n6\" (UID: \"8b334fed-4398-41d5-a0d5-cb7e7c6b3608\") " pod="calico-apiserver/calico-apiserver-6b6dc69f69-7r7n6" Sep 4 01:00:54.679137 kubelet[3273]: I0904 01:00:54.679104 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/7fed3011-6531-4d2d-a58f-628d8600da48-goldmane-key-pair\") pod \"goldmane-54d579b49d-t2rrl\" (UID: \"7fed3011-6531-4d2d-a58f-628d8600da48\") " pod="calico-system/goldmane-54d579b49d-t2rrl" Sep 4 01:00:54.719401 systemd[1]: Created slice kubepods-besteffort-pod76666ee7_2b21_4bd8_a85d_bb73cd4c2ad7.slice - libcontainer container kubepods-besteffort-pod76666ee7_2b21_4bd8_a85d_bb73cd4c2ad7.slice. 
Sep 4 01:00:54.771857 containerd[1911]: time="2025-09-04T01:00:54.771741964Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-ff5f6d785-tbvw4,Uid:a5cd7763-0c95-480d-bd8f-2abfda1f515e,Namespace:calico-system,Attempt:0,}" Sep 4 01:00:54.779322 kubelet[3273]: I0904 01:00:54.779306 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7-whisker-ca-bundle\") pod \"whisker-7dbf68dc4b-km665\" (UID: \"76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7\") " pod="calico-system/whisker-7dbf68dc4b-km665" Sep 4 01:00:54.779369 kubelet[3273]: I0904 01:00:54.779341 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4hlx\" (UniqueName: \"kubernetes.io/projected/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7-kube-api-access-l4hlx\") pod \"whisker-7dbf68dc4b-km665\" (UID: \"76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7\") " pod="calico-system/whisker-7dbf68dc4b-km665" Sep 4 01:00:54.779369 kubelet[3273]: I0904 01:00:54.779353 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7-whisker-backend-key-pair\") pod \"whisker-7dbf68dc4b-km665\" (UID: \"76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7\") " pod="calico-system/whisker-7dbf68dc4b-km665" Sep 4 01:00:54.786490 containerd[1911]: time="2025-09-04T01:00:54.786453263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xc5w2,Uid:8381ffae-bf3c-40d3-9391-b9f0872fdb03,Namespace:kube-system,Attempt:0,}" Sep 4 01:00:54.796070 containerd[1911]: time="2025-09-04T01:00:54.796016670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b6dc69f69-skh28,Uid:729b509f-0f97-4ec9-afbb-7f6a547a08ac,Namespace:calico-apiserver,Attempt:0,}" Sep 4 01:00:54.818407 containerd[1911]: time="2025-09-04T01:00:54.818344582Z" level=error msg="Failed to destroy network for sandbox \"061ccd3423c7e077a699dfe89950611fc57583653a819cc493842bb86edd96dc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.818944 containerd[1911]: time="2025-09-04T01:00:54.818922395Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-ff5f6d785-tbvw4,Uid:a5cd7763-0c95-480d-bd8f-2abfda1f515e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"061ccd3423c7e077a699dfe89950611fc57583653a819cc493842bb86edd96dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.819079 containerd[1911]: time="2025-09-04T01:00:54.819054098Z" level=error msg="Failed to destroy network for sandbox \"ff627490ee31bee5799be23aaca956d3df4bd46b03d4a66604a9898a6f852536\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.819118 kubelet[3273]: E0904 01:00:54.819061 3273 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"061ccd3423c7e077a699dfe89950611fc57583653a819cc493842bb86edd96dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.819159 kubelet[3273]: E0904 01:00:54.819117 3273 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"061ccd3423c7e077a699dfe89950611fc57583653a819cc493842bb86edd96dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-ff5f6d785-tbvw4" Sep 4 01:00:54.819159 kubelet[3273]: E0904 01:00:54.819137 3273 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"061ccd3423c7e077a699dfe89950611fc57583653a819cc493842bb86edd96dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-ff5f6d785-tbvw4" Sep 4 01:00:54.819220 kubelet[3273]: E0904 01:00:54.819180 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-ff5f6d785-tbvw4_calico-system(a5cd7763-0c95-480d-bd8f-2abfda1f515e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-ff5f6d785-tbvw4_calico-system(a5cd7763-0c95-480d-bd8f-2abfda1f515e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"061ccd3423c7e077a699dfe89950611fc57583653a819cc493842bb86edd96dc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-ff5f6d785-tbvw4" podUID="a5cd7763-0c95-480d-bd8f-2abfda1f515e" Sep 4 01:00:54.819410 containerd[1911]: time="2025-09-04T01:00:54.819393831Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zkzwt,Uid:1719e2f2-7b21-4a85-b131-359f01b6a3ae,Namespace:kube-system,Attempt:0,}" Sep 4 01:00:54.819440 containerd[1911]: time="2025-09-04T01:00:54.819410649Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xc5w2,Uid:8381ffae-bf3c-40d3-9391-b9f0872fdb03,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff627490ee31bee5799be23aaca956d3df4bd46b03d4a66604a9898a6f852536\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.819503 kubelet[3273]: E0904 01:00:54.819484 3273 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff627490ee31bee5799be23aaca956d3df4bd46b03d4a66604a9898a6f852536\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.819526 kubelet[3273]: E0904 01:00:54.819515 3273 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"ff627490ee31bee5799be23aaca956d3df4bd46b03d4a66604a9898a6f852536\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-xc5w2" Sep 4 01:00:54.819547 kubelet[3273]: E0904 01:00:54.819532 3273 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff627490ee31bee5799be23aaca956d3df4bd46b03d4a66604a9898a6f852536\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-xc5w2" Sep 4 01:00:54.819570 kubelet[3273]: E0904 01:00:54.819559 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-xc5w2_kube-system(8381ffae-bf3c-40d3-9391-b9f0872fdb03)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-xc5w2_kube-system(8381ffae-bf3c-40d3-9391-b9f0872fdb03)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ff627490ee31bee5799be23aaca956d3df4bd46b03d4a66604a9898a6f852536\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-xc5w2" podUID="8381ffae-bf3c-40d3-9391-b9f0872fdb03" Sep 4 01:00:54.821014 containerd[1911]: time="2025-09-04T01:00:54.820973733Z" level=error msg="Failed to destroy network for sandbox \"8a37c219bf7eb7122daa557553fd887f99b7dfd4f2e15a81f3ebcbd5ca27cbb3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.821448 containerd[1911]: time="2025-09-04T01:00:54.821407866Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b6dc69f69-skh28,Uid:729b509f-0f97-4ec9-afbb-7f6a547a08ac,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a37c219bf7eb7122daa557553fd887f99b7dfd4f2e15a81f3ebcbd5ca27cbb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.821512 kubelet[3273]: E0904 01:00:54.821500 3273 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a37c219bf7eb7122daa557553fd887f99b7dfd4f2e15a81f3ebcbd5ca27cbb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.821542 kubelet[3273]: E0904 01:00:54.821523 3273 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a37c219bf7eb7122daa557553fd887f99b7dfd4f2e15a81f3ebcbd5ca27cbb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b6dc69f69-skh28" Sep 4 01:00:54.821542 kubelet[3273]: E0904 
01:00:54.821535 3273 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a37c219bf7eb7122daa557553fd887f99b7dfd4f2e15a81f3ebcbd5ca27cbb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b6dc69f69-skh28" Sep 4 01:00:54.821591 kubelet[3273]: E0904 01:00:54.821568 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b6dc69f69-skh28_calico-apiserver(729b509f-0f97-4ec9-afbb-7f6a547a08ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b6dc69f69-skh28_calico-apiserver(729b509f-0f97-4ec9-afbb-7f6a547a08ac)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8a37c219bf7eb7122daa557553fd887f99b7dfd4f2e15a81f3ebcbd5ca27cbb3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b6dc69f69-skh28" podUID="729b509f-0f97-4ec9-afbb-7f6a547a08ac" Sep 4 01:00:54.843965 containerd[1911]: time="2025-09-04T01:00:54.843939814Z" level=error msg="Failed to destroy network for sandbox \"8ca766ae1030a799b454da4c0c0f795cbe15f908736c4de258aa5657b53af9c8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.844619 containerd[1911]: time="2025-09-04T01:00:54.844602624Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zkzwt,Uid:1719e2f2-7b21-4a85-b131-359f01b6a3ae,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ca766ae1030a799b454da4c0c0f795cbe15f908736c4de258aa5657b53af9c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.844817 kubelet[3273]: E0904 01:00:54.844763 3273 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ca766ae1030a799b454da4c0c0f795cbe15f908736c4de258aa5657b53af9c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.844817 kubelet[3273]: E0904 01:00:54.844803 3273 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ca766ae1030a799b454da4c0c0f795cbe15f908736c4de258aa5657b53af9c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-zkzwt" Sep 4 01:00:54.844817 kubelet[3273]: E0904 01:00:54.844816 3273 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ca766ae1030a799b454da4c0c0f795cbe15f908736c4de258aa5657b53af9c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-zkzwt" Sep 4 01:00:54.844896 kubelet[3273]: E0904 01:00:54.844850 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-zkzwt_kube-system(1719e2f2-7b21-4a85-b131-359f01b6a3ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-zkzwt_kube-system(1719e2f2-7b21-4a85-b131-359f01b6a3ae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8ca766ae1030a799b454da4c0c0f795cbe15f908736c4de258aa5657b53af9c8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-zkzwt" podUID="1719e2f2-7b21-4a85-b131-359f01b6a3ae" Sep 4 01:00:54.860447 containerd[1911]: time="2025-09-04T01:00:54.860228353Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b6dc69f69-7r7n6,Uid:8b334fed-4398-41d5-a0d5-cb7e7c6b3608,Namespace:calico-apiserver,Attempt:0,}" Sep 4 01:00:54.884353 containerd[1911]: time="2025-09-04T01:00:54.884325711Z" level=error msg="Failed to destroy network for sandbox \"7e70e11e5ba5406349ca510bf0061cd291acda868617de8d0a669d8e234a5aac\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.884722 containerd[1911]: time="2025-09-04T01:00:54.884708616Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b6dc69f69-7r7n6,Uid:8b334fed-4398-41d5-a0d5-cb7e7c6b3608,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e70e11e5ba5406349ca510bf0061cd291acda868617de8d0a669d8e234a5aac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.884836 kubelet[3273]: E0904 01:00:54.884816 3273 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e70e11e5ba5406349ca510bf0061cd291acda868617de8d0a669d8e234a5aac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.884865 kubelet[3273]: E0904 01:00:54.884853 3273 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e70e11e5ba5406349ca510bf0061cd291acda868617de8d0a669d8e234a5aac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b6dc69f69-7r7n6" Sep 4 01:00:54.884888 kubelet[3273]: E0904 01:00:54.884872 3273 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e70e11e5ba5406349ca510bf0061cd291acda868617de8d0a669d8e234a5aac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b6dc69f69-7r7n6" Sep 4 01:00:54.884930 kubelet[3273]: E0904 01:00:54.884912 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b6dc69f69-7r7n6_calico-apiserver(8b334fed-4398-41d5-a0d5-cb7e7c6b3608)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b6dc69f69-7r7n6_calico-apiserver(8b334fed-4398-41d5-a0d5-cb7e7c6b3608)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7e70e11e5ba5406349ca510bf0061cd291acda868617de8d0a669d8e234a5aac\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b6dc69f69-7r7n6" podUID="8b334fed-4398-41d5-a0d5-cb7e7c6b3608" Sep 4 01:00:54.911319 containerd[1911]: time="2025-09-04T01:00:54.911208904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-t2rrl,Uid:7fed3011-6531-4d2d-a58f-628d8600da48,Namespace:calico-system,Attempt:0,}" Sep 4 01:00:54.936594 containerd[1911]: time="2025-09-04T01:00:54.936537288Z" level=error msg="Failed to destroy network for sandbox \"c03997249e4f2f059f181b1e0d8940bc3364d60f3b1f6e99b8f670832fee3720\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.937093 containerd[1911]: time="2025-09-04T01:00:54.937028755Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-t2rrl,Uid:7fed3011-6531-4d2d-a58f-628d8600da48,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c03997249e4f2f059f181b1e0d8940bc3364d60f3b1f6e99b8f670832fee3720\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.937227 kubelet[3273]: E0904 01:00:54.937207 3273 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c03997249e4f2f059f181b1e0d8940bc3364d60f3b1f6e99b8f670832fee3720\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:54.937261 kubelet[3273]: E0904 01:00:54.937241 3273 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c03997249e4f2f059f181b1e0d8940bc3364d60f3b1f6e99b8f670832fee3720\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-t2rrl" Sep 4 01:00:54.937261 kubelet[3273]: E0904 01:00:54.937254 3273 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c03997249e4f2f059f181b1e0d8940bc3364d60f3b1f6e99b8f670832fee3720\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-t2rrl" Sep 
4 01:00:54.937302 kubelet[3273]: E0904 01:00:54.937286 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-t2rrl_calico-system(7fed3011-6531-4d2d-a58f-628d8600da48)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-t2rrl_calico-system(7fed3011-6531-4d2d-a58f-628d8600da48)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c03997249e4f2f059f181b1e0d8940bc3364d60f3b1f6e99b8f670832fee3720\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-t2rrl" podUID="7fed3011-6531-4d2d-a58f-628d8600da48" Sep 4 01:00:55.003699 systemd[1]: Created slice kubepods-besteffort-pod427476d1_8991_4f95_a430_df7e5fb9ed65.slice - libcontainer container kubepods-besteffort-pod427476d1_8991_4f95_a430_df7e5fb9ed65.slice. Sep 4 01:00:55.009735 containerd[1911]: time="2025-09-04T01:00:55.009664200Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-89qxc,Uid:427476d1-8991-4f95-a430-df7e5fb9ed65,Namespace:calico-system,Attempt:0,}" Sep 4 01:00:55.020964 containerd[1911]: time="2025-09-04T01:00:55.020912555Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7dbf68dc4b-km665,Uid:76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7,Namespace:calico-system,Attempt:0,}" Sep 4 01:00:55.032474 containerd[1911]: time="2025-09-04T01:00:55.032445231Z" level=error msg="Failed to destroy network for sandbox \"1a4c45bc34dd566b5668278228eb349c1d20a54c16206632ee54bba848812d1e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:55.032923 containerd[1911]: time="2025-09-04T01:00:55.032897475Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-89qxc,Uid:427476d1-8991-4f95-a430-df7e5fb9ed65,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a4c45bc34dd566b5668278228eb349c1d20a54c16206632ee54bba848812d1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:55.033065 kubelet[3273]: E0904 01:00:55.033041 3273 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a4c45bc34dd566b5668278228eb349c1d20a54c16206632ee54bba848812d1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:55.033111 kubelet[3273]: E0904 01:00:55.033080 3273 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a4c45bc34dd566b5668278228eb349c1d20a54c16206632ee54bba848812d1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-89qxc" Sep 4 01:00:55.033111 kubelet[3273]: E0904 01:00:55.033098 3273 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"1a4c45bc34dd566b5668278228eb349c1d20a54c16206632ee54bba848812d1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-89qxc" Sep 4 01:00:55.033174 kubelet[3273]: E0904 01:00:55.033132 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-89qxc_calico-system(427476d1-8991-4f95-a430-df7e5fb9ed65)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-89qxc_calico-system(427476d1-8991-4f95-a430-df7e5fb9ed65)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1a4c45bc34dd566b5668278228eb349c1d20a54c16206632ee54bba848812d1e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-89qxc" podUID="427476d1-8991-4f95-a430-df7e5fb9ed65" Sep 4 01:00:55.043205 containerd[1911]: time="2025-09-04T01:00:55.043177838Z" level=error msg="Failed to destroy network for sandbox \"dd9d195ef903cb972c881a6cfb7aa811392ddf9b5c7bba7420789c1e7647f7db\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:55.043741 containerd[1911]: time="2025-09-04T01:00:55.043723980Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7dbf68dc4b-km665,Uid:76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd9d195ef903cb972c881a6cfb7aa811392ddf9b5c7bba7420789c1e7647f7db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:55.043910 kubelet[3273]: E0904 01:00:55.043888 3273 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd9d195ef903cb972c881a6cfb7aa811392ddf9b5c7bba7420789c1e7647f7db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 01:00:55.043946 kubelet[3273]: E0904 01:00:55.043924 3273 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd9d195ef903cb972c881a6cfb7aa811392ddf9b5c7bba7420789c1e7647f7db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7dbf68dc4b-km665" Sep 4 01:00:55.043946 kubelet[3273]: E0904 01:00:55.043937 3273 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd9d195ef903cb972c881a6cfb7aa811392ddf9b5c7bba7420789c1e7647f7db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7dbf68dc4b-km665" Sep 4 01:00:55.043995 kubelet[3273]: E0904 
01:00:55.043969 3273 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7dbf68dc4b-km665_calico-system(76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7dbf68dc4b-km665_calico-system(76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd9d195ef903cb972c881a6cfb7aa811392ddf9b5c7bba7420789c1e7647f7db\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7dbf68dc4b-km665" podUID="76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7" Sep 4 01:00:55.154193 containerd[1911]: time="2025-09-04T01:00:55.153999307Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 4 01:00:55.788106 systemd[1]: run-netns-cni\x2d401f672c\x2d5164\x2d97fc\x2d268c\x2d267026cc47db.mount: Deactivated successfully. Sep 4 01:01:03.270098 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount888638312.mount: Deactivated successfully. Sep 4 01:01:03.286503 containerd[1911]: time="2025-09-04T01:01:03.286450561Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:03.286674 containerd[1911]: time="2025-09-04T01:01:03.286552015Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 4 01:01:03.287171 containerd[1911]: time="2025-09-04T01:01:03.287159416Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:03.287857 containerd[1911]: time="2025-09-04T01:01:03.287841845Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:03.288150 containerd[1911]: time="2025-09-04T01:01:03.288139246Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 8.134062247s" Sep 4 01:01:03.288184 containerd[1911]: time="2025-09-04T01:01:03.288153629Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 4 01:01:03.291982 containerd[1911]: time="2025-09-04T01:01:03.291963348Z" level=info msg="CreateContainer within sandbox \"79c3a666fd74a4976859e26bdbcf799786db5d95440f7549e8e567f849176563\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 4 01:01:03.295355 containerd[1911]: time="2025-09-04T01:01:03.295314594Z" level=info msg="Container 7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:01:03.299205 containerd[1911]: time="2025-09-04T01:01:03.299167559Z" level=info msg="CreateContainer within sandbox \"79c3a666fd74a4976859e26bdbcf799786db5d95440f7549e8e567f849176563\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id 
\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\"" Sep 4 01:01:03.299446 containerd[1911]: time="2025-09-04T01:01:03.299397603Z" level=info msg="StartContainer for \"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\"" Sep 4 01:01:03.300176 containerd[1911]: time="2025-09-04T01:01:03.300164297Z" level=info msg="connecting to shim 7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7" address="unix:///run/containerd/s/90cda3c5fef613ada18aa40b19f0d20894165878a4172eef483fe0625d47fb2d" protocol=ttrpc version=3 Sep 4 01:01:03.324046 systemd[1]: Started cri-containerd-7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7.scope - libcontainer container 7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7. Sep 4 01:01:03.380129 containerd[1911]: time="2025-09-04T01:01:03.380063847Z" level=info msg="StartContainer for \"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" returns successfully" Sep 4 01:01:03.460660 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 4 01:01:03.460713 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 4 01:01:03.650605 kubelet[3273]: I0904 01:01:03.650378 3273 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7-whisker-backend-key-pair\") pod \"76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7\" (UID: \"76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7\") " Sep 4 01:01:03.650605 kubelet[3273]: I0904 01:01:03.650513 3273 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4hlx\" (UniqueName: \"kubernetes.io/projected/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7-kube-api-access-l4hlx\") pod \"76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7\" (UID: \"76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7\") " Sep 4 01:01:03.651590 kubelet[3273]: I0904 01:01:03.650633 3273 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7-whisker-ca-bundle\") pod \"76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7\" (UID: \"76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7\") " Sep 4 01:01:03.651699 kubelet[3273]: I0904 01:01:03.651597 3273 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7" (UID: "76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 4 01:01:03.656325 kubelet[3273]: I0904 01:01:03.656243 3273 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7" (UID: "76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 4 01:01:03.656538 kubelet[3273]: I0904 01:01:03.656375 3273 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7-kube-api-access-l4hlx" (OuterVolumeSpecName: "kube-api-access-l4hlx") pod "76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7" (UID: "76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7"). 
InnerVolumeSpecName "kube-api-access-l4hlx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 4 01:01:03.752245 kubelet[3273]: I0904 01:01:03.752138 3273 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7-whisker-backend-key-pair\") on node \"ci-4372.1.0-n-d84d506c1c\" DevicePath \"\"" Sep 4 01:01:03.752245 kubelet[3273]: I0904 01:01:03.752202 3273 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l4hlx\" (UniqueName: \"kubernetes.io/projected/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7-kube-api-access-l4hlx\") on node \"ci-4372.1.0-n-d84d506c1c\" DevicePath \"\"" Sep 4 01:01:03.752245 kubelet[3273]: I0904 01:01:03.752231 3273 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7-whisker-ca-bundle\") on node \"ci-4372.1.0-n-d84d506c1c\" DevicePath \"\"" Sep 4 01:01:04.182493 systemd[1]: Removed slice kubepods-besteffort-pod76666ee7_2b21_4bd8_a85d_bb73cd4c2ad7.slice - libcontainer container kubepods-besteffort-pod76666ee7_2b21_4bd8_a85d_bb73cd4c2ad7.slice. Sep 4 01:01:04.192106 kubelet[3273]: I0904 01:01:04.191981 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-5wqj7" podStartSLOduration=1.828753546 podStartE2EDuration="41.191946601s" podCreationTimestamp="2025-09-04 01:00:23 +0000 UTC" firstStartedPulling="2025-09-04 01:00:23.925262984 +0000 UTC m=+19.019136884" lastFinishedPulling="2025-09-04 01:01:03.288456039 +0000 UTC m=+58.382329939" observedRunningTime="2025-09-04 01:01:04.191203259 +0000 UTC m=+59.285077276" watchObservedRunningTime="2025-09-04 01:01:04.191946601 +0000 UTC m=+59.285820542" Sep 4 01:01:04.220094 systemd[1]: Created slice kubepods-besteffort-pod88673c6f_d652_4946_bae4_4808846ac0b8.slice - libcontainer container kubepods-besteffort-pod88673c6f_d652_4946_bae4_4808846ac0b8.slice. Sep 4 01:01:04.274459 systemd[1]: var-lib-kubelet-pods-76666ee7\x2d2b21\x2d4bd8\x2da85d\x2dbb73cd4c2ad7-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dl4hlx.mount: Deactivated successfully. Sep 4 01:01:04.274515 systemd[1]: var-lib-kubelet-pods-76666ee7\x2d2b21\x2d4bd8\x2da85d\x2dbb73cd4c2ad7-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 4 01:01:04.356737 kubelet[3273]: I0904 01:01:04.356593 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7qnq\" (UniqueName: \"kubernetes.io/projected/88673c6f-d652-4946-bae4-4808846ac0b8-kube-api-access-j7qnq\") pod \"whisker-6674d9f6c7-glbpj\" (UID: \"88673c6f-d652-4946-bae4-4808846ac0b8\") " pod="calico-system/whisker-6674d9f6c7-glbpj" Sep 4 01:01:04.357083 kubelet[3273]: I0904 01:01:04.356874 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/88673c6f-d652-4946-bae4-4808846ac0b8-whisker-backend-key-pair\") pod \"whisker-6674d9f6c7-glbpj\" (UID: \"88673c6f-d652-4946-bae4-4808846ac0b8\") " pod="calico-system/whisker-6674d9f6c7-glbpj" Sep 4 01:01:04.357083 kubelet[3273]: I0904 01:01:04.356995 3273 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88673c6f-d652-4946-bae4-4808846ac0b8-whisker-ca-bundle\") pod \"whisker-6674d9f6c7-glbpj\" (UID: \"88673c6f-d652-4946-bae4-4808846ac0b8\") " pod="calico-system/whisker-6674d9f6c7-glbpj" Sep 4 01:01:04.523096 containerd[1911]: time="2025-09-04T01:01:04.522970328Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6674d9f6c7-glbpj,Uid:88673c6f-d652-4946-bae4-4808846ac0b8,Namespace:calico-system,Attempt:0,}" Sep 4 01:01:04.578871 systemd-networkd[1828]: cali47ca67fd607: Link UP Sep 4 01:01:04.579018 systemd-networkd[1828]: cali47ca67fd607: Gained carrier Sep 4 01:01:04.584449 containerd[1911]: 2025-09-04 01:01:04.534 [INFO][4801] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 4 01:01:04.584449 containerd[1911]: 2025-09-04 01:01:04.541 [INFO][4801] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0 whisker-6674d9f6c7- calico-system 88673c6f-d652-4946-bae4-4808846ac0b8 948 0 2025-09-04 01:01:04 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6674d9f6c7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4372.1.0-n-d84d506c1c whisker-6674d9f6c7-glbpj eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali47ca67fd607 [] [] }} ContainerID="7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" Namespace="calico-system" Pod="whisker-6674d9f6c7-glbpj" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-" Sep 4 01:01:04.584449 containerd[1911]: 2025-09-04 01:01:04.541 [INFO][4801] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" Namespace="calico-system" Pod="whisker-6674d9f6c7-glbpj" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0" Sep 4 01:01:04.584449 containerd[1911]: 2025-09-04 01:01:04.553 [INFO][4825] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" HandleID="k8s-pod-network.7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" Workload="ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0" Sep 4 01:01:04.584602 containerd[1911]: 2025-09-04 01:01:04.553 [INFO][4825] ipam/ipam_plugin.go 265: Auto 
assigning IP ContainerID="7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" HandleID="k8s-pod-network.7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" Workload="ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000607260), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-d84d506c1c", "pod":"whisker-6674d9f6c7-glbpj", "timestamp":"2025-09-04 01:01:04.553524277 +0000 UTC"}, Hostname:"ci-4372.1.0-n-d84d506c1c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 01:01:04.584602 containerd[1911]: 2025-09-04 01:01:04.553 [INFO][4825] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 01:01:04.584602 containerd[1911]: 2025-09-04 01:01:04.553 [INFO][4825] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 01:01:04.584602 containerd[1911]: 2025-09-04 01:01:04.553 [INFO][4825] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-d84d506c1c' Sep 4 01:01:04.584602 containerd[1911]: 2025-09-04 01:01:04.558 [INFO][4825] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:04.584602 containerd[1911]: 2025-09-04 01:01:04.561 [INFO][4825] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:04.584602 containerd[1911]: 2025-09-04 01:01:04.564 [INFO][4825] ipam/ipam.go 511: Trying affinity for 192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:04.584602 containerd[1911]: 2025-09-04 01:01:04.565 [INFO][4825] ipam/ipam.go 158: Attempting to load block cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:04.584602 containerd[1911]: 2025-09-04 01:01:04.567 [INFO][4825] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:04.584754 containerd[1911]: 2025-09-04 01:01:04.567 [INFO][4825] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.124.64/26 handle="k8s-pod-network.7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:04.584754 containerd[1911]: 2025-09-04 01:01:04.568 [INFO][4825] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc Sep 4 01:01:04.584754 containerd[1911]: 2025-09-04 01:01:04.570 [INFO][4825] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.124.64/26 handle="k8s-pod-network.7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:04.584754 containerd[1911]: 2025-09-04 01:01:04.573 [INFO][4825] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.124.65/26] block=192.168.124.64/26 handle="k8s-pod-network.7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:04.584754 containerd[1911]: 2025-09-04 01:01:04.573 [INFO][4825] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.124.65/26] handle="k8s-pod-network.7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:04.584754 containerd[1911]: 2025-09-04 01:01:04.573 [INFO][4825] 
ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 01:01:04.584754 containerd[1911]: 2025-09-04 01:01:04.573 [INFO][4825] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.124.65/26] IPv6=[] ContainerID="7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" HandleID="k8s-pod-network.7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" Workload="ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0" Sep 4 01:01:04.584864 containerd[1911]: 2025-09-04 01:01:04.574 [INFO][4801] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" Namespace="calico-system" Pod="whisker-6674d9f6c7-glbpj" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0", GenerateName:"whisker-6674d9f6c7-", Namespace:"calico-system", SelfLink:"", UID:"88673c6f-d652-4946-bae4-4808846ac0b8", ResourceVersion:"948", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 1, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6674d9f6c7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"", Pod:"whisker-6674d9f6c7-glbpj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.124.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali47ca67fd607", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:04.584864 containerd[1911]: 2025-09-04 01:01:04.575 [INFO][4801] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.65/32] ContainerID="7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" Namespace="calico-system" Pod="whisker-6674d9f6c7-glbpj" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0" Sep 4 01:01:04.584920 containerd[1911]: 2025-09-04 01:01:04.575 [INFO][4801] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali47ca67fd607 ContainerID="7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" Namespace="calico-system" Pod="whisker-6674d9f6c7-glbpj" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0" Sep 4 01:01:04.584920 containerd[1911]: 2025-09-04 01:01:04.579 [INFO][4801] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" Namespace="calico-system" Pod="whisker-6674d9f6c7-glbpj" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0" Sep 4 01:01:04.584962 containerd[1911]: 2025-09-04 01:01:04.579 [INFO][4801] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" Namespace="calico-system" Pod="whisker-6674d9f6c7-glbpj" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0", GenerateName:"whisker-6674d9f6c7-", Namespace:"calico-system", SelfLink:"", UID:"88673c6f-d652-4946-bae4-4808846ac0b8", ResourceVersion:"948", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 1, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6674d9f6c7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc", Pod:"whisker-6674d9f6c7-glbpj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.124.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali47ca67fd607", MAC:"7a:0f:99:e1:67:45", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:04.584999 containerd[1911]: 2025-09-04 01:01:04.583 [INFO][4801] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" Namespace="calico-system" Pod="whisker-6674d9f6c7-glbpj" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-whisker--6674d9f6c7--glbpj-eth0" Sep 4 01:01:04.592504 containerd[1911]: time="2025-09-04T01:01:04.592482210Z" level=info msg="connecting to shim 7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc" address="unix:///run/containerd/s/78abb0d0db0e19cb9b7cfa9ef5530f8c4781f267362a18b57dc2ec0f4254744b" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:01:04.614054 systemd[1]: Started cri-containerd-7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc.scope - libcontainer container 7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc. 
Sep 4 01:01:04.664015 containerd[1911]: time="2025-09-04T01:01:04.663947850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6674d9f6c7-glbpj,Uid:88673c6f-d652-4946-bae4-4808846ac0b8,Namespace:calico-system,Attempt:0,} returns sandbox id \"7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc\"" Sep 4 01:01:04.665484 containerd[1911]: time="2025-09-04T01:01:04.665462261Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 4 01:01:04.891899 systemd-networkd[1828]: vxlan.calico: Link UP Sep 4 01:01:04.891903 systemd-networkd[1828]: vxlan.calico: Gained carrier Sep 4 01:01:04.983375 kubelet[3273]: I0904 01:01:04.983356 3273 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7" path="/var/lib/kubelet/pods/76666ee7-2b21-4bd8-a85d-bb73cd4c2ad7/volumes" Sep 4 01:01:05.253148 containerd[1911]: time="2025-09-04T01:01:05.253071373Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"2e987dc83918905bdf51475ac609a582e5851860e93a7bff82d9e9ff9e47c99d\" pid:5163 exit_status:1 exited_at:{seconds:1756947665 nanos:252774625}" Sep 4 01:01:05.973203 systemd-networkd[1828]: vxlan.calico: Gained IPv6LL Sep 4 01:01:06.239875 containerd[1911]: time="2025-09-04T01:01:06.239803642Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"9533097afd5f8b2dcb9eb8b28e1b4d0c2c0f4d8c403f20d2cb4a1e615abda4c8\" pid:5204 exit_status:1 exited_at:{seconds:1756947666 nanos:239642705}" Sep 4 01:01:06.293104 systemd-networkd[1828]: cali47ca67fd607: Gained IPv6LL Sep 4 01:01:06.984692 containerd[1911]: time="2025-09-04T01:01:06.984461822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xc5w2,Uid:8381ffae-bf3c-40d3-9391-b9f0872fdb03,Namespace:kube-system,Attempt:0,}" Sep 4 01:01:06.984692 containerd[1911]: time="2025-09-04T01:01:06.984454970Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zkzwt,Uid:1719e2f2-7b21-4a85-b131-359f01b6a3ae,Namespace:kube-system,Attempt:0,}" Sep 4 01:01:07.054722 systemd-networkd[1828]: cali54a9100bf2c: Link UP Sep 4 01:01:07.054860 systemd-networkd[1828]: cali54a9100bf2c: Gained carrier Sep 4 01:01:07.059561 containerd[1911]: 2025-09-04 01:01:07.006 [INFO][5228] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0 coredns-674b8bbfcf- kube-system 8381ffae-bf3c-40d3-9391-b9f0872fdb03 867 0 2025-09-04 01:00:12 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372.1.0-n-d84d506c1c coredns-674b8bbfcf-xc5w2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali54a9100bf2c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" Namespace="kube-system" Pod="coredns-674b8bbfcf-xc5w2" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-" Sep 4 01:01:07.059561 containerd[1911]: 2025-09-04 01:01:07.006 [INFO][5228] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-xc5w2" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0" Sep 4 01:01:07.059561 containerd[1911]: 2025-09-04 01:01:07.019 [INFO][5275] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" HandleID="k8s-pod-network.6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" Workload="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0" Sep 4 01:01:07.059731 containerd[1911]: 2025-09-04 01:01:07.019 [INFO][5275] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" HandleID="k8s-pod-network.6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" Workload="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f6b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372.1.0-n-d84d506c1c", "pod":"coredns-674b8bbfcf-xc5w2", "timestamp":"2025-09-04 01:01:07.019148629 +0000 UTC"}, Hostname:"ci-4372.1.0-n-d84d506c1c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 01:01:07.059731 containerd[1911]: 2025-09-04 01:01:07.019 [INFO][5275] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 01:01:07.059731 containerd[1911]: 2025-09-04 01:01:07.019 [INFO][5275] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 01:01:07.059731 containerd[1911]: 2025-09-04 01:01:07.019 [INFO][5275] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-d84d506c1c' Sep 4 01:01:07.059731 containerd[1911]: 2025-09-04 01:01:07.024 [INFO][5275] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.059731 containerd[1911]: 2025-09-04 01:01:07.027 [INFO][5275] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.059731 containerd[1911]: 2025-09-04 01:01:07.030 [INFO][5275] ipam/ipam.go 511: Trying affinity for 192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.059731 containerd[1911]: 2025-09-04 01:01:07.031 [INFO][5275] ipam/ipam.go 158: Attempting to load block cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.059731 containerd[1911]: 2025-09-04 01:01:07.033 [INFO][5275] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.059919 containerd[1911]: 2025-09-04 01:01:07.033 [INFO][5275] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.124.64/26 handle="k8s-pod-network.6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.059919 containerd[1911]: 2025-09-04 01:01:07.034 [INFO][5275] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918 Sep 4 01:01:07.059919 containerd[1911]: 2025-09-04 01:01:07.050 [INFO][5275] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.124.64/26 handle="k8s-pod-network.6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.059919 
containerd[1911]: 2025-09-04 01:01:07.053 [INFO][5275] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.124.66/26] block=192.168.124.64/26 handle="k8s-pod-network.6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.059919 containerd[1911]: 2025-09-04 01:01:07.053 [INFO][5275] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.124.66/26] handle="k8s-pod-network.6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.059919 containerd[1911]: 2025-09-04 01:01:07.053 [INFO][5275] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 01:01:07.059919 containerd[1911]: 2025-09-04 01:01:07.053 [INFO][5275] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.124.66/26] IPv6=[] ContainerID="6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" HandleID="k8s-pod-network.6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" Workload="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0" Sep 4 01:01:07.060044 containerd[1911]: 2025-09-04 01:01:07.053 [INFO][5228] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" Namespace="kube-system" Pod="coredns-674b8bbfcf-xc5w2" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"8381ffae-bf3c-40d3-9391-b9f0872fdb03", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"", Pod:"coredns-674b8bbfcf-xc5w2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.124.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali54a9100bf2c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:07.060044 containerd[1911]: 2025-09-04 01:01:07.054 [INFO][5228] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.66/32] ContainerID="6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" Namespace="kube-system" Pod="coredns-674b8bbfcf-xc5w2" 
WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0" Sep 4 01:01:07.060044 containerd[1911]: 2025-09-04 01:01:07.054 [INFO][5228] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali54a9100bf2c ContainerID="6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" Namespace="kube-system" Pod="coredns-674b8bbfcf-xc5w2" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0" Sep 4 01:01:07.060044 containerd[1911]: 2025-09-04 01:01:07.054 [INFO][5228] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" Namespace="kube-system" Pod="coredns-674b8bbfcf-xc5w2" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0" Sep 4 01:01:07.060044 containerd[1911]: 2025-09-04 01:01:07.055 [INFO][5228] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" Namespace="kube-system" Pod="coredns-674b8bbfcf-xc5w2" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"8381ffae-bf3c-40d3-9391-b9f0872fdb03", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918", Pod:"coredns-674b8bbfcf-xc5w2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.124.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali54a9100bf2c", MAC:"c6:41:d0:1a:ff:14", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:07.060044 containerd[1911]: 2025-09-04 01:01:07.058 [INFO][5228] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" Namespace="kube-system" Pod="coredns-674b8bbfcf-xc5w2" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--xc5w2-eth0" Sep 4 01:01:07.068190 containerd[1911]: time="2025-09-04T01:01:07.068162988Z" level=info msg="connecting to shim 
6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918" address="unix:///run/containerd/s/69cc528313ffc7b50a55d84750e9b7b92383d3d19df9f1cc84ae1c832b5b7725" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:01:07.089001 systemd[1]: Started cri-containerd-6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918.scope - libcontainer container 6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918. Sep 4 01:01:07.114253 containerd[1911]: time="2025-09-04T01:01:07.114202278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xc5w2,Uid:8381ffae-bf3c-40d3-9391-b9f0872fdb03,Namespace:kube-system,Attempt:0,} returns sandbox id \"6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918\"" Sep 4 01:01:07.116338 containerd[1911]: time="2025-09-04T01:01:07.116316208Z" level=info msg="CreateContainer within sandbox \"6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 01:01:07.119338 containerd[1911]: time="2025-09-04T01:01:07.119322789Z" level=info msg="Container 9204af8cbaf9d04939df682fee4c007a3e343bebb13ec8df0799bd88d76d5506: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:01:07.121582 containerd[1911]: time="2025-09-04T01:01:07.121569700Z" level=info msg="CreateContainer within sandbox \"6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9204af8cbaf9d04939df682fee4c007a3e343bebb13ec8df0799bd88d76d5506\"" Sep 4 01:01:07.121804 containerd[1911]: time="2025-09-04T01:01:07.121793694Z" level=info msg="StartContainer for \"9204af8cbaf9d04939df682fee4c007a3e343bebb13ec8df0799bd88d76d5506\"" Sep 4 01:01:07.122198 containerd[1911]: time="2025-09-04T01:01:07.122163479Z" level=info msg="connecting to shim 9204af8cbaf9d04939df682fee4c007a3e343bebb13ec8df0799bd88d76d5506" address="unix:///run/containerd/s/69cc528313ffc7b50a55d84750e9b7b92383d3d19df9f1cc84ae1c832b5b7725" protocol=ttrpc version=3 Sep 4 01:01:07.138125 systemd-networkd[1828]: caliaa0fe9220e3: Link UP Sep 4 01:01:07.138234 systemd-networkd[1828]: caliaa0fe9220e3: Gained carrier Sep 4 01:01:07.139896 systemd[1]: Started cri-containerd-9204af8cbaf9d04939df682fee4c007a3e343bebb13ec8df0799bd88d76d5506.scope - libcontainer container 9204af8cbaf9d04939df682fee4c007a3e343bebb13ec8df0799bd88d76d5506. 
Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.007 [INFO][5240] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0 coredns-674b8bbfcf- kube-system 1719e2f2-7b21-4a85-b131-359f01b6a3ae 870 0 2025-09-04 01:00:12 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372.1.0-n-d84d506c1c coredns-674b8bbfcf-zkzwt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliaa0fe9220e3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" Namespace="kube-system" Pod="coredns-674b8bbfcf-zkzwt" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-" Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.007 [INFO][5240] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" Namespace="kube-system" Pod="coredns-674b8bbfcf-zkzwt" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0" Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.019 [INFO][5277] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" HandleID="k8s-pod-network.139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" Workload="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0" Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.019 [INFO][5277] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" HandleID="k8s-pod-network.139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" Workload="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000351770), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372.1.0-n-d84d506c1c", "pod":"coredns-674b8bbfcf-zkzwt", "timestamp":"2025-09-04 01:01:07.019152249 +0000 UTC"}, Hostname:"ci-4372.1.0-n-d84d506c1c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.019 [INFO][5277] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.053 [INFO][5277] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.053 [INFO][5277] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-d84d506c1c' Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.124 [INFO][5277] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.128 [INFO][5277] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.130 [INFO][5277] ipam/ipam.go 511: Trying affinity for 192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.130 [INFO][5277] ipam/ipam.go 158: Attempting to load block cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.131 [INFO][5277] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.131 [INFO][5277] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.124.64/26 handle="k8s-pod-network.139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.132 [INFO][5277] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.134 [INFO][5277] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.124.64/26 handle="k8s-pod-network.139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.136 [INFO][5277] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.124.67/26] block=192.168.124.64/26 handle="k8s-pod-network.139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.136 [INFO][5277] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.124.67/26] handle="k8s-pod-network.139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.136 [INFO][5277] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 01:01:07.143859 containerd[1911]: 2025-09-04 01:01:07.136 [INFO][5277] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.124.67/26] IPv6=[] ContainerID="139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" HandleID="k8s-pod-network.139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" Workload="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0" Sep 4 01:01:07.144282 containerd[1911]: 2025-09-04 01:01:07.137 [INFO][5240] cni-plugin/k8s.go 418: Populated endpoint ContainerID="139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" Namespace="kube-system" Pod="coredns-674b8bbfcf-zkzwt" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"1719e2f2-7b21-4a85-b131-359f01b6a3ae", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"", Pod:"coredns-674b8bbfcf-zkzwt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.124.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliaa0fe9220e3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:07.144282 containerd[1911]: 2025-09-04 01:01:07.137 [INFO][5240] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.67/32] ContainerID="139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" Namespace="kube-system" Pod="coredns-674b8bbfcf-zkzwt" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0" Sep 4 01:01:07.144282 containerd[1911]: 2025-09-04 01:01:07.137 [INFO][5240] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaa0fe9220e3 ContainerID="139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" Namespace="kube-system" Pod="coredns-674b8bbfcf-zkzwt" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0" Sep 4 01:01:07.144282 containerd[1911]: 2025-09-04 01:01:07.138 [INFO][5240] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-zkzwt" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0" Sep 4 01:01:07.144282 containerd[1911]: 2025-09-04 01:01:07.138 [INFO][5240] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" Namespace="kube-system" Pod="coredns-674b8bbfcf-zkzwt" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"1719e2f2-7b21-4a85-b131-359f01b6a3ae", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd", Pod:"coredns-674b8bbfcf-zkzwt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.124.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliaa0fe9220e3", MAC:"56:02:e9:c0:37:bb", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:07.144282 containerd[1911]: 2025-09-04 01:01:07.142 [INFO][5240] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" Namespace="kube-system" Pod="coredns-674b8bbfcf-zkzwt" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-coredns--674b8bbfcf--zkzwt-eth0" Sep 4 01:01:07.162004 containerd[1911]: time="2025-09-04T01:01:07.161978596Z" level=info msg="StartContainer for \"9204af8cbaf9d04939df682fee4c007a3e343bebb13ec8df0799bd88d76d5506\" returns successfully" Sep 4 01:01:07.167910 containerd[1911]: time="2025-09-04T01:01:07.167877543Z" level=info msg="connecting to shim 139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd" address="unix:///run/containerd/s/90ebeaf9f4ca5b5bd423ba63cff1787b936f735d43e49267a706fd8380aff18b" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:01:07.186673 kubelet[3273]: I0904 01:01:07.186636 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-xc5w2" podStartSLOduration=55.186624422 podStartE2EDuration="55.186624422s" podCreationTimestamp="2025-09-04 01:00:12 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 01:01:07.186385558 +0000 UTC m=+62.280259458" watchObservedRunningTime="2025-09-04 01:01:07.186624422 +0000 UTC m=+62.280498319" Sep 4 01:01:07.191911 systemd[1]: Started cri-containerd-139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd.scope - libcontainer container 139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd. Sep 4 01:01:07.220654 containerd[1911]: time="2025-09-04T01:01:07.220603001Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zkzwt,Uid:1719e2f2-7b21-4a85-b131-359f01b6a3ae,Namespace:kube-system,Attempt:0,} returns sandbox id \"139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd\"" Sep 4 01:01:07.224597 containerd[1911]: time="2025-09-04T01:01:07.224581556Z" level=info msg="CreateContainer within sandbox \"139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 01:01:07.227476 containerd[1911]: time="2025-09-04T01:01:07.227431385Z" level=info msg="Container c90f3eec98236cdcdf66d023616592b087692a62295787d80d96a22d1dce16b0: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:01:07.229570 containerd[1911]: time="2025-09-04T01:01:07.229527746Z" level=info msg="CreateContainer within sandbox \"139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c90f3eec98236cdcdf66d023616592b087692a62295787d80d96a22d1dce16b0\"" Sep 4 01:01:07.229748 containerd[1911]: time="2025-09-04T01:01:07.229737307Z" level=info msg="StartContainer for \"c90f3eec98236cdcdf66d023616592b087692a62295787d80d96a22d1dce16b0\"" Sep 4 01:01:07.230179 containerd[1911]: time="2025-09-04T01:01:07.230165538Z" level=info msg="connecting to shim c90f3eec98236cdcdf66d023616592b087692a62295787d80d96a22d1dce16b0" address="unix:///run/containerd/s/90ebeaf9f4ca5b5bd423ba63cff1787b936f735d43e49267a706fd8380aff18b" protocol=ttrpc version=3 Sep 4 01:01:07.245307 systemd[1]: Started cri-containerd-c90f3eec98236cdcdf66d023616592b087692a62295787d80d96a22d1dce16b0.scope - libcontainer container c90f3eec98236cdcdf66d023616592b087692a62295787d80d96a22d1dce16b0. 
Sep 4 01:01:07.296373 containerd[1911]: time="2025-09-04T01:01:07.296340853Z" level=info msg="StartContainer for \"c90f3eec98236cdcdf66d023616592b087692a62295787d80d96a22d1dce16b0\" returns successfully" Sep 4 01:01:07.983173 containerd[1911]: time="2025-09-04T01:01:07.983110209Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b6dc69f69-7r7n6,Uid:8b334fed-4398-41d5-a0d5-cb7e7c6b3608,Namespace:calico-apiserver,Attempt:0,}" Sep 4 01:01:07.983289 containerd[1911]: time="2025-09-04T01:01:07.983119083Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-ff5f6d785-tbvw4,Uid:a5cd7763-0c95-480d-bd8f-2abfda1f515e,Namespace:calico-system,Attempt:0,}" Sep 4 01:01:08.038627 systemd-networkd[1828]: cali324f6c3e6c7: Link UP Sep 4 01:01:08.038759 systemd-networkd[1828]: cali324f6c3e6c7: Gained carrier Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.002 [INFO][5521] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0 calico-kube-controllers-ff5f6d785- calico-system a5cd7763-0c95-480d-bd8f-2abfda1f515e 866 0 2025-09-04 01:00:23 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:ff5f6d785 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4372.1.0-n-d84d506c1c calico-kube-controllers-ff5f6d785-tbvw4 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali324f6c3e6c7 [] [] }} ContainerID="34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" Namespace="calico-system" Pod="calico-kube-controllers-ff5f6d785-tbvw4" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-" Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.002 [INFO][5521] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" Namespace="calico-system" Pod="calico-kube-controllers-ff5f6d785-tbvw4" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0" Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.016 [INFO][5559] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" HandleID="k8s-pod-network.34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" Workload="ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0" Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.016 [INFO][5559] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" HandleID="k8s-pod-network.34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" Workload="ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000711b90), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-d84d506c1c", "pod":"calico-kube-controllers-ff5f6d785-tbvw4", "timestamp":"2025-09-04 01:01:08.016151903 +0000 UTC"}, Hostname:"ci-4372.1.0-n-d84d506c1c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.016 [INFO][5559] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.016 [INFO][5559] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.016 [INFO][5559] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-d84d506c1c' Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.021 [INFO][5559] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.024 [INFO][5559] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.028 [INFO][5559] ipam/ipam.go 511: Trying affinity for 192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.031 [INFO][5559] ipam/ipam.go 158: Attempting to load block cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.032 [INFO][5559] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.032 [INFO][5559] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.124.64/26 handle="k8s-pod-network.34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.033 [INFO][5559] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00 Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.034 [INFO][5559] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.124.64/26 handle="k8s-pod-network.34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.036 [INFO][5559] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.124.68/26] block=192.168.124.64/26 handle="k8s-pod-network.34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.036 [INFO][5559] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.124.68/26] handle="k8s-pod-network.34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.036 [INFO][5559] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 01:01:08.043429 containerd[1911]: 2025-09-04 01:01:08.036 [INFO][5559] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.124.68/26] IPv6=[] ContainerID="34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" HandleID="k8s-pod-network.34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" Workload="ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0" Sep 4 01:01:08.043862 containerd[1911]: 2025-09-04 01:01:08.037 [INFO][5521] cni-plugin/k8s.go 418: Populated endpoint ContainerID="34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" Namespace="calico-system" Pod="calico-kube-controllers-ff5f6d785-tbvw4" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0", GenerateName:"calico-kube-controllers-ff5f6d785-", Namespace:"calico-system", SelfLink:"", UID:"a5cd7763-0c95-480d-bd8f-2abfda1f515e", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"ff5f6d785", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"", Pod:"calico-kube-controllers-ff5f6d785-tbvw4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.124.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali324f6c3e6c7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:08.043862 containerd[1911]: 2025-09-04 01:01:08.037 [INFO][5521] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.68/32] ContainerID="34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" Namespace="calico-system" Pod="calico-kube-controllers-ff5f6d785-tbvw4" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0" Sep 4 01:01:08.043862 containerd[1911]: 2025-09-04 01:01:08.037 [INFO][5521] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali324f6c3e6c7 ContainerID="34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" Namespace="calico-system" Pod="calico-kube-controllers-ff5f6d785-tbvw4" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0" Sep 4 01:01:08.043862 containerd[1911]: 2025-09-04 01:01:08.038 [INFO][5521] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" Namespace="calico-system" Pod="calico-kube-controllers-ff5f6d785-tbvw4" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0" Sep 4 
01:01:08.043862 containerd[1911]: 2025-09-04 01:01:08.039 [INFO][5521] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" Namespace="calico-system" Pod="calico-kube-controllers-ff5f6d785-tbvw4" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0", GenerateName:"calico-kube-controllers-ff5f6d785-", Namespace:"calico-system", SelfLink:"", UID:"a5cd7763-0c95-480d-bd8f-2abfda1f515e", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"ff5f6d785", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00", Pod:"calico-kube-controllers-ff5f6d785-tbvw4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.124.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali324f6c3e6c7", MAC:"26:d6:c5:e1:dd:1b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:08.043862 containerd[1911]: 2025-09-04 01:01:08.042 [INFO][5521] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" Namespace="calico-system" Pod="calico-kube-controllers-ff5f6d785-tbvw4" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--kube--controllers--ff5f6d785--tbvw4-eth0" Sep 4 01:01:08.051720 containerd[1911]: time="2025-09-04T01:01:08.051693837Z" level=info msg="connecting to shim 34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00" address="unix:///run/containerd/s/75a92b770188c42c87facfe2e2780ebcee1d4954126ca0aedb9828f6674699f3" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:01:08.074058 systemd[1]: Started cri-containerd-34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00.scope - libcontainer container 34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00. 
Sep 4 01:01:08.121036 containerd[1911]: time="2025-09-04T01:01:08.120984667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-ff5f6d785-tbvw4,Uid:a5cd7763-0c95-480d-bd8f-2abfda1f515e,Namespace:calico-system,Attempt:0,} returns sandbox id \"34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00\"" Sep 4 01:01:08.156805 systemd-networkd[1828]: califa3b6a9e37a: Link UP Sep 4 01:01:08.157041 systemd-networkd[1828]: califa3b6a9e37a: Gained carrier Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.003 [INFO][5515] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0 calico-apiserver-6b6dc69f69- calico-apiserver 8b334fed-4398-41d5-a0d5-cb7e7c6b3608 871 0 2025-09-04 01:00:21 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b6dc69f69 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-n-d84d506c1c calico-apiserver-6b6dc69f69-7r7n6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] califa3b6a9e37a [] [] }} ContainerID="4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-7r7n6" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-" Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.003 [INFO][5515] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-7r7n6" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0" Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.016 [INFO][5561] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" HandleID="k8s-pod-network.4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" Workload="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0" Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.016 [INFO][5561] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" HandleID="k8s-pod-network.4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" Workload="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f730), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-n-d84d506c1c", "pod":"calico-apiserver-6b6dc69f69-7r7n6", "timestamp":"2025-09-04 01:01:08.016019597 +0000 UTC"}, Hostname:"ci-4372.1.0-n-d84d506c1c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.016 [INFO][5561] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.036 [INFO][5561] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.036 [INFO][5561] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-d84d506c1c' Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.122 [INFO][5561] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.124 [INFO][5561] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.128 [INFO][5561] ipam/ipam.go 511: Trying affinity for 192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.129 [INFO][5561] ipam/ipam.go 158: Attempting to load block cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.130 [INFO][5561] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.130 [INFO][5561] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.124.64/26 handle="k8s-pod-network.4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.131 [INFO][5561] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0 Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.132 [INFO][5561] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.124.64/26 handle="k8s-pod-network.4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.154 [INFO][5561] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.124.69/26] block=192.168.124.64/26 handle="k8s-pod-network.4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.154 [INFO][5561] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.124.69/26] handle="k8s-pod-network.4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.154 [INFO][5561] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 01:01:08.164881 containerd[1911]: 2025-09-04 01:01:08.154 [INFO][5561] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.124.69/26] IPv6=[] ContainerID="4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" HandleID="k8s-pod-network.4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" Workload="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0" Sep 4 01:01:08.165463 containerd[1911]: 2025-09-04 01:01:08.155 [INFO][5515] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-7r7n6" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0", GenerateName:"calico-apiserver-6b6dc69f69-", Namespace:"calico-apiserver", SelfLink:"", UID:"8b334fed-4398-41d5-a0d5-cb7e7c6b3608", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b6dc69f69", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"", Pod:"calico-apiserver-6b6dc69f69-7r7n6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.124.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califa3b6a9e37a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:08.165463 containerd[1911]: 2025-09-04 01:01:08.155 [INFO][5515] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.69/32] ContainerID="4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-7r7n6" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0" Sep 4 01:01:08.165463 containerd[1911]: 2025-09-04 01:01:08.155 [INFO][5515] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califa3b6a9e37a ContainerID="4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-7r7n6" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0" Sep 4 01:01:08.165463 containerd[1911]: 2025-09-04 01:01:08.157 [INFO][5515] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-7r7n6" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0" Sep 4 01:01:08.165463 containerd[1911]: 2025-09-04 01:01:08.157 [INFO][5515] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-7r7n6" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0", GenerateName:"calico-apiserver-6b6dc69f69-", Namespace:"calico-apiserver", SelfLink:"", UID:"8b334fed-4398-41d5-a0d5-cb7e7c6b3608", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b6dc69f69", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0", Pod:"calico-apiserver-6b6dc69f69-7r7n6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.124.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califa3b6a9e37a", MAC:"ee:24:2d:62:1f:8d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:08.165463 containerd[1911]: 2025-09-04 01:01:08.163 [INFO][5515] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-7r7n6" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--7r7n6-eth0" Sep 4 01:01:08.173050 containerd[1911]: time="2025-09-04T01:01:08.173026480Z" level=info msg="connecting to shim 4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0" address="unix:///run/containerd/s/e646aa7078ecae87faae2b73c0982461ef3386ac61951509bb7e9703b51636c3" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:01:08.188005 kubelet[3273]: I0904 01:01:08.187973 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-zkzwt" podStartSLOduration=56.187962158 podStartE2EDuration="56.187962158s" podCreationTimestamp="2025-09-04 01:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 01:01:08.187742657 +0000 UTC m=+63.281616567" watchObservedRunningTime="2025-09-04 01:01:08.187962158 +0000 UTC m=+63.281836055" Sep 4 01:01:08.189919 systemd[1]: Started cri-containerd-4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0.scope - libcontainer container 4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0. 
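The containerd entries above trace a complete Calico CNI ADD for pod calico-apiserver-6b6dc69f69-7r7n6: the host-wide IPAM lock is acquired, the node's affinity to block 192.168.124.64/26 is confirmed, the next free address (192.168.124.69/26) is claimed, and the WorkloadEndpoint (interface califa3b6a9e37a, MAC ee:24:2d:62:1f:8d) is written back to the datastore before containerd connects to the sandbox shim. As an illustration only, and not Calico's actual ipam.go, the minimal Go sketch below picks the first unassigned address in such a /26 block; the set of already-assigned addresses is assumed for the example (only .68 and .69 are confirmed by this excerpt).

// ipam_sketch.go -- minimal illustration (NOT Calico's implementation) of
// claiming the next free IPv4 address from a node-affine block such as
// 192.168.124.64/26, mirroring the ipam.go log lines above.
package main

import (
	"fmt"
	"net/netip"
)

// nextFree returns the first address in the block that is not already assigned.
func nextFree(block netip.Prefix, assigned map[netip.Addr]bool) (netip.Addr, bool) {
	for a := block.Addr(); block.Contains(a); a = a.Next() {
		if !assigned[a] {
			return a, true
		}
	}
	return netip.Addr{}, false
}

func main() {
	block := netip.MustParsePrefix("192.168.124.64/26")
	// Assumed existing allocations; only .68 (calico-kube-controllers) and
	// .69 (calico-apiserver-6b6dc69f69-7r7n6) appear in this log excerpt.
	assigned := map[netip.Addr]bool{
		netip.MustParseAddr("192.168.124.64"): true,
		netip.MustParseAddr("192.168.124.65"): true,
		netip.MustParseAddr("192.168.124.66"): true,
		netip.MustParseAddr("192.168.124.67"): true,
		netip.MustParseAddr("192.168.124.68"): true,
	}
	if ip, ok := nextFree(block, assigned); ok {
		fmt.Println("next free address:", ip) // prints 192.168.124.69, matching the claim above
	}
}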
Sep 4 01:01:08.215672 containerd[1911]: time="2025-09-04T01:01:08.215651782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b6dc69f69-7r7n6,Uid:8b334fed-4398-41d5-a0d5-cb7e7c6b3608,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0\"" Sep 4 01:01:08.644250 containerd[1911]: time="2025-09-04T01:01:08.644226889Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:08.644493 containerd[1911]: time="2025-09-04T01:01:08.644410308Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 4 01:01:08.644675 containerd[1911]: time="2025-09-04T01:01:08.644663096Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:08.645613 containerd[1911]: time="2025-09-04T01:01:08.645602056Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:08.646275 containerd[1911]: time="2025-09-04T01:01:08.646263092Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 3.980679086s" Sep 4 01:01:08.646301 containerd[1911]: time="2025-09-04T01:01:08.646278225Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 4 01:01:08.646665 containerd[1911]: time="2025-09-04T01:01:08.646653581Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 4 01:01:08.647614 containerd[1911]: time="2025-09-04T01:01:08.647603231Z" level=info msg="CreateContainer within sandbox \"7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 4 01:01:08.650034 containerd[1911]: time="2025-09-04T01:01:08.650023881Z" level=info msg="Container 5074d850a38049d0aaa69fa37f4fd1cf7342251ceebfc1103a75c57f8caa52c3: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:01:08.652547 containerd[1911]: time="2025-09-04T01:01:08.652507488Z" level=info msg="CreateContainer within sandbox \"7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"5074d850a38049d0aaa69fa37f4fd1cf7342251ceebfc1103a75c57f8caa52c3\"" Sep 4 01:01:08.652720 containerd[1911]: time="2025-09-04T01:01:08.652707500Z" level=info msg="StartContainer for \"5074d850a38049d0aaa69fa37f4fd1cf7342251ceebfc1103a75c57f8caa52c3\"" Sep 4 01:01:08.653231 containerd[1911]: time="2025-09-04T01:01:08.653220570Z" level=info msg="connecting to shim 5074d850a38049d0aaa69fa37f4fd1cf7342251ceebfc1103a75c57f8caa52c3" address="unix:///run/containerd/s/78abb0d0db0e19cb9b7cfa9ef5530f8c4781f267362a18b57dc2ec0f4254744b" protocol=ttrpc version=3 Sep 4 01:01:08.667079 systemd[1]: Started 
cri-containerd-5074d850a38049d0aaa69fa37f4fd1cf7342251ceebfc1103a75c57f8caa52c3.scope - libcontainer container 5074d850a38049d0aaa69fa37f4fd1cf7342251ceebfc1103a75c57f8caa52c3. Sep 4 01:01:08.696229 containerd[1911]: time="2025-09-04T01:01:08.696208083Z" level=info msg="StartContainer for \"5074d850a38049d0aaa69fa37f4fd1cf7342251ceebfc1103a75c57f8caa52c3\" returns successfully" Sep 4 01:01:09.044995 systemd-networkd[1828]: cali54a9100bf2c: Gained IPv6LL Sep 4 01:01:09.173074 systemd-networkd[1828]: caliaa0fe9220e3: Gained IPv6LL Sep 4 01:01:09.301064 systemd-networkd[1828]: cali324f6c3e6c7: Gained IPv6LL Sep 4 01:01:09.749085 systemd-networkd[1828]: califa3b6a9e37a: Gained IPv6LL Sep 4 01:01:09.984360 containerd[1911]: time="2025-09-04T01:01:09.984223777Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b6dc69f69-skh28,Uid:729b509f-0f97-4ec9-afbb-7f6a547a08ac,Namespace:calico-apiserver,Attempt:0,}" Sep 4 01:01:09.985220 containerd[1911]: time="2025-09-04T01:01:09.984223785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-t2rrl,Uid:7fed3011-6531-4d2d-a58f-628d8600da48,Namespace:calico-system,Attempt:0,}" Sep 4 01:01:09.985220 containerd[1911]: time="2025-09-04T01:01:09.984484675Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-89qxc,Uid:427476d1-8991-4f95-a430-df7e5fb9ed65,Namespace:calico-system,Attempt:0,}" Sep 4 01:01:10.040429 systemd-networkd[1828]: calif0a44a3eb6e: Link UP Sep 4 01:01:10.040547 systemd-networkd[1828]: calif0a44a3eb6e: Gained carrier Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.006 [INFO][5759] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0 goldmane-54d579b49d- calico-system 7fed3011-6531-4d2d-a58f-628d8600da48 872 0 2025-09-04 01:00:23 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4372.1.0-n-d84d506c1c goldmane-54d579b49d-t2rrl eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calif0a44a3eb6e [] [] }} ContainerID="eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" Namespace="calico-system" Pod="goldmane-54d579b49d-t2rrl" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-" Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.006 [INFO][5759] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" Namespace="calico-system" Pod="goldmane-54d579b49d-t2rrl" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0" Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.018 [INFO][5836] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" HandleID="k8s-pod-network.eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" Workload="ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0" Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.018 [INFO][5836] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" HandleID="k8s-pod-network.eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" 
Workload="ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000138250), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-d84d506c1c", "pod":"goldmane-54d579b49d-t2rrl", "timestamp":"2025-09-04 01:01:10.018817924 +0000 UTC"}, Hostname:"ci-4372.1.0-n-d84d506c1c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.019 [INFO][5836] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.019 [INFO][5836] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.019 [INFO][5836] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-d84d506c1c' Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.023 [INFO][5836] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.027 [INFO][5836] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.030 [INFO][5836] ipam/ipam.go 511: Trying affinity for 192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.031 [INFO][5836] ipam/ipam.go 158: Attempting to load block cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.032 [INFO][5836] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.032 [INFO][5836] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.124.64/26 handle="k8s-pod-network.eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.033 [INFO][5836] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.036 [INFO][5836] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.124.64/26 handle="k8s-pod-network.eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.038 [INFO][5836] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.124.70/26] block=192.168.124.64/26 handle="k8s-pod-network.eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.038 [INFO][5836] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.124.70/26] handle="k8s-pod-network.eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.038 [INFO][5836] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 01:01:10.045352 containerd[1911]: 2025-09-04 01:01:10.038 [INFO][5836] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.124.70/26] IPv6=[] ContainerID="eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" HandleID="k8s-pod-network.eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" Workload="ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0" Sep 4 01:01:10.045788 containerd[1911]: 2025-09-04 01:01:10.039 [INFO][5759] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" Namespace="calico-system" Pod="goldmane-54d579b49d-t2rrl" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"7fed3011-6531-4d2d-a58f-628d8600da48", ResourceVersion:"872", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"", Pod:"goldmane-54d579b49d-t2rrl", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.124.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif0a44a3eb6e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:10.045788 containerd[1911]: 2025-09-04 01:01:10.039 [INFO][5759] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.70/32] ContainerID="eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" Namespace="calico-system" Pod="goldmane-54d579b49d-t2rrl" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0" Sep 4 01:01:10.045788 containerd[1911]: 2025-09-04 01:01:10.039 [INFO][5759] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif0a44a3eb6e ContainerID="eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" Namespace="calico-system" Pod="goldmane-54d579b49d-t2rrl" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0" Sep 4 01:01:10.045788 containerd[1911]: 2025-09-04 01:01:10.040 [INFO][5759] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" Namespace="calico-system" Pod="goldmane-54d579b49d-t2rrl" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0" Sep 4 01:01:10.045788 containerd[1911]: 2025-09-04 01:01:10.040 [INFO][5759] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" 
Namespace="calico-system" Pod="goldmane-54d579b49d-t2rrl" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"7fed3011-6531-4d2d-a58f-628d8600da48", ResourceVersion:"872", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e", Pod:"goldmane-54d579b49d-t2rrl", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.124.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif0a44a3eb6e", MAC:"82:00:04:f0:10:66", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:10.045788 containerd[1911]: 2025-09-04 01:01:10.044 [INFO][5759] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" Namespace="calico-system" Pod="goldmane-54d579b49d-t2rrl" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-goldmane--54d579b49d--t2rrl-eth0" Sep 4 01:01:10.052842 containerd[1911]: time="2025-09-04T01:01:10.052810729Z" level=info msg="connecting to shim eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e" address="unix:///run/containerd/s/1b13853f63ce64eef32a2410fd0e996cdaf590eb151fa8d7f203c635e2c31ab4" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:01:10.078979 systemd[1]: Started cri-containerd-eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e.scope - libcontainer container eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e. 
Sep 4 01:01:10.109801 containerd[1911]: time="2025-09-04T01:01:10.109739531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-t2rrl,Uid:7fed3011-6531-4d2d-a58f-628d8600da48,Namespace:calico-system,Attempt:0,} returns sandbox id \"eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e\"" Sep 4 01:01:10.199035 systemd-networkd[1828]: calid1de6aa1c45: Link UP Sep 4 01:01:10.199687 systemd-networkd[1828]: calid1de6aa1c45: Gained carrier Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.004 [INFO][5754] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0 calico-apiserver-6b6dc69f69- calico-apiserver 729b509f-0f97-4ec9-afbb-7f6a547a08ac 868 0 2025-09-04 01:00:21 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b6dc69f69 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-n-d84d506c1c calico-apiserver-6b6dc69f69-skh28 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid1de6aa1c45 [] [] }} ContainerID="4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-skh28" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-" Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.004 [INFO][5754] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-skh28" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0" Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.019 [INFO][5828] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" HandleID="k8s-pod-network.4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" Workload="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0" Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.020 [INFO][5828] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" HandleID="k8s-pod-network.4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" Workload="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003cf200), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-n-d84d506c1c", "pod":"calico-apiserver-6b6dc69f69-skh28", "timestamp":"2025-09-04 01:01:10.019946228 +0000 UTC"}, Hostname:"ci-4372.1.0-n-d84d506c1c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.020 [INFO][5828] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.038 [INFO][5828] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.038 [INFO][5828] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-d84d506c1c' Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.128 [INFO][5828] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.137 [INFO][5828] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.146 [INFO][5828] ipam/ipam.go 511: Trying affinity for 192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.150 [INFO][5828] ipam/ipam.go 158: Attempting to load block cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.155 [INFO][5828] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.155 [INFO][5828] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.124.64/26 handle="k8s-pod-network.4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.158 [INFO][5828] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181 Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.179 [INFO][5828] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.124.64/26 handle="k8s-pod-network.4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.191 [INFO][5828] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.124.71/26] block=192.168.124.64/26 handle="k8s-pod-network.4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.191 [INFO][5828] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.124.71/26] handle="k8s-pod-network.4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.191 [INFO][5828] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 01:01:10.212251 containerd[1911]: 2025-09-04 01:01:10.191 [INFO][5828] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.124.71/26] IPv6=[] ContainerID="4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" HandleID="k8s-pod-network.4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" Workload="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0" Sep 4 01:01:10.214029 containerd[1911]: 2025-09-04 01:01:10.195 [INFO][5754] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-skh28" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0", GenerateName:"calico-apiserver-6b6dc69f69-", Namespace:"calico-apiserver", SelfLink:"", UID:"729b509f-0f97-4ec9-afbb-7f6a547a08ac", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b6dc69f69", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"", Pod:"calico-apiserver-6b6dc69f69-skh28", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.124.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid1de6aa1c45", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:10.214029 containerd[1911]: 2025-09-04 01:01:10.196 [INFO][5754] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.71/32] ContainerID="4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-skh28" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0" Sep 4 01:01:10.214029 containerd[1911]: 2025-09-04 01:01:10.196 [INFO][5754] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid1de6aa1c45 ContainerID="4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-skh28" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0" Sep 4 01:01:10.214029 containerd[1911]: 2025-09-04 01:01:10.200 [INFO][5754] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-skh28" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0" Sep 4 01:01:10.214029 containerd[1911]: 2025-09-04 01:01:10.200 [INFO][5754] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-skh28" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0", GenerateName:"calico-apiserver-6b6dc69f69-", Namespace:"calico-apiserver", SelfLink:"", UID:"729b509f-0f97-4ec9-afbb-7f6a547a08ac", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b6dc69f69", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181", Pod:"calico-apiserver-6b6dc69f69-skh28", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.124.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid1de6aa1c45", MAC:"da:4f:cc:ef:7f:4a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:10.214029 containerd[1911]: 2025-09-04 01:01:10.209 [INFO][5754] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" Namespace="calico-apiserver" Pod="calico-apiserver-6b6dc69f69-skh28" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-calico--apiserver--6b6dc69f69--skh28-eth0" Sep 4 01:01:10.238516 containerd[1911]: time="2025-09-04T01:01:10.238491195Z" level=info msg="connecting to shim 4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181" address="unix:///run/containerd/s/b443fb534a02a52c0025ceb74da150547befa8ed0c42308a72104ba53ea3bb8d" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:01:10.252307 systemd-networkd[1828]: cali0361bbf1c68: Link UP Sep 4 01:01:10.252511 systemd-networkd[1828]: cali0361bbf1c68: Gained carrier Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.005 [INFO][5770] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0 csi-node-driver- calico-system 427476d1-8991-4f95-a430-df7e5fb9ed65 698 0 2025-09-04 01:00:23 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4372.1.0-n-d84d506c1c csi-node-driver-89qxc eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] 
cali0361bbf1c68 [] [] }} ContainerID="113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" Namespace="calico-system" Pod="csi-node-driver-89qxc" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-" Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.006 [INFO][5770] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" Namespace="calico-system" Pod="csi-node-driver-89qxc" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0" Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.020 [INFO][5834] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" HandleID="k8s-pod-network.113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" Workload="ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0" Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.020 [INFO][5834] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" HandleID="k8s-pod-network.113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" Workload="ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000139550), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-d84d506c1c", "pod":"csi-node-driver-89qxc", "timestamp":"2025-09-04 01:01:10.020807074 +0000 UTC"}, Hostname:"ci-4372.1.0-n-d84d506c1c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.020 [INFO][5834] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.191 [INFO][5834] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.191 [INFO][5834] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-d84d506c1c' Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.225 [INFO][5834] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.236 [INFO][5834] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.242 [INFO][5834] ipam/ipam.go 511: Trying affinity for 192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.243 [INFO][5834] ipam/ipam.go 158: Attempting to load block cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.244 [INFO][5834] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.124.64/26 host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.244 [INFO][5834] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.124.64/26 handle="k8s-pod-network.113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.245 [INFO][5834] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201 Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.247 [INFO][5834] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.124.64/26 handle="k8s-pod-network.113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.250 [INFO][5834] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.124.72/26] block=192.168.124.64/26 handle="k8s-pod-network.113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.250 [INFO][5834] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.124.72/26] handle="k8s-pod-network.113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" host="ci-4372.1.0-n-d84d506c1c" Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.250 [INFO][5834] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 01:01:10.257711 containerd[1911]: 2025-09-04 01:01:10.250 [INFO][5834] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.124.72/26] IPv6=[] ContainerID="113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" HandleID="k8s-pod-network.113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" Workload="ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0" Sep 4 01:01:10.258148 containerd[1911]: 2025-09-04 01:01:10.251 [INFO][5770] cni-plugin/k8s.go 418: Populated endpoint ContainerID="113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" Namespace="calico-system" Pod="csi-node-driver-89qxc" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"427476d1-8991-4f95-a430-df7e5fb9ed65", ResourceVersion:"698", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"", Pod:"csi-node-driver-89qxc", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.124.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0361bbf1c68", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:10.258148 containerd[1911]: 2025-09-04 01:01:10.251 [INFO][5770] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.72/32] ContainerID="113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" Namespace="calico-system" Pod="csi-node-driver-89qxc" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0" Sep 4 01:01:10.258148 containerd[1911]: 2025-09-04 01:01:10.251 [INFO][5770] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0361bbf1c68 ContainerID="113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" Namespace="calico-system" Pod="csi-node-driver-89qxc" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0" Sep 4 01:01:10.258148 containerd[1911]: 2025-09-04 01:01:10.252 [INFO][5770] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" Namespace="calico-system" Pod="csi-node-driver-89qxc" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0" Sep 4 01:01:10.258148 containerd[1911]: 2025-09-04 01:01:10.252 [INFO][5770] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" Namespace="calico-system" Pod="csi-node-driver-89qxc" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"427476d1-8991-4f95-a430-df7e5fb9ed65", ResourceVersion:"698", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 1, 0, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-d84d506c1c", ContainerID:"113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201", Pod:"csi-node-driver-89qxc", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.124.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0361bbf1c68", MAC:"3e:3a:62:29:ed:2a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 01:01:10.258148 containerd[1911]: 2025-09-04 01:01:10.256 [INFO][5770] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" Namespace="calico-system" Pod="csi-node-driver-89qxc" WorkloadEndpoint="ci--4372.1.0--n--d84d506c1c-k8s-csi--node--driver--89qxc-eth0" Sep 4 01:01:10.261890 systemd[1]: Started cri-containerd-4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181.scope - libcontainer container 4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181. Sep 4 01:01:10.265261 containerd[1911]: time="2025-09-04T01:01:10.265232520Z" level=info msg="connecting to shim 113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201" address="unix:///run/containerd/s/57ba19fcbcaa2d4b65066ea4025a98209e34cc4ad10107a9f19f80accf0cee4e" namespace=k8s.io protocol=ttrpc version=3 Sep 4 01:01:10.272942 systemd[1]: Started cri-containerd-113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201.scope - libcontainer container 113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201. 
Sep 4 01:01:10.283815 containerd[1911]: time="2025-09-04T01:01:10.283790442Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-89qxc,Uid:427476d1-8991-4f95-a430-df7e5fb9ed65,Namespace:calico-system,Attempt:0,} returns sandbox id \"113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201\"" Sep 4 01:01:10.288412 containerd[1911]: time="2025-09-04T01:01:10.288392417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b6dc69f69-skh28,Uid:729b509f-0f97-4ec9-afbb-7f6a547a08ac,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181\"" Sep 4 01:01:11.350162 systemd-networkd[1828]: calid1de6aa1c45: Gained IPv6LL Sep 4 01:01:11.733062 systemd-networkd[1828]: calif0a44a3eb6e: Gained IPv6LL Sep 4 01:01:11.797066 systemd-networkd[1828]: cali0361bbf1c68: Gained IPv6LL Sep 4 01:01:13.527199 containerd[1911]: time="2025-09-04T01:01:13.527148111Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:13.527415 containerd[1911]: time="2025-09-04T01:01:13.527381288Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 4 01:01:13.527748 containerd[1911]: time="2025-09-04T01:01:13.527713680Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:13.528521 containerd[1911]: time="2025-09-04T01:01:13.528474058Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:13.529136 containerd[1911]: time="2025-09-04T01:01:13.529093289Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 4.882423076s" Sep 4 01:01:13.529136 containerd[1911]: time="2025-09-04T01:01:13.529110465Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 4 01:01:13.529538 containerd[1911]: time="2025-09-04T01:01:13.529526867Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 4 01:01:13.533001 containerd[1911]: time="2025-09-04T01:01:13.532986779Z" level=info msg="CreateContainer within sandbox \"34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 4 01:01:13.535705 containerd[1911]: time="2025-09-04T01:01:13.535692384Z" level=info msg="Container f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:01:13.538436 containerd[1911]: time="2025-09-04T01:01:13.538420439Z" level=info msg="CreateContainer within sandbox \"34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id 
\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\"" Sep 4 01:01:13.538674 containerd[1911]: time="2025-09-04T01:01:13.538631637Z" level=info msg="StartContainer for \"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\"" Sep 4 01:01:13.539197 containerd[1911]: time="2025-09-04T01:01:13.539161394Z" level=info msg="connecting to shim f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb" address="unix:///run/containerd/s/75a92b770188c42c87facfe2e2780ebcee1d4954126ca0aedb9828f6674699f3" protocol=ttrpc version=3 Sep 4 01:01:13.557037 systemd[1]: Started cri-containerd-f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb.scope - libcontainer container f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb. Sep 4 01:01:13.583870 containerd[1911]: time="2025-09-04T01:01:13.583842787Z" level=info msg="StartContainer for \"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" returns successfully" Sep 4 01:01:14.243602 kubelet[3273]: I0904 01:01:14.243482 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-ff5f6d785-tbvw4" podStartSLOduration=45.835581761 podStartE2EDuration="51.243443563s" podCreationTimestamp="2025-09-04 01:00:23 +0000 UTC" firstStartedPulling="2025-09-04 01:01:08.12161426 +0000 UTC m=+63.215488157" lastFinishedPulling="2025-09-04 01:01:13.529476059 +0000 UTC m=+68.623349959" observedRunningTime="2025-09-04 01:01:14.242367557 +0000 UTC m=+69.336241538" watchObservedRunningTime="2025-09-04 01:01:14.243443563 +0000 UTC m=+69.337317511" Sep 4 01:01:14.306247 containerd[1911]: time="2025-09-04T01:01:14.306214697Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"9a93e8850523c91a5a07014a14671294cd6b11b36bd8c661bdb38fde9a5fe38f\" pid:6127 exited_at:{seconds:1756947674 nanos:306023470}" Sep 4 01:01:19.041385 containerd[1911]: time="2025-09-04T01:01:19.041331878Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:19.041593 containerd[1911]: time="2025-09-04T01:01:19.041530247Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 4 01:01:19.041985 containerd[1911]: time="2025-09-04T01:01:19.041944484Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:19.042767 containerd[1911]: time="2025-09-04T01:01:19.042723078Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:19.043111 containerd[1911]: time="2025-09-04T01:01:19.043072962Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 5.51353171s" Sep 4 01:01:19.043111 containerd[1911]: time="2025-09-04T01:01:19.043090660Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference 
\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 4 01:01:19.043504 containerd[1911]: time="2025-09-04T01:01:19.043491427Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 4 01:01:19.044512 containerd[1911]: time="2025-09-04T01:01:19.044498150Z" level=info msg="CreateContainer within sandbox \"4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 01:01:19.046996 containerd[1911]: time="2025-09-04T01:01:19.046954815Z" level=info msg="Container 2754b2a6c524f5c576c0a1645cd6b318d8e4051c3746adc4656a7ae5d5d2fa09: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:01:19.049539 containerd[1911]: time="2025-09-04T01:01:19.049524797Z" level=info msg="CreateContainer within sandbox \"4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2754b2a6c524f5c576c0a1645cd6b318d8e4051c3746adc4656a7ae5d5d2fa09\"" Sep 4 01:01:19.049821 containerd[1911]: time="2025-09-04T01:01:19.049763115Z" level=info msg="StartContainer for \"2754b2a6c524f5c576c0a1645cd6b318d8e4051c3746adc4656a7ae5d5d2fa09\"" Sep 4 01:01:19.050336 containerd[1911]: time="2025-09-04T01:01:19.050295480Z" level=info msg="connecting to shim 2754b2a6c524f5c576c0a1645cd6b318d8e4051c3746adc4656a7ae5d5d2fa09" address="unix:///run/containerd/s/e646aa7078ecae87faae2b73c0982461ef3386ac61951509bb7e9703b51636c3" protocol=ttrpc version=3 Sep 4 01:01:19.066048 systemd[1]: Started cri-containerd-2754b2a6c524f5c576c0a1645cd6b318d8e4051c3746adc4656a7ae5d5d2fa09.scope - libcontainer container 2754b2a6c524f5c576c0a1645cd6b318d8e4051c3746adc4656a7ae5d5d2fa09. Sep 4 01:01:19.097173 containerd[1911]: time="2025-09-04T01:01:19.097124644Z" level=info msg="StartContainer for \"2754b2a6c524f5c576c0a1645cd6b318d8e4051c3746adc4656a7ae5d5d2fa09\" returns successfully" Sep 4 01:01:19.239129 kubelet[3273]: I0904 01:01:19.239092 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6b6dc69f69-7r7n6" podStartSLOduration=47.41186826 podStartE2EDuration="58.239083096s" podCreationTimestamp="2025-09-04 01:00:21 +0000 UTC" firstStartedPulling="2025-09-04 01:01:08.216234081 +0000 UTC m=+63.310107978" lastFinishedPulling="2025-09-04 01:01:19.043448912 +0000 UTC m=+74.137322814" observedRunningTime="2025-09-04 01:01:19.238713015 +0000 UTC m=+74.332586916" watchObservedRunningTime="2025-09-04 01:01:19.239083096 +0000 UTC m=+74.332956992" Sep 4 01:01:23.021590 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3874120486.mount: Deactivated successfully. 
Sep 4 01:01:23.025582 containerd[1911]: time="2025-09-04T01:01:23.025563633Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:23.025768 containerd[1911]: time="2025-09-04T01:01:23.025753956Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 4 01:01:23.026118 containerd[1911]: time="2025-09-04T01:01:23.026106522Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:23.026988 containerd[1911]: time="2025-09-04T01:01:23.026974702Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:23.027346 containerd[1911]: time="2025-09-04T01:01:23.027335630Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 3.98383129s" Sep 4 01:01:23.027367 containerd[1911]: time="2025-09-04T01:01:23.027350537Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 4 01:01:23.027860 containerd[1911]: time="2025-09-04T01:01:23.027826336Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 4 01:01:23.028827 containerd[1911]: time="2025-09-04T01:01:23.028814830Z" level=info msg="CreateContainer within sandbox \"7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 4 01:01:23.031189 containerd[1911]: time="2025-09-04T01:01:23.031175258Z" level=info msg="Container 3d037b9d179a481ebd7198a1c651decf9f24760cf58c25cb9a9881d7fb3be544: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:01:23.033964 containerd[1911]: time="2025-09-04T01:01:23.033950165Z" level=info msg="CreateContainer within sandbox \"7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"3d037b9d179a481ebd7198a1c651decf9f24760cf58c25cb9a9881d7fb3be544\"" Sep 4 01:01:23.034195 containerd[1911]: time="2025-09-04T01:01:23.034180003Z" level=info msg="StartContainer for \"3d037b9d179a481ebd7198a1c651decf9f24760cf58c25cb9a9881d7fb3be544\"" Sep 4 01:01:23.034698 containerd[1911]: time="2025-09-04T01:01:23.034684842Z" level=info msg="connecting to shim 3d037b9d179a481ebd7198a1c651decf9f24760cf58c25cb9a9881d7fb3be544" address="unix:///run/containerd/s/78abb0d0db0e19cb9b7cfa9ef5530f8c4781f267362a18b57dc2ec0f4254744b" protocol=ttrpc version=3 Sep 4 01:01:23.058994 systemd[1]: Started cri-containerd-3d037b9d179a481ebd7198a1c651decf9f24760cf58c25cb9a9881d7fb3be544.scope - libcontainer container 3d037b9d179a481ebd7198a1c651decf9f24760cf58c25cb9a9881d7fb3be544. 
Sep 4 01:01:23.087518 containerd[1911]: time="2025-09-04T01:01:23.087497308Z" level=info msg="StartContainer for \"3d037b9d179a481ebd7198a1c651decf9f24760cf58c25cb9a9881d7fb3be544\" returns successfully" Sep 4 01:01:23.276581 kubelet[3273]: I0904 01:01:23.276323 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6674d9f6c7-glbpj" podStartSLOduration=0.913626216 podStartE2EDuration="19.276287589s" podCreationTimestamp="2025-09-04 01:01:04 +0000 UTC" firstStartedPulling="2025-09-04 01:01:04.665094487 +0000 UTC m=+59.758968393" lastFinishedPulling="2025-09-04 01:01:23.027755865 +0000 UTC m=+78.121629766" observedRunningTime="2025-09-04 01:01:23.275638538 +0000 UTC m=+78.369512532" watchObservedRunningTime="2025-09-04 01:01:23.276287589 +0000 UTC m=+78.370161539" Sep 4 01:01:25.276404 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3179156554.mount: Deactivated successfully. Sep 4 01:01:25.514902 containerd[1911]: time="2025-09-04T01:01:25.514854894Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:25.515097 containerd[1911]: time="2025-09-04T01:01:25.515017360Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 4 01:01:25.515503 containerd[1911]: time="2025-09-04T01:01:25.515466323Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:25.516337 containerd[1911]: time="2025-09-04T01:01:25.516297395Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:25.516722 containerd[1911]: time="2025-09-04T01:01:25.516686391Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 2.488843353s" Sep 4 01:01:25.516722 containerd[1911]: time="2025-09-04T01:01:25.516701848Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 4 01:01:25.517121 containerd[1911]: time="2025-09-04T01:01:25.517083189Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 4 01:01:25.518056 containerd[1911]: time="2025-09-04T01:01:25.518044882Z" level=info msg="CreateContainer within sandbox \"eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 4 01:01:25.520713 containerd[1911]: time="2025-09-04T01:01:25.520676964Z" level=info msg="Container 12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:01:25.524006 containerd[1911]: time="2025-09-04T01:01:25.523959588Z" level=info msg="CreateContainer within sandbox \"eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\"" Sep 4 
01:01:25.524449 containerd[1911]: time="2025-09-04T01:01:25.524435799Z" level=info msg="StartContainer for \"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\"" Sep 4 01:01:25.524974 containerd[1911]: time="2025-09-04T01:01:25.524961543Z" level=info msg="connecting to shim 12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c" address="unix:///run/containerd/s/1b13853f63ce64eef32a2410fd0e996cdaf590eb151fa8d7f203c635e2c31ab4" protocol=ttrpc version=3 Sep 4 01:01:25.543905 systemd[1]: Started cri-containerd-12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c.scope - libcontainer container 12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c. Sep 4 01:01:25.571844 containerd[1911]: time="2025-09-04T01:01:25.571821037Z" level=info msg="StartContainer for \"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" returns successfully" Sep 4 01:01:26.291462 kubelet[3273]: I0904 01:01:26.291346 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-t2rrl" podStartSLOduration=47.8855004 podStartE2EDuration="1m3.29130815s" podCreationTimestamp="2025-09-04 01:00:23 +0000 UTC" firstStartedPulling="2025-09-04 01:01:10.111232539 +0000 UTC m=+65.205106437" lastFinishedPulling="2025-09-04 01:01:25.517040289 +0000 UTC m=+80.610914187" observedRunningTime="2025-09-04 01:01:26.290072178 +0000 UTC m=+81.383946219" watchObservedRunningTime="2025-09-04 01:01:26.29130815 +0000 UTC m=+81.385182098" Sep 4 01:01:26.376038 containerd[1911]: time="2025-09-04T01:01:26.376013275Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"1cf54a2e102afbe4f2d1feccf5ad1a5432e2894d370025e4336f45a9a0ef6c73\" pid:6335 exit_status:1 exited_at:{seconds:1756947686 nanos:375791761}" Sep 4 01:01:26.988237 containerd[1911]: time="2025-09-04T01:01:26.988216057Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:26.988604 containerd[1911]: time="2025-09-04T01:01:26.988487064Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 4 01:01:26.988790 containerd[1911]: time="2025-09-04T01:01:26.988777760Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:26.989691 containerd[1911]: time="2025-09-04T01:01:26.989677352Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:26.990117 containerd[1911]: time="2025-09-04T01:01:26.990100388Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.473003722s" Sep 4 01:01:26.990159 containerd[1911]: time="2025-09-04T01:01:26.990117908Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 4 01:01:26.990671 containerd[1911]: 
time="2025-09-04T01:01:26.990642429Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 4 01:01:26.991760 containerd[1911]: time="2025-09-04T01:01:26.991743436Z" level=info msg="CreateContainer within sandbox \"113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 4 01:01:26.995583 containerd[1911]: time="2025-09-04T01:01:26.995539821Z" level=info msg="Container 609d52b757901a2415f83a4a69224c50b3da7a290446ad275f4637ff34335772: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:01:26.999183 containerd[1911]: time="2025-09-04T01:01:26.999139583Z" level=info msg="CreateContainer within sandbox \"113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"609d52b757901a2415f83a4a69224c50b3da7a290446ad275f4637ff34335772\"" Sep 4 01:01:26.999527 containerd[1911]: time="2025-09-04T01:01:26.999474678Z" level=info msg="StartContainer for \"609d52b757901a2415f83a4a69224c50b3da7a290446ad275f4637ff34335772\"" Sep 4 01:01:27.000514 containerd[1911]: time="2025-09-04T01:01:27.000482527Z" level=info msg="connecting to shim 609d52b757901a2415f83a4a69224c50b3da7a290446ad275f4637ff34335772" address="unix:///run/containerd/s/57ba19fcbcaa2d4b65066ea4025a98209e34cc4ad10107a9f19f80accf0cee4e" protocol=ttrpc version=3 Sep 4 01:01:27.020084 systemd[1]: Started cri-containerd-609d52b757901a2415f83a4a69224c50b3da7a290446ad275f4637ff34335772.scope - libcontainer container 609d52b757901a2415f83a4a69224c50b3da7a290446ad275f4637ff34335772. Sep 4 01:01:27.040276 containerd[1911]: time="2025-09-04T01:01:27.040254600Z" level=info msg="StartContainer for \"609d52b757901a2415f83a4a69224c50b3da7a290446ad275f4637ff34335772\" returns successfully" Sep 4 01:01:27.342858 containerd[1911]: time="2025-09-04T01:01:27.342796292Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"c24d4d09eab2e784761014807123ed8627efad544bebcaee19cbaff498df542d\" pid:6404 exit_status:1 exited_at:{seconds:1756947687 nanos:342630807}" Sep 4 01:01:27.555908 containerd[1911]: time="2025-09-04T01:01:27.555883632Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:27.556109 containerd[1911]: time="2025-09-04T01:01:27.556093651Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 4 01:01:27.557180 containerd[1911]: time="2025-09-04T01:01:27.557156852Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 566.480729ms" Sep 4 01:01:27.557180 containerd[1911]: time="2025-09-04T01:01:27.557172148Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 4 01:01:27.557693 containerd[1911]: time="2025-09-04T01:01:27.557681497Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 4 01:01:27.558903 containerd[1911]: time="2025-09-04T01:01:27.558866867Z" level=info 
msg="CreateContainer within sandbox \"4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 01:01:27.562373 containerd[1911]: time="2025-09-04T01:01:27.562335484Z" level=info msg="Container 845a69cc2c3463164a8e4fb439d99a512439c8073c6520de84c8370a71f8f4ab: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:01:27.565325 containerd[1911]: time="2025-09-04T01:01:27.565289094Z" level=info msg="CreateContainer within sandbox \"4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"845a69cc2c3463164a8e4fb439d99a512439c8073c6520de84c8370a71f8f4ab\"" Sep 4 01:01:27.565755 containerd[1911]: time="2025-09-04T01:01:27.565739602Z" level=info msg="StartContainer for \"845a69cc2c3463164a8e4fb439d99a512439c8073c6520de84c8370a71f8f4ab\"" Sep 4 01:01:27.566471 containerd[1911]: time="2025-09-04T01:01:27.566457382Z" level=info msg="connecting to shim 845a69cc2c3463164a8e4fb439d99a512439c8073c6520de84c8370a71f8f4ab" address="unix:///run/containerd/s/b443fb534a02a52c0025ceb74da150547befa8ed0c42308a72104ba53ea3bb8d" protocol=ttrpc version=3 Sep 4 01:01:27.589261 systemd[1]: Started cri-containerd-845a69cc2c3463164a8e4fb439d99a512439c8073c6520de84c8370a71f8f4ab.scope - libcontainer container 845a69cc2c3463164a8e4fb439d99a512439c8073c6520de84c8370a71f8f4ab. Sep 4 01:01:27.676490 containerd[1911]: time="2025-09-04T01:01:27.676426319Z" level=info msg="StartContainer for \"845a69cc2c3463164a8e4fb439d99a512439c8073c6520de84c8370a71f8f4ab\" returns successfully" Sep 4 01:01:28.298644 kubelet[3273]: I0904 01:01:28.298432 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6b6dc69f69-skh28" podStartSLOduration=50.029643617 podStartE2EDuration="1m7.298371245s" podCreationTimestamp="2025-09-04 01:00:21 +0000 UTC" firstStartedPulling="2025-09-04 01:01:10.288884122 +0000 UTC m=+65.382758021" lastFinishedPulling="2025-09-04 01:01:27.557611749 +0000 UTC m=+82.651485649" observedRunningTime="2025-09-04 01:01:28.297390308 +0000 UTC m=+83.391264296" watchObservedRunningTime="2025-09-04 01:01:28.298371245 +0000 UTC m=+83.392245201" Sep 4 01:01:29.635446 containerd[1911]: time="2025-09-04T01:01:29.635391291Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:29.635697 containerd[1911]: time="2025-09-04T01:01:29.635601266Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 4 01:01:29.635996 containerd[1911]: time="2025-09-04T01:01:29.635955749Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:29.636695 containerd[1911]: time="2025-09-04T01:01:29.636659908Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 01:01:29.637057 containerd[1911]: time="2025-09-04T01:01:29.637012075Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.07931676s" Sep 4 01:01:29.637057 containerd[1911]: time="2025-09-04T01:01:29.637029010Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 4 01:01:29.638397 containerd[1911]: time="2025-09-04T01:01:29.638385913Z" level=info msg="CreateContainer within sandbox \"113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 4 01:01:29.641587 containerd[1911]: time="2025-09-04T01:01:29.641545227Z" level=info msg="Container 71fa559c8b06e730583fe0963376411b1fa780aaf54d25cdd7c16e84f355bca6: CDI devices from CRI Config.CDIDevices: []" Sep 4 01:01:29.645031 containerd[1911]: time="2025-09-04T01:01:29.644989593Z" level=info msg="CreateContainer within sandbox \"113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"71fa559c8b06e730583fe0963376411b1fa780aaf54d25cdd7c16e84f355bca6\"" Sep 4 01:01:29.645241 containerd[1911]: time="2025-09-04T01:01:29.645202722Z" level=info msg="StartContainer for \"71fa559c8b06e730583fe0963376411b1fa780aaf54d25cdd7c16e84f355bca6\"" Sep 4 01:01:29.645951 containerd[1911]: time="2025-09-04T01:01:29.645937756Z" level=info msg="connecting to shim 71fa559c8b06e730583fe0963376411b1fa780aaf54d25cdd7c16e84f355bca6" address="unix:///run/containerd/s/57ba19fcbcaa2d4b65066ea4025a98209e34cc4ad10107a9f19f80accf0cee4e" protocol=ttrpc version=3 Sep 4 01:01:29.666882 systemd[1]: Started cri-containerd-71fa559c8b06e730583fe0963376411b1fa780aaf54d25cdd7c16e84f355bca6.scope - libcontainer container 71fa559c8b06e730583fe0963376411b1fa780aaf54d25cdd7c16e84f355bca6. 
Sep 4 01:01:29.685409 containerd[1911]: time="2025-09-04T01:01:29.685358203Z" level=info msg="StartContainer for \"71fa559c8b06e730583fe0963376411b1fa780aaf54d25cdd7c16e84f355bca6\" returns successfully" Sep 4 01:01:30.050136 kubelet[3273]: I0904 01:01:30.050059 3273 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 4 01:01:30.051085 kubelet[3273]: I0904 01:01:30.050169 3273 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 4 01:01:36.280414 containerd[1911]: time="2025-09-04T01:01:36.280389714Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"b17876515dc96806214d15e15ca4fd9f05b10d116bf5e65062eae6e2a45d2d53\" pid:6539 exited_at:{seconds:1756947696 nanos:280174601}" Sep 4 01:01:36.290314 kubelet[3273]: I0904 01:01:36.290282 3273 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-89qxc" podStartSLOduration=53.937251363 podStartE2EDuration="1m13.290272823s" podCreationTimestamp="2025-09-04 01:00:23 +0000 UTC" firstStartedPulling="2025-09-04 01:01:10.284324803 +0000 UTC m=+65.378198702" lastFinishedPulling="2025-09-04 01:01:29.637346261 +0000 UTC m=+84.731220162" observedRunningTime="2025-09-04 01:01:30.318530243 +0000 UTC m=+85.412404237" watchObservedRunningTime="2025-09-04 01:01:36.290272823 +0000 UTC m=+91.384146720" Sep 4 01:01:44.259090 containerd[1911]: time="2025-09-04T01:01:44.259065103Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"1b60bafdb7485d7a2d3f509525c6d680959bf355b5742e1e80e5c89bcfd48b2b\" pid:6577 exited_at:{seconds:1756947704 nanos:258960688}" Sep 4 01:01:50.589340 containerd[1911]: time="2025-09-04T01:01:50.589305850Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"759e1a5629feb1e6fc64165802167ca617952d1888ab5cad9b953aba6c48369a\" pid:6606 exited_at:{seconds:1756947710 nanos:589098538}" Sep 4 01:01:57.322051 containerd[1911]: time="2025-09-04T01:01:57.322001758Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"da3e1eedbb39df868e6b41d20c9612027bb301085f5de5dfec77f27736ffbf22\" pid:6628 exited_at:{seconds:1756947717 nanos:321804991}" Sep 4 01:02:06.266330 containerd[1911]: time="2025-09-04T01:02:06.266304960Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"5463259f69c9b3e7db694471943c2ed0df34932c16f34c246fe75e730737f4d2\" pid:6668 exited_at:{seconds:1756947726 nanos:266114211}" Sep 4 01:02:11.187926 containerd[1911]: time="2025-09-04T01:02:11.187900907Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"bf412e1b470fd4f539f8d40059ee502a021bf4f9eb6bd859592ce08569b57dcb\" pid:6702 exited_at:{seconds:1756947731 nanos:187732791}" Sep 4 01:02:14.309893 containerd[1911]: time="2025-09-04T01:02:14.309865830Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" 
id:\"45bd948170870f564fe3b9e09e9c83593e620256905bae31e2376390a5c1bd9e\" pid:6736 exited_at:{seconds:1756947734 nanos:309718942}" Sep 4 01:02:27.343877 containerd[1911]: time="2025-09-04T01:02:27.343809615Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"8cfa7374cb77f8e895fc4c6e633fefd82b366a3c2f7cbeb92e703cd86f6511a8\" pid:6763 exited_at:{seconds:1756947747 nanos:343597222}" Sep 4 01:02:36.274125 containerd[1911]: time="2025-09-04T01:02:36.274067609Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"bf7e8cdb3105dd89a1e61c11b12004729e1ca13fcb776b224686a883267e2137\" pid:6799 exited_at:{seconds:1756947756 nanos:273854992}" Sep 4 01:02:44.334759 containerd[1911]: time="2025-09-04T01:02:44.334717231Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"5fb7a8f69e415c6aaee2dc1e56951ef36137e25114a2d55b956cfb9307bf126b\" pid:6859 exited_at:{seconds:1756947764 nanos:334559547}" Sep 4 01:02:50.575454 containerd[1911]: time="2025-09-04T01:02:50.575427119Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"711cb9e9cb18a052e6903263986f1bb4a52360fa4ee26a47fc708398787cf724\" pid:6882 exited_at:{seconds:1756947770 nanos:575305443}" Sep 4 01:02:57.345431 containerd[1911]: time="2025-09-04T01:02:57.345405495Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"8f11a5ab1ff2fa373b9e5a53fcc9bf3cf8f75e91a2e1dec8f3acc60263409df7\" pid:6903 exited_at:{seconds:1756947777 nanos:345208335}" Sep 4 01:03:06.298053 containerd[1911]: time="2025-09-04T01:03:06.297987747Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"947d9f6cd2f65936e1232647730b26f30868ed93fc07c49aae80ecb7de5fb344\" pid:6940 exited_at:{seconds:1756947786 nanos:297747256}" Sep 4 01:03:11.192136 containerd[1911]: time="2025-09-04T01:03:11.192071954Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"bf0e0d6072b87d2c7086a1794b5b715068b62c5df76eb5920e46923a2afcd77a\" pid:6973 exited_at:{seconds:1756947791 nanos:191840747}" Sep 4 01:03:14.275517 containerd[1911]: time="2025-09-04T01:03:14.275489217Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"0d9576f7a6ca25c14d795a80b3bf0ab3a24924b9fa522956868d89ef2e1a902d\" pid:7010 exited_at:{seconds:1756947794 nanos:275342977}" Sep 4 01:03:27.342855 containerd[1911]: time="2025-09-04T01:03:27.342801466Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"6072ba3984ea997a8e3fe823732aa924093dedc30a1f6689aa1cc18a54b2a4fd\" pid:7032 exited_at:{seconds:1756947807 nanos:342621176}" Sep 4 01:03:36.240189 containerd[1911]: time="2025-09-04T01:03:36.240134035Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"6b0305994f0bf13e351ec391465f37f50a99bc30d9fc97e174eb1e6f65c8f191\" pid:7065 exited_at:{seconds:1756947816 nanos:239920086}" Sep 4 
01:03:44.269957 containerd[1911]: time="2025-09-04T01:03:44.269933095Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"55ef0cc8fd2ec75da8a24a0bfd51870e14c537daa2b456ae3fa296df9d7c1e82\" pid:7102 exited_at:{seconds:1756947824 nanos:269804734}" Sep 4 01:03:50.625190 containerd[1911]: time="2025-09-04T01:03:50.625160667Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"1fed05fa365d4e97957198be142cfdc16f9cd93573ccda36e0818e9d865b22dd\" pid:7133 exited_at:{seconds:1756947830 nanos:625037940}" Sep 4 01:03:57.378181 containerd[1911]: time="2025-09-04T01:03:57.378114016Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"c332d8eb3fe5a8586803a262ff6aeacc436d37c3a286c90d7e6b97294c15c697\" pid:7154 exited_at:{seconds:1756947837 nanos:377896838}" Sep 4 01:04:04.953235 update_engine[1905]: I20250904 01:04:04.952996 1905 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Sep 4 01:04:04.953235 update_engine[1905]: I20250904 01:04:04.953091 1905 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Sep 4 01:04:04.954276 update_engine[1905]: I20250904 01:04:04.953549 1905 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Sep 4 01:04:04.954792 update_engine[1905]: I20250904 01:04:04.954679 1905 omaha_request_params.cc:62] Current group set to beta Sep 4 01:04:04.955027 update_engine[1905]: I20250904 01:04:04.954980 1905 update_attempter.cc:499] Already updated boot flags. Skipping. Sep 4 01:04:04.955027 update_engine[1905]: I20250904 01:04:04.955015 1905 update_attempter.cc:643] Scheduling an action processor start. Sep 4 01:04:04.955221 update_engine[1905]: I20250904 01:04:04.955055 1905 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 4 01:04:04.955221 update_engine[1905]: I20250904 01:04:04.955143 1905 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 4 01:04:04.955385 update_engine[1905]: I20250904 01:04:04.955300 1905 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 4 01:04:04.955385 update_engine[1905]: I20250904 01:04:04.955330 1905 omaha_request_action.cc:272] Request: Sep 4 01:04:04.955385 update_engine[1905]: Sep 4 01:04:04.955385 update_engine[1905]: Sep 4 01:04:04.955385 update_engine[1905]: Sep 4 01:04:04.955385 update_engine[1905]: Sep 4 01:04:04.955385 update_engine[1905]: Sep 4 01:04:04.955385 update_engine[1905]: Sep 4 01:04:04.955385 update_engine[1905]: Sep 4 01:04:04.955385 update_engine[1905]: Sep 4 01:04:04.955385 update_engine[1905]: I20250904 01:04:04.955349 1905 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 4 01:04:04.956394 locksmithd[1957]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 4 01:04:04.957981 update_engine[1905]: I20250904 01:04:04.957942 1905 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 4 01:04:04.958182 update_engine[1905]: I20250904 01:04:04.958141 1905 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 4 01:04:04.958573 update_engine[1905]: E20250904 01:04:04.958531 1905 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 4 01:04:04.958573 update_engine[1905]: I20250904 01:04:04.958566 1905 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 4 01:04:06.232791 containerd[1911]: time="2025-09-04T01:04:06.232769102Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"e1526463d36d051bb026143a36f535d6566913bf8334d1d2dc17f6911166bb7b\" pid:7188 exited_at:{seconds:1756947846 nanos:232582093}" Sep 4 01:04:11.195640 containerd[1911]: time="2025-09-04T01:04:11.195608371Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"5d41c7ac10b713e25a2623817696399c256fdbfe809b220b8fb7ebf32a18b4ad\" pid:7240 exited_at:{seconds:1756947851 nanos:195373990}" Sep 4 01:04:14.277536 containerd[1911]: time="2025-09-04T01:04:14.277507320Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"8af38133ca12c47e6aaacb8a6d1ba7c90483020d159485fc9d3ef0573887e240\" pid:7283 exited_at:{seconds:1756947854 nanos:277372898}" Sep 4 01:04:14.862950 update_engine[1905]: I20250904 01:04:14.862793 1905 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 4 01:04:14.863843 update_engine[1905]: I20250904 01:04:14.863318 1905 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 4 01:04:14.864074 update_engine[1905]: I20250904 01:04:14.863965 1905 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 4 01:04:14.864498 update_engine[1905]: E20250904 01:04:14.864405 1905 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 4 01:04:14.864632 update_engine[1905]: I20250904 01:04:14.864578 1905 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 4 01:04:24.863051 update_engine[1905]: I20250904 01:04:24.862885 1905 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 4 01:04:24.864072 update_engine[1905]: I20250904 01:04:24.863420 1905 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 4 01:04:24.864191 update_engine[1905]: I20250904 01:04:24.864151 1905 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 4 01:04:24.864548 update_engine[1905]: E20250904 01:04:24.864444 1905 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 4 01:04:24.864726 update_engine[1905]: I20250904 01:04:24.864552 1905 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Sep 4 01:04:27.334929 containerd[1911]: time="2025-09-04T01:04:27.334902081Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"dc1465a646e192cb6a86264d43538c18a28311c44dffe0a3c80db71eefdcf5c3\" pid:7306 exited_at:{seconds:1756947867 nanos:334710840}" Sep 4 01:04:34.861965 update_engine[1905]: I20250904 01:04:34.861804 1905 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 4 01:04:34.863647 update_engine[1905]: I20250904 01:04:34.862360 1905 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 4 01:04:34.863647 update_engine[1905]: I20250904 01:04:34.863023 1905 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 4 01:04:34.863647 update_engine[1905]: E20250904 01:04:34.863472 1905 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 4 01:04:34.863647 update_engine[1905]: I20250904 01:04:34.863545 1905 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 4 01:04:34.863647 update_engine[1905]: I20250904 01:04:34.863555 1905 omaha_request_action.cc:617] Omaha request response: Sep 4 01:04:34.863647 update_engine[1905]: E20250904 01:04:34.863627 1905 omaha_request_action.cc:636] Omaha request network transfer failed. Sep 4 01:04:34.863647 update_engine[1905]: I20250904 01:04:34.863647 1905 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Sep 4 01:04:34.863925 update_engine[1905]: I20250904 01:04:34.863654 1905 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 4 01:04:34.863925 update_engine[1905]: I20250904 01:04:34.863660 1905 update_attempter.cc:306] Processing Done. Sep 4 01:04:34.863925 update_engine[1905]: E20250904 01:04:34.863673 1905 update_attempter.cc:619] Update failed. Sep 4 01:04:34.863925 update_engine[1905]: I20250904 01:04:34.863680 1905 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 4 01:04:34.863925 update_engine[1905]: I20250904 01:04:34.863685 1905 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 4 01:04:34.863925 update_engine[1905]: I20250904 01:04:34.863691 1905 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Sep 4 01:04:34.863925 update_engine[1905]: I20250904 01:04:34.863772 1905 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 4 01:04:34.863925 update_engine[1905]: I20250904 01:04:34.863801 1905 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 4 01:04:34.863925 update_engine[1905]: I20250904 01:04:34.863807 1905 omaha_request_action.cc:272] Request: Sep 4 01:04:34.863925 update_engine[1905]: Sep 4 01:04:34.863925 update_engine[1905]: Sep 4 01:04:34.863925 update_engine[1905]: Sep 4 01:04:34.863925 update_engine[1905]: Sep 4 01:04:34.863925 update_engine[1905]: Sep 4 01:04:34.863925 update_engine[1905]: Sep 4 01:04:34.863925 update_engine[1905]: I20250904 01:04:34.863814 1905 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 4 01:04:34.864449 update_engine[1905]: I20250904 01:04:34.863973 1905 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 4 01:04:34.864449 update_engine[1905]: I20250904 01:04:34.864184 1905 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 4 01:04:34.864518 locksmithd[1957]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 4 01:04:34.864787 update_engine[1905]: E20250904 01:04:34.864748 1905 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 4 01:04:34.864833 update_engine[1905]: I20250904 01:04:34.864806 1905 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 4 01:04:34.864833 update_engine[1905]: I20250904 01:04:34.864816 1905 omaha_request_action.cc:617] Omaha request response: Sep 4 01:04:34.864833 update_engine[1905]: I20250904 01:04:34.864823 1905 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 4 01:04:34.864833 update_engine[1905]: I20250904 01:04:34.864828 1905 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 4 01:04:34.864957 update_engine[1905]: I20250904 01:04:34.864833 1905 update_attempter.cc:306] Processing Done. Sep 4 01:04:34.864957 update_engine[1905]: I20250904 01:04:34.864840 1905 update_attempter.cc:310] Error event sent. Sep 4 01:04:34.864957 update_engine[1905]: I20250904 01:04:34.864849 1905 update_check_scheduler.cc:74] Next update check in 42m1s Sep 4 01:04:34.865202 locksmithd[1957]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 4 01:04:36.290586 containerd[1911]: time="2025-09-04T01:04:36.290544310Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"a74321a331357b8e5be02b0d3a7fe7b7005c76d60e437c4244b1c34dc37e9e23\" pid:7341 exited_at:{seconds:1756947876 nanos:290255830}" Sep 4 01:04:44.276221 containerd[1911]: time="2025-09-04T01:04:44.276189731Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"a7220a8586f6c0071fa9294b42884ec282f9dbb5404e3e1ef3931b8c80e06d89\" pid:7381 exited_at:{seconds:1756947884 nanos:276021591}" Sep 4 01:04:50.578940 containerd[1911]: time="2025-09-04T01:04:50.578916317Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"9a65dcf81335841aa0c9da48d46aa3f05dc98112916b6c7b5f0cea3a133211bc\" pid:7404 exited_at:{seconds:1756947890 nanos:578818198}" Sep 4 01:04:57.338248 containerd[1911]: time="2025-09-04T01:04:57.338218498Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"519b7342d0e83f3061d08b07c6023bdbaff60d1ddeadd77e936243ede1b51377\" pid:7426 exited_at:{seconds:1756947897 nanos:338045668}" Sep 4 01:05:01.360800 containerd[1911]: time="2025-09-04T01:05:01.360635122Z" level=warning msg="container event discarded" container=977a46ab8ae3105cece44a30ce6acf50f038779ca82f9ba03bc3355163044893 type=CONTAINER_CREATED_EVENT Sep 4 01:05:01.361696 containerd[1911]: time="2025-09-04T01:05:01.360799155Z" level=warning msg="container event discarded" container=977a46ab8ae3105cece44a30ce6acf50f038779ca82f9ba03bc3355163044893 type=CONTAINER_STARTED_EVENT Sep 4 01:05:01.372313 containerd[1911]: time="2025-09-04T01:05:01.372220048Z" level=warning msg="container event discarded" container=bcad22f8efc78abf97ef2d214321b0f03d544b810bf9ac7810cea2d2cfa1d9ad type=CONTAINER_CREATED_EVENT Sep 4 01:05:01.372313 containerd[1911]: 
time="2025-09-04T01:05:01.372301029Z" level=warning msg="container event discarded" container=bcad22f8efc78abf97ef2d214321b0f03d544b810bf9ac7810cea2d2cfa1d9ad type=CONTAINER_STARTED_EVENT Sep 4 01:05:01.372637 containerd[1911]: time="2025-09-04T01:05:01.372331518Z" level=warning msg="container event discarded" container=124c9a4b4313f2977a1a14671bb041828811dd2b629597e4143081872ffabd07 type=CONTAINER_CREATED_EVENT Sep 4 01:05:01.372637 containerd[1911]: time="2025-09-04T01:05:01.372354389Z" level=warning msg="container event discarded" container=124c9a4b4313f2977a1a14671bb041828811dd2b629597e4143081872ffabd07 type=CONTAINER_STARTED_EVENT Sep 4 01:05:01.372637 containerd[1911]: time="2025-09-04T01:05:01.372377261Z" level=warning msg="container event discarded" container=6e976a964566dbeaa39462aafa8dd86d662d6958da82dd8f7a06b8522eb8e261 type=CONTAINER_CREATED_EVENT Sep 4 01:05:01.372637 containerd[1911]: time="2025-09-04T01:05:01.372400994Z" level=warning msg="container event discarded" container=a7af231f031b06da2c72c3751ac9a557c7b060d344d44ef37f6a68d1fb7211ef type=CONTAINER_CREATED_EVENT Sep 4 01:05:01.383928 containerd[1911]: time="2025-09-04T01:05:01.383816208Z" level=warning msg="container event discarded" container=2a1ba4ad3f670aafe26812d83f95f759f1836a8c53b161b437de22079be6a003 type=CONTAINER_CREATED_EVENT Sep 4 01:05:01.434247 containerd[1911]: time="2025-09-04T01:05:01.434095276Z" level=warning msg="container event discarded" container=6e976a964566dbeaa39462aafa8dd86d662d6958da82dd8f7a06b8522eb8e261 type=CONTAINER_STARTED_EVENT Sep 4 01:05:01.434247 containerd[1911]: time="2025-09-04T01:05:01.434182798Z" level=warning msg="container event discarded" container=2a1ba4ad3f670aafe26812d83f95f759f1836a8c53b161b437de22079be6a003 type=CONTAINER_STARTED_EVENT Sep 4 01:05:01.434247 containerd[1911]: time="2025-09-04T01:05:01.434216640Z" level=warning msg="container event discarded" container=a7af231f031b06da2c72c3751ac9a557c7b060d344d44ef37f6a68d1fb7211ef type=CONTAINER_STARTED_EVENT Sep 4 01:05:06.275389 containerd[1911]: time="2025-09-04T01:05:06.275361024Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"4470098eb014345cae306fd176c1c51e7c3a1463e12a13158b0b87768d0df328\" pid:7459 exited_at:{seconds:1756947906 nanos:275123599}" Sep 4 01:05:11.185375 containerd[1911]: time="2025-09-04T01:05:11.185345070Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"2e3b3133bb0e6b6f5a93021df782f13a42370aa54cce67cfd37992aedee8d18e\" pid:7497 exited_at:{seconds:1756947911 nanos:185163451}" Sep 4 01:05:12.481747 containerd[1911]: time="2025-09-04T01:05:12.481509958Z" level=warning msg="container event discarded" container=e2303f27b7d013f81451486182e67b845e80c83e6cbfce2f91600062893afd99 type=CONTAINER_CREATED_EVENT Sep 4 01:05:12.481747 containerd[1911]: time="2025-09-04T01:05:12.481689564Z" level=warning msg="container event discarded" container=e2303f27b7d013f81451486182e67b845e80c83e6cbfce2f91600062893afd99 type=CONTAINER_STARTED_EVENT Sep 4 01:05:12.481747 containerd[1911]: time="2025-09-04T01:05:12.481726197Z" level=warning msg="container event discarded" container=fa89991706e2e3607c51ca350a31c41dd91eff2931ab19d3c1bdeed73c847bdd type=CONTAINER_CREATED_EVENT Sep 4 01:05:12.533346 containerd[1911]: time="2025-09-04T01:05:12.533182292Z" level=warning msg="container event discarded" 
container=fa89991706e2e3607c51ca350a31c41dd91eff2931ab19d3c1bdeed73c847bdd type=CONTAINER_STARTED_EVENT Sep 4 01:05:12.805609 containerd[1911]: time="2025-09-04T01:05:12.805358595Z" level=warning msg="container event discarded" container=65a4845e07f69cced882400b8dc26dd99450f6daa6b4feb6019586c0ccb8b26f type=CONTAINER_CREATED_EVENT Sep 4 01:05:12.805609 containerd[1911]: time="2025-09-04T01:05:12.805441163Z" level=warning msg="container event discarded" container=65a4845e07f69cced882400b8dc26dd99450f6daa6b4feb6019586c0ccb8b26f type=CONTAINER_STARTED_EVENT Sep 4 01:05:14.316269 containerd[1911]: time="2025-09-04T01:05:14.316235443Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"fa7016abb9d9f16f1b9d1159290f27b2025ade4de9cbf08dee34cd4ae1d904fc\" pid:7531 exited_at:{seconds:1756947914 nanos:316104240}" Sep 4 01:05:15.212016 containerd[1911]: time="2025-09-04T01:05:15.211915619Z" level=warning msg="container event discarded" container=ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a type=CONTAINER_CREATED_EVENT Sep 4 01:05:15.290600 containerd[1911]: time="2025-09-04T01:05:15.290480562Z" level=warning msg="container event discarded" container=ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a type=CONTAINER_STARTED_EVENT Sep 4 01:05:17.513908 containerd[1911]: time="2025-09-04T01:05:17.513792678Z" level=warning msg="container event discarded" container=ac3863c29056a777348e0a99a3b6c6562fb017cb4d78d982ea3fb601b036c25a type=CONTAINER_STOPPED_EVENT Sep 4 01:05:18.050797 containerd[1911]: time="2025-09-04T01:05:18.050642408Z" level=warning msg="container event discarded" container=dcf607762aed8f64bab027be0004f4df972c9913f8a95d8b3de731158b594670 type=CONTAINER_CREATED_EVENT Sep 4 01:05:18.086158 containerd[1911]: time="2025-09-04T01:05:18.085994389Z" level=warning msg="container event discarded" container=dcf607762aed8f64bab027be0004f4df972c9913f8a95d8b3de731158b594670 type=CONTAINER_STARTED_EVENT Sep 4 01:05:23.567185 containerd[1911]: time="2025-09-04T01:05:23.566977332Z" level=warning msg="container event discarded" container=e09cbb07f8bd99c1bb5f5c370a0c0508e081f905d040b135da933f4c3d0a3516 type=CONTAINER_CREATED_EVENT Sep 4 01:05:23.567185 containerd[1911]: time="2025-09-04T01:05:23.567124321Z" level=warning msg="container event discarded" container=e09cbb07f8bd99c1bb5f5c370a0c0508e081f905d040b135da933f4c3d0a3516 type=CONTAINER_STARTED_EVENT Sep 4 01:05:23.935024 containerd[1911]: time="2025-09-04T01:05:23.934711957Z" level=warning msg="container event discarded" container=79c3a666fd74a4976859e26bdbcf799786db5d95440f7549e8e567f849176563 type=CONTAINER_CREATED_EVENT Sep 4 01:05:23.935024 containerd[1911]: time="2025-09-04T01:05:23.934859516Z" level=warning msg="container event discarded" container=79c3a666fd74a4976859e26bdbcf799786db5d95440f7549e8e567f849176563 type=CONTAINER_STARTED_EVENT Sep 4 01:05:27.378249 containerd[1911]: time="2025-09-04T01:05:27.378185068Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"bb79aab8ac22b531bddf4f3d65378146588b0630343143112a0dfabb074de47b\" pid:7552 exited_at:{seconds:1756947927 nanos:378005858}" Sep 4 01:05:36.249158 containerd[1911]: time="2025-09-04T01:05:36.249098581Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" 
id:\"f7e6fad275552b732b61d053e9aa5427f19f2b7fb71b98d4622945e7ad3551ce\" pid:7586 exited_at:{seconds:1756947936 nanos:248871660}" Sep 4 01:05:36.292563 containerd[1911]: time="2025-09-04T01:05:36.292452594Z" level=warning msg="container event discarded" container=7b4d202c262996b462e948d30988874d619e6062b1727886c90027dadfbf5b72 type=CONTAINER_CREATED_EVENT Sep 4 01:05:36.346213 containerd[1911]: time="2025-09-04T01:05:36.346059793Z" level=warning msg="container event discarded" container=7b4d202c262996b462e948d30988874d619e6062b1727886c90027dadfbf5b72 type=CONTAINER_STARTED_EVENT Sep 4 01:05:44.319685 containerd[1911]: time="2025-09-04T01:05:44.319658224Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"a46c1a09f4894510e3f9c5f372474aa9722c515bf20c62a3bc168cff13ccdfd1\" pid:7623 exited_at:{seconds:1756947944 nanos:319517641}" Sep 4 01:05:45.440330 containerd[1911]: time="2025-09-04T01:05:45.440156662Z" level=warning msg="container event discarded" container=421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314 type=CONTAINER_CREATED_EVENT Sep 4 01:05:45.519872 containerd[1911]: time="2025-09-04T01:05:45.519747939Z" level=warning msg="container event discarded" container=421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314 type=CONTAINER_STARTED_EVENT Sep 4 01:05:45.761378 containerd[1911]: time="2025-09-04T01:05:45.761215216Z" level=warning msg="container event discarded" container=421321786f5c895eb8410473cc36a589b3ca65c49644028c0df5981da5bd8314 type=CONTAINER_STOPPED_EVENT Sep 4 01:05:50.614860 containerd[1911]: time="2025-09-04T01:05:50.614830898Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"8e171b768cb351f8f3e5150e9432f046279a05b007a1c05282929f2ec2f85909\" pid:7670 exited_at:{seconds:1756947950 nanos:614692203}" Sep 4 01:05:53.734126 containerd[1911]: time="2025-09-04T01:05:53.733943982Z" level=warning msg="container event discarded" container=66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989 type=CONTAINER_CREATED_EVENT Sep 4 01:05:53.779596 containerd[1911]: time="2025-09-04T01:05:53.779470418Z" level=warning msg="container event discarded" container=66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989 type=CONTAINER_STARTED_EVENT Sep 4 01:05:54.723119 containerd[1911]: time="2025-09-04T01:05:54.722982724Z" level=warning msg="container event discarded" container=66e4fd0bde46a00811ee417728f3c1aec5fcc605a9655eeabd8e23a7b7880989 type=CONTAINER_STOPPED_EVENT Sep 4 01:05:57.333922 containerd[1911]: time="2025-09-04T01:05:57.333868704Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"c6bd98d44eca80451b9755bfc164779dd8ca0fdb20218ec9e4b69769626d318d\" pid:7691 exited_at:{seconds:1756947957 nanos:333679118}" Sep 4 01:06:03.309607 containerd[1911]: time="2025-09-04T01:06:03.309494554Z" level=warning msg="container event discarded" container=7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7 type=CONTAINER_CREATED_EVENT Sep 4 01:06:03.389245 containerd[1911]: time="2025-09-04T01:06:03.389143244Z" level=warning msg="container event discarded" container=7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7 type=CONTAINER_STARTED_EVENT Sep 4 01:06:04.674815 containerd[1911]: time="2025-09-04T01:06:04.674645247Z" level=warning msg="container event 
discarded" container=7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc type=CONTAINER_CREATED_EVENT Sep 4 01:06:04.674815 containerd[1911]: time="2025-09-04T01:06:04.674784222Z" level=warning msg="container event discarded" container=7e7f8b5814b54268aab745943b89ca2077dd44678c6b8df3b5c57cf654527dfc type=CONTAINER_STARTED_EVENT Sep 4 01:06:06.280645 containerd[1911]: time="2025-09-04T01:06:06.280607573Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"bcb0843392648b3720e1480709a9a61d354dac5a507614b1daf29c21d9fc9cf1\" pid:7723 exited_at:{seconds:1756947966 nanos:280205572}" Sep 4 01:06:07.124870 containerd[1911]: time="2025-09-04T01:06:07.124659655Z" level=warning msg="container event discarded" container=6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918 type=CONTAINER_CREATED_EVENT Sep 4 01:06:07.124870 containerd[1911]: time="2025-09-04T01:06:07.124847910Z" level=warning msg="container event discarded" container=6c510568ca0db022decf37528f6de65eb5386b86a895109b782c98dc28029918 type=CONTAINER_STARTED_EVENT Sep 4 01:06:07.124870 containerd[1911]: time="2025-09-04T01:06:07.124878993Z" level=warning msg="container event discarded" container=9204af8cbaf9d04939df682fee4c007a3e343bebb13ec8df0799bd88d76d5506 type=CONTAINER_CREATED_EVENT Sep 4 01:06:07.172358 containerd[1911]: time="2025-09-04T01:06:07.172220986Z" level=warning msg="container event discarded" container=9204af8cbaf9d04939df682fee4c007a3e343bebb13ec8df0799bd88d76d5506 type=CONTAINER_STARTED_EVENT Sep 4 01:06:07.231891 containerd[1911]: time="2025-09-04T01:06:07.231722714Z" level=warning msg="container event discarded" container=139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd type=CONTAINER_CREATED_EVENT Sep 4 01:06:07.231891 containerd[1911]: time="2025-09-04T01:06:07.231826547Z" level=warning msg="container event discarded" container=139557c3519c11e430fb20e8526baab57bab19440ba9b88a73668d8f81ac41dd type=CONTAINER_STARTED_EVENT Sep 4 01:06:07.231891 containerd[1911]: time="2025-09-04T01:06:07.231860756Z" level=warning msg="container event discarded" container=c90f3eec98236cdcdf66d023616592b087692a62295787d80d96a22d1dce16b0 type=CONTAINER_CREATED_EVENT Sep 4 01:06:07.306518 containerd[1911]: time="2025-09-04T01:06:07.306338886Z" level=warning msg="container event discarded" container=c90f3eec98236cdcdf66d023616592b087692a62295787d80d96a22d1dce16b0 type=CONTAINER_STARTED_EVENT Sep 4 01:06:08.131209 containerd[1911]: time="2025-09-04T01:06:08.131047378Z" level=warning msg="container event discarded" container=34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00 type=CONTAINER_CREATED_EVENT Sep 4 01:06:08.131209 containerd[1911]: time="2025-09-04T01:06:08.131145852Z" level=warning msg="container event discarded" container=34cc4dd166a166f2336d1a721623b89c8a3a8bbf0ea0f491a1af903244b41c00 type=CONTAINER_STARTED_EVENT Sep 4 01:06:08.226948 containerd[1911]: time="2025-09-04T01:06:08.226790599Z" level=warning msg="container event discarded" container=4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0 type=CONTAINER_CREATED_EVENT Sep 4 01:06:08.226948 containerd[1911]: time="2025-09-04T01:06:08.226886581Z" level=warning msg="container event discarded" container=4d88d06ef644ebf187a2b16a21c981b7508877efddce640799c966b6b0660ec0 type=CONTAINER_STARTED_EVENT Sep 4 01:06:08.663213 containerd[1911]: time="2025-09-04T01:06:08.663047685Z" level=warning msg="container event discarded" 
container=5074d850a38049d0aaa69fa37f4fd1cf7342251ceebfc1103a75c57f8caa52c3 type=CONTAINER_CREATED_EVENT Sep 4 01:06:08.706670 containerd[1911]: time="2025-09-04T01:06:08.706524448Z" level=warning msg="container event discarded" container=5074d850a38049d0aaa69fa37f4fd1cf7342251ceebfc1103a75c57f8caa52c3 type=CONTAINER_STARTED_EVENT Sep 4 01:06:10.120737 containerd[1911]: time="2025-09-04T01:06:10.120588133Z" level=warning msg="container event discarded" container=eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e type=CONTAINER_CREATED_EVENT Sep 4 01:06:10.120737 containerd[1911]: time="2025-09-04T01:06:10.120676592Z" level=warning msg="container event discarded" container=eee599b7c86e7cf87fcaa23d5d28c472b7b150ac6edd816b6fd931f21a40689e type=CONTAINER_STARTED_EVENT Sep 4 01:06:10.294000 containerd[1911]: time="2025-09-04T01:06:10.293846409Z" level=warning msg="container event discarded" container=113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201 type=CONTAINER_CREATED_EVENT Sep 4 01:06:10.294000 containerd[1911]: time="2025-09-04T01:06:10.293935482Z" level=warning msg="container event discarded" container=113a7ab8f635c92be0b373a3535b9d01a52f73b47e91d63ae4731da48d822201 type=CONTAINER_STARTED_EVENT Sep 4 01:06:10.294000 containerd[1911]: time="2025-09-04T01:06:10.293961845Z" level=warning msg="container event discarded" container=4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181 type=CONTAINER_CREATED_EVENT Sep 4 01:06:10.294000 containerd[1911]: time="2025-09-04T01:06:10.293983683Z" level=warning msg="container event discarded" container=4d8b8c05ab3239de4aa5b8b1add330b4a4a630d6883375ead2a84c0f72506181 type=CONTAINER_STARTED_EVENT Sep 4 01:06:11.227025 containerd[1911]: time="2025-09-04T01:06:11.226982433Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"3bdeb402444c7899ec39f34cf0e8d686f930e1b0a7eb6d78e6cc52740a16e2d9\" pid:7759 exited_at:{seconds:1756947971 nanos:226811784}" Sep 4 01:06:13.548314 containerd[1911]: time="2025-09-04T01:06:13.548150040Z" level=warning msg="container event discarded" container=f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb type=CONTAINER_CREATED_EVENT Sep 4 01:06:13.593710 containerd[1911]: time="2025-09-04T01:06:13.593541297Z" level=warning msg="container event discarded" container=f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb type=CONTAINER_STARTED_EVENT Sep 4 01:06:14.326302 containerd[1911]: time="2025-09-04T01:06:14.326251827Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"0c1e62a0dcdaccfd315fbe173071c6eb3570543e6fe568de5bb55ff618ddb6eb\" pid:7791 exited_at:{seconds:1756947974 nanos:326138304}" Sep 4 01:06:19.060239 containerd[1911]: time="2025-09-04T01:06:19.060118700Z" level=warning msg="container event discarded" container=2754b2a6c524f5c576c0a1645cd6b318d8e4051c3746adc4656a7ae5d5d2fa09 type=CONTAINER_CREATED_EVENT Sep 4 01:06:19.107945 containerd[1911]: time="2025-09-04T01:06:19.107739372Z" level=warning msg="container event discarded" container=2754b2a6c524f5c576c0a1645cd6b318d8e4051c3746adc4656a7ae5d5d2fa09 type=CONTAINER_STARTED_EVENT Sep 4 01:06:23.044636 containerd[1911]: time="2025-09-04T01:06:23.044437578Z" level=warning msg="container event discarded" container=3d037b9d179a481ebd7198a1c651decf9f24760cf58c25cb9a9881d7fb3be544 type=CONTAINER_CREATED_EVENT Sep 4 01:06:23.097050 
containerd[1911]: time="2025-09-04T01:06:23.096892231Z" level=warning msg="container event discarded" container=3d037b9d179a481ebd7198a1c651decf9f24760cf58c25cb9a9881d7fb3be544 type=CONTAINER_STARTED_EVENT Sep 4 01:06:25.533730 containerd[1911]: time="2025-09-04T01:06:25.533556983Z" level=warning msg="container event discarded" container=12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c type=CONTAINER_CREATED_EVENT Sep 4 01:06:25.582169 containerd[1911]: time="2025-09-04T01:06:25.581985620Z" level=warning msg="container event discarded" container=12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c type=CONTAINER_STARTED_EVENT Sep 4 01:06:27.008864 containerd[1911]: time="2025-09-04T01:06:27.008839220Z" level=warning msg="container event discarded" container=609d52b757901a2415f83a4a69224c50b3da7a290446ad275f4637ff34335772 type=CONTAINER_CREATED_EVENT Sep 4 01:06:27.050249 containerd[1911]: time="2025-09-04T01:06:27.050114643Z" level=warning msg="container event discarded" container=609d52b757901a2415f83a4a69224c50b3da7a290446ad275f4637ff34335772 type=CONTAINER_STARTED_EVENT Sep 4 01:06:27.388226 containerd[1911]: time="2025-09-04T01:06:27.388156502Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"ce1e3b8d460a4b1c3f16022ca5b6f81b8bc133c607af11194356e059f70c05ca\" pid:7818 exited_at:{seconds:1756947987 nanos:387955744}" Sep 4 01:06:27.575691 containerd[1911]: time="2025-09-04T01:06:27.575541430Z" level=warning msg="container event discarded" container=845a69cc2c3463164a8e4fb439d99a512439c8073c6520de84c8370a71f8f4ab type=CONTAINER_CREATED_EVENT Sep 4 01:06:27.686557 containerd[1911]: time="2025-09-04T01:06:27.686262970Z" level=warning msg="container event discarded" container=845a69cc2c3463164a8e4fb439d99a512439c8073c6520de84c8370a71f8f4ab type=CONTAINER_STARTED_EVENT Sep 4 01:06:29.655721 containerd[1911]: time="2025-09-04T01:06:29.655618154Z" level=warning msg="container event discarded" container=71fa559c8b06e730583fe0963376411b1fa780aaf54d25cdd7c16e84f355bca6 type=CONTAINER_CREATED_EVENT Sep 4 01:06:29.695249 containerd[1911]: time="2025-09-04T01:06:29.695079541Z" level=warning msg="container event discarded" container=71fa559c8b06e730583fe0963376411b1fa780aaf54d25cdd7c16e84f355bca6 type=CONTAINER_STARTED_EVENT Sep 4 01:06:35.763341 systemd[1]: Started sshd@9-147.75.202.229:22-147.75.109.163:49704.service - OpenSSH per-connection server daemon (147.75.109.163:49704). Sep 4 01:06:35.862330 sshd[7846]: Accepted publickey for core from 147.75.109.163 port 49704 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 01:06:35.863582 sshd-session[7846]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 01:06:35.868196 systemd-logind[1900]: New session 12 of user core. Sep 4 01:06:35.882204 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 4 01:06:36.040968 sshd[7848]: Connection closed by 147.75.109.163 port 49704 Sep 4 01:06:36.041099 sshd-session[7846]: pam_unix(sshd:session): session closed for user core Sep 4 01:06:36.042877 systemd[1]: sshd@9-147.75.202.229:22-147.75.109.163:49704.service: Deactivated successfully. Sep 4 01:06:36.043867 systemd[1]: session-12.scope: Deactivated successfully. Sep 4 01:06:36.044622 systemd-logind[1900]: Session 12 logged out. Waiting for processes to exit. Sep 4 01:06:36.045262 systemd-logind[1900]: Removed session 12. 
Sep 4 01:06:36.281693 containerd[1911]: time="2025-09-04T01:06:36.281644553Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"cd5c07df900d176d9dbd70ca5a4ebc4d3de1201db9b0be268a32f40d34bdd9a5\" pid:7892 exited_at:{seconds:1756947996 nanos:281444273}"
Sep 4 01:06:41.056631 systemd[1]: Started sshd@10-147.75.202.229:22-147.75.109.163:53312.service - OpenSSH per-connection server daemon (147.75.109.163:53312).
Sep 4 01:06:41.088413 sshd[7917]: Accepted publickey for core from 147.75.109.163 port 53312 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:06:41.089154 sshd-session[7917]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:06:41.092061 systemd-logind[1900]: New session 13 of user core.
Sep 4 01:06:41.108916 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 4 01:06:41.226382 sshd[7919]: Connection closed by 147.75.109.163 port 53312
Sep 4 01:06:41.226557 sshd-session[7917]: pam_unix(sshd:session): session closed for user core
Sep 4 01:06:41.228501 systemd[1]: sshd@10-147.75.202.229:22-147.75.109.163:53312.service: Deactivated successfully.
Sep 4 01:06:41.229430 systemd[1]: session-13.scope: Deactivated successfully.
Sep 4 01:06:41.229927 systemd-logind[1900]: Session 13 logged out. Waiting for processes to exit.
Sep 4 01:06:41.230662 systemd-logind[1900]: Removed session 13.
Sep 4 01:06:44.325643 containerd[1911]: time="2025-09-04T01:06:44.325609989Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"fabc8d55e7f676d933e8f45212f043d4bf01487997785d5a0dbc0edaf17df1c9\" pid:7959 exited_at:{seconds:1756948004 nanos:325416191}"
Sep 4 01:06:46.255371 systemd[1]: Started sshd@11-147.75.202.229:22-147.75.109.163:53322.service - OpenSSH per-connection server daemon (147.75.109.163:53322).
Sep 4 01:06:46.338100 sshd[7970]: Accepted publickey for core from 147.75.109.163 port 53322 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:06:46.338987 sshd-session[7970]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:06:46.342525 systemd-logind[1900]: New session 14 of user core.
Sep 4 01:06:46.356213 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 4 01:06:46.494832 sshd[7972]: Connection closed by 147.75.109.163 port 53322
Sep 4 01:06:46.494990 sshd-session[7970]: pam_unix(sshd:session): session closed for user core
Sep 4 01:06:46.512958 systemd[1]: sshd@11-147.75.202.229:22-147.75.109.163:53322.service: Deactivated successfully.
Sep 4 01:06:46.513779 systemd[1]: session-14.scope: Deactivated successfully.
Sep 4 01:06:46.514314 systemd-logind[1900]: Session 14 logged out. Waiting for processes to exit.
Sep 4 01:06:46.515544 systemd[1]: Started sshd@12-147.75.202.229:22-147.75.109.163:53328.service - OpenSSH per-connection server daemon (147.75.109.163:53328).
Sep 4 01:06:46.515879 systemd-logind[1900]: Removed session 14.
Sep 4 01:06:46.547850 sshd[7998]: Accepted publickey for core from 147.75.109.163 port 53328 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:06:46.551231 sshd-session[7998]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:06:46.564117 systemd-logind[1900]: New session 15 of user core.
Sep 4 01:06:46.577118 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 4 01:06:46.746841 sshd[8001]: Connection closed by 147.75.109.163 port 53328
Sep 4 01:06:46.746978 sshd-session[7998]: pam_unix(sshd:session): session closed for user core
Sep 4 01:06:46.775132 systemd[1]: sshd@12-147.75.202.229:22-147.75.109.163:53328.service: Deactivated successfully.
Sep 4 01:06:46.780027 systemd[1]: session-15.scope: Deactivated successfully.
Sep 4 01:06:46.782345 systemd-logind[1900]: Session 15 logged out. Waiting for processes to exit.
Sep 4 01:06:46.789106 systemd[1]: Started sshd@13-147.75.202.229:22-147.75.109.163:53330.service - OpenSSH per-connection server daemon (147.75.109.163:53330).
Sep 4 01:06:46.791091 systemd-logind[1900]: Removed session 15.
Sep 4 01:06:46.873052 sshd[8024]: Accepted publickey for core from 147.75.109.163 port 53330 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:06:46.876429 sshd-session[8024]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:06:46.889165 systemd-logind[1900]: New session 16 of user core.
Sep 4 01:06:46.911026 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 4 01:06:47.055036 sshd[8027]: Connection closed by 147.75.109.163 port 53330
Sep 4 01:06:47.055169 sshd-session[8024]: pam_unix(sshd:session): session closed for user core
Sep 4 01:06:47.056979 systemd[1]: sshd@13-147.75.202.229:22-147.75.109.163:53330.service: Deactivated successfully.
Sep 4 01:06:47.058003 systemd[1]: session-16.scope: Deactivated successfully.
Sep 4 01:06:47.058735 systemd-logind[1900]: Session 16 logged out. Waiting for processes to exit.
Sep 4 01:06:47.059526 systemd-logind[1900]: Removed session 16.
Sep 4 01:06:50.572310 containerd[1911]: time="2025-09-04T01:06:50.572286347Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"b7bfce32d3f94c368f7694a594d396df1a49acc73abcb379af99ad5d9f8f606f\" pid:8063 exited_at:{seconds:1756948010 nanos:572197055}"
Sep 4 01:06:52.070119 systemd[1]: Started sshd@14-147.75.202.229:22-147.75.109.163:57150.service - OpenSSH per-connection server daemon (147.75.109.163:57150).
Sep 4 01:06:52.120995 sshd[8078]: Accepted publickey for core from 147.75.109.163 port 57150 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:06:52.124259 sshd-session[8078]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:06:52.132892 systemd-logind[1900]: New session 17 of user core.
Sep 4 01:06:52.140886 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 4 01:06:52.224683 sshd[8080]: Connection closed by 147.75.109.163 port 57150
Sep 4 01:06:52.224823 sshd-session[8078]: pam_unix(sshd:session): session closed for user core
Sep 4 01:06:52.226572 systemd[1]: sshd@14-147.75.202.229:22-147.75.109.163:57150.service: Deactivated successfully.
Sep 4 01:06:52.227527 systemd[1]: session-17.scope: Deactivated successfully.
Sep 4 01:06:52.228228 systemd-logind[1900]: Session 17 logged out. Waiting for processes to exit.
Sep 4 01:06:52.228733 systemd-logind[1900]: Removed session 17.
Sep 4 01:06:57.250845 systemd[1]: Started sshd@15-147.75.202.229:22-147.75.109.163:57154.service - OpenSSH per-connection server daemon (147.75.109.163:57154).
Sep 4 01:06:57.295115 sshd[8105]: Accepted publickey for core from 147.75.109.163 port 57154 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:06:57.295855 sshd-session[8105]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:06:57.298313 systemd-logind[1900]: New session 18 of user core.
Sep 4 01:06:57.298905 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 4 01:06:57.320663 containerd[1911]: time="2025-09-04T01:06:57.320639662Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"4c314eb72d18b284f87f76ba677735fc36e493c8d30ffdc9fd5926ad27f7d802\" pid:8119 exited_at:{seconds:1756948017 nanos:320459481}"
Sep 4 01:06:57.386636 sshd[8125]: Connection closed by 147.75.109.163 port 57154
Sep 4 01:06:57.386821 sshd-session[8105]: pam_unix(sshd:session): session closed for user core
Sep 4 01:06:57.389026 systemd[1]: sshd@15-147.75.202.229:22-147.75.109.163:57154.service: Deactivated successfully.
Sep 4 01:06:57.390030 systemd[1]: session-18.scope: Deactivated successfully.
Sep 4 01:06:57.390501 systemd-logind[1900]: Session 18 logged out. Waiting for processes to exit.
Sep 4 01:06:57.391275 systemd-logind[1900]: Removed session 18.
Sep 4 01:07:02.412729 systemd[1]: Started sshd@16-147.75.202.229:22-147.75.109.163:37190.service - OpenSSH per-connection server daemon (147.75.109.163:37190).
Sep 4 01:07:02.460809 sshd[8165]: Accepted publickey for core from 147.75.109.163 port 37190 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:07:02.461565 sshd-session[8165]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:07:02.464762 systemd-logind[1900]: New session 19 of user core.
Sep 4 01:07:02.483207 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 4 01:07:02.581817 sshd[8167]: Connection closed by 147.75.109.163 port 37190
Sep 4 01:07:02.581989 sshd-session[8165]: pam_unix(sshd:session): session closed for user core
Sep 4 01:07:02.583879 systemd[1]: sshd@16-147.75.202.229:22-147.75.109.163:37190.service: Deactivated successfully.
Sep 4 01:07:02.584923 systemd[1]: session-19.scope: Deactivated successfully.
Sep 4 01:07:02.585710 systemd-logind[1900]: Session 19 logged out. Waiting for processes to exit.
Sep 4 01:07:02.586510 systemd-logind[1900]: Removed session 19.
Sep 4 01:07:06.227276 containerd[1911]: time="2025-09-04T01:07:06.227250935Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7eff2257ac54fad8dc3ee3ababd5831fb6fe92201de86434207e6a3e852f3db7\" id:\"102ab6c839d49313676f07de62d27d00c08ee074023e4b97f60644ce62098263\" pid:8205 exited_at:{seconds:1756948026 nanos:227083472}"
Sep 4 01:07:07.597872 systemd[1]: Started sshd@17-147.75.202.229:22-147.75.109.163:37202.service - OpenSSH per-connection server daemon (147.75.109.163:37202).
Sep 4 01:07:07.652182 sshd[8230]: Accepted publickey for core from 147.75.109.163 port 37202 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:07:07.653462 sshd-session[8230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:07:07.658266 systemd-logind[1900]: New session 20 of user core.
Sep 4 01:07:07.669959 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 4 01:07:07.797233 sshd[8232]: Connection closed by 147.75.109.163 port 37202
Sep 4 01:07:07.797437 sshd-session[8230]: pam_unix(sshd:session): session closed for user core
Sep 4 01:07:07.814920 systemd[1]: sshd@17-147.75.202.229:22-147.75.109.163:37202.service: Deactivated successfully.
Sep 4 01:07:07.815779 systemd[1]: session-20.scope: Deactivated successfully.
Sep 4 01:07:07.816292 systemd-logind[1900]: Session 20 logged out. Waiting for processes to exit.
Sep 4 01:07:07.817421 systemd[1]: Started sshd@18-147.75.202.229:22-147.75.109.163:37214.service - OpenSSH per-connection server daemon (147.75.109.163:37214).
Sep 4 01:07:07.818176 systemd-logind[1900]: Removed session 20.
Sep 4 01:07:07.848914 sshd[8257]: Accepted publickey for core from 147.75.109.163 port 37214 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:07:07.849737 sshd-session[8257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:07:07.852703 systemd-logind[1900]: New session 21 of user core.
Sep 4 01:07:07.872238 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 4 01:07:08.036520 sshd[8261]: Connection closed by 147.75.109.163 port 37214
Sep 4 01:07:08.036663 sshd-session[8257]: pam_unix(sshd:session): session closed for user core
Sep 4 01:07:08.055718 systemd[1]: sshd@18-147.75.202.229:22-147.75.109.163:37214.service: Deactivated successfully.
Sep 4 01:07:08.056884 systemd[1]: session-21.scope: Deactivated successfully.
Sep 4 01:07:08.057448 systemd-logind[1900]: Session 21 logged out. Waiting for processes to exit.
Sep 4 01:07:08.059231 systemd[1]: Started sshd@19-147.75.202.229:22-147.75.109.163:37228.service - OpenSSH per-connection server daemon (147.75.109.163:37228).
Sep 4 01:07:08.059760 systemd-logind[1900]: Removed session 21.
Sep 4 01:07:08.102130 sshd[8281]: Accepted publickey for core from 147.75.109.163 port 37228 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:07:08.103233 sshd-session[8281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:07:08.107633 systemd-logind[1900]: New session 22 of user core.
Sep 4 01:07:08.116898 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 4 01:07:08.829534 sshd[8283]: Connection closed by 147.75.109.163 port 37228
Sep 4 01:07:08.829696 sshd-session[8281]: pam_unix(sshd:session): session closed for user core
Sep 4 01:07:08.840070 systemd[1]: sshd@19-147.75.202.229:22-147.75.109.163:37228.service: Deactivated successfully.
Sep 4 01:07:08.840969 systemd[1]: session-22.scope: Deactivated successfully.
Sep 4 01:07:08.841431 systemd-logind[1900]: Session 22 logged out. Waiting for processes to exit.
Sep 4 01:07:08.842683 systemd[1]: Started sshd@20-147.75.202.229:22-147.75.109.163:37240.service - OpenSSH per-connection server daemon (147.75.109.163:37240).
Sep 4 01:07:08.843026 systemd-logind[1900]: Removed session 22.
Sep 4 01:07:08.876764 sshd[8312]: Accepted publickey for core from 147.75.109.163 port 37240 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:07:08.878242 sshd-session[8312]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:07:08.882985 systemd-logind[1900]: New session 23 of user core.
Sep 4 01:07:08.897959 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 4 01:07:09.029168 sshd[8316]: Connection closed by 147.75.109.163 port 37240
Sep 4 01:07:09.029361 sshd-session[8312]: pam_unix(sshd:session): session closed for user core
Sep 4 01:07:09.044894 systemd[1]: sshd@20-147.75.202.229:22-147.75.109.163:37240.service: Deactivated successfully.
Sep 4 01:07:09.045779 systemd[1]: session-23.scope: Deactivated successfully.
Sep 4 01:07:09.046245 systemd-logind[1900]: Session 23 logged out. Waiting for processes to exit.
Sep 4 01:07:09.047482 systemd[1]: Started sshd@21-147.75.202.229:22-147.75.109.163:37254.service - OpenSSH per-connection server daemon (147.75.109.163:37254).
Sep 4 01:07:09.048068 systemd-logind[1900]: Removed session 23.
Sep 4 01:07:09.079422 sshd[8339]: Accepted publickey for core from 147.75.109.163 port 37254 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:07:09.080165 sshd-session[8339]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:07:09.082777 systemd-logind[1900]: New session 24 of user core.
Sep 4 01:07:09.096947 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 4 01:07:09.227972 sshd[8341]: Connection closed by 147.75.109.163 port 37254
Sep 4 01:07:09.228144 sshd-session[8339]: pam_unix(sshd:session): session closed for user core
Sep 4 01:07:09.229664 systemd[1]: sshd@21-147.75.202.229:22-147.75.109.163:37254.service: Deactivated successfully.
Sep 4 01:07:09.230622 systemd[1]: session-24.scope: Deactivated successfully.
Sep 4 01:07:09.231267 systemd-logind[1900]: Session 24 logged out. Waiting for processes to exit.
Sep 4 01:07:09.231758 systemd-logind[1900]: Removed session 24.
Sep 4 01:07:11.225760 containerd[1911]: time="2025-09-04T01:07:11.225726827Z" level=info msg="TaskExit event in podsandbox handler container_id:\"12534d187fb6085e7c4f5c9f177669d06db7d91bef2f3d9948fda2d4814b341c\" id:\"a29e59407414be7e48fc614793f82f358756f182cf516f45aa4369277a54c0c0\" pid:8376 exited_at:{seconds:1756948031 nanos:225534313}"
Sep 4 01:07:14.245174 systemd[1]: Started sshd@22-147.75.202.229:22-147.75.109.163:35920.service - OpenSSH per-connection server daemon (147.75.109.163:35920).
Sep 4 01:07:14.267705 containerd[1911]: time="2025-09-04T01:07:14.267652986Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0d452b31a36222b83d38f74cbe80a701ff14510544cdabc6ab228596b94daeb\" id:\"f72859ccc7d4cdc11ec2286bc488c04131fa85708aba6e3cfa193f81e486db19\" pid:8416 exited_at:{seconds:1756948034 nanos:267546713}"
Sep 4 01:07:14.284285 sshd[8422]: Accepted publickey for core from 147.75.109.163 port 35920 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:07:14.285161 sshd-session[8422]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:07:14.288471 systemd-logind[1900]: New session 25 of user core.
Sep 4 01:07:14.308917 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 4 01:07:14.450385 sshd[8428]: Connection closed by 147.75.109.163 port 35920
Sep 4 01:07:14.450595 sshd-session[8422]: pam_unix(sshd:session): session closed for user core
Sep 4 01:07:14.452894 systemd[1]: sshd@22-147.75.202.229:22-147.75.109.163:35920.service: Deactivated successfully.
Sep 4 01:07:14.454141 systemd[1]: session-25.scope: Deactivated successfully.
Sep 4 01:07:14.455114 systemd-logind[1900]: Session 25 logged out. Waiting for processes to exit.
Sep 4 01:07:14.455997 systemd-logind[1900]: Removed session 25.
Sep 4 01:07:19.468696 systemd[1]: Started sshd@23-147.75.202.229:22-147.75.109.163:35926.service - OpenSSH per-connection server daemon (147.75.109.163:35926).
Sep 4 01:07:19.499885 sshd[8460]: Accepted publickey for core from 147.75.109.163 port 35926 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:07:19.500567 sshd-session[8460]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:07:19.503508 systemd-logind[1900]: New session 26 of user core.
Sep 4 01:07:19.523051 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 4 01:07:19.652310 sshd[8462]: Connection closed by 147.75.109.163 port 35926
Sep 4 01:07:19.652492 sshd-session[8460]: pam_unix(sshd:session): session closed for user core
Sep 4 01:07:19.654170 systemd[1]: sshd@23-147.75.202.229:22-147.75.109.163:35926.service: Deactivated successfully.
Sep 4 01:07:19.655120 systemd[1]: session-26.scope: Deactivated successfully.
Sep 4 01:07:19.655745 systemd-logind[1900]: Session 26 logged out. Waiting for processes to exit.
Sep 4 01:07:19.656408 systemd-logind[1900]: Removed session 26.