Sep 4 00:52:28.913816 kernel: Linux version 6.12.44-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Wed Sep 3 22:05:39 -00 2025 Sep 4 00:52:28.913831 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=c7fa427551c105672074cbcbe7e23c997f471a6e879d708e8d6cbfad2147666e Sep 4 00:52:28.913839 kernel: BIOS-provided physical RAM map: Sep 4 00:52:28.913843 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable Sep 4 00:52:28.913847 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved Sep 4 00:52:28.913851 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved Sep 4 00:52:28.913856 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable Sep 4 00:52:28.913860 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved Sep 4 00:52:28.913864 kernel: BIOS-e820: [mem 0x0000000040400000-0x0000000081b17fff] usable Sep 4 00:52:28.913869 kernel: BIOS-e820: [mem 0x0000000081b18000-0x0000000081b18fff] ACPI NVS Sep 4 00:52:28.913874 kernel: BIOS-e820: [mem 0x0000000081b19000-0x0000000081b19fff] reserved Sep 4 00:52:28.913878 kernel: BIOS-e820: [mem 0x0000000081b1a000-0x000000008afc4fff] usable Sep 4 00:52:28.913882 kernel: BIOS-e820: [mem 0x000000008afc5000-0x000000008c0a9fff] reserved Sep 4 00:52:28.913886 kernel: BIOS-e820: [mem 0x000000008c0aa000-0x000000008c232fff] usable Sep 4 00:52:28.913892 kernel: BIOS-e820: [mem 0x000000008c233000-0x000000008c664fff] ACPI NVS Sep 4 00:52:28.913897 kernel: BIOS-e820: [mem 0x000000008c665000-0x000000008eefefff] reserved Sep 4 00:52:28.913902 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable Sep 4 00:52:28.913907 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved Sep 4 00:52:28.913912 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 4 00:52:28.913916 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved Sep 4 00:52:28.913921 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved Sep 4 00:52:28.913926 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Sep 4 00:52:28.913931 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved Sep 4 00:52:28.913935 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable Sep 4 00:52:28.913940 kernel: NX (Execute Disable) protection: active Sep 4 00:52:28.913946 kernel: APIC: Static calls initialized Sep 4 00:52:28.913950 kernel: SMBIOS 3.2.1 present. 
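The BIOS-e820 map above is the firmware's inventory of physical memory. As a rough illustration (not part of the log), a short Python sketch that tallies those ranges from a saved copy of this console output; "boot.log" is a hypothetical capture path, not something taken from the log:

import re
from collections import defaultdict

# Tally the BIOS-e820 ranges printed above, grouped by type.
# "boot.log" is a hypothetical file holding a capture of this console output.
E820 = re.compile(r"BIOS-e820: \[mem 0x([0-9a-f]+)-0x([0-9a-f]+)\] (usable|reserved|ACPI NVS|ACPI data)")

totals = defaultdict(int)
with open("boot.log") as f:
    for line in f:
        m = E820.search(line)
        if m:
            start, end, kind = int(m.group(1), 16), int(m.group(2), 16), m.group(3)
            totals[kind] += end - start + 1  # e820 ranges are inclusive

for kind, size in sorted(totals.items()):
    print(f"{kind:>9}: {size / 2**30:6.2f} GiB")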
Sep 4 00:52:28.913955 kernel: DMI: Supermicro SYS-5019C-MR-PH004/X11SCM-F, BIOS 1.9 09/16/2022 Sep 4 00:52:28.913960 kernel: DMI: Memory slots populated: 2/4 Sep 4 00:52:28.913965 kernel: tsc: Detected 3400.000 MHz processor Sep 4 00:52:28.913970 kernel: tsc: Detected 3399.906 MHz TSC Sep 4 00:52:28.913975 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 4 00:52:28.913980 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 4 00:52:28.913985 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000 Sep 4 00:52:28.913990 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 23), built from 10 variable MTRRs Sep 4 00:52:28.913995 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 4 00:52:28.914000 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000 Sep 4 00:52:28.914005 kernel: Using GB pages for direct mapping Sep 4 00:52:28.914010 kernel: ACPI: Early table checksum verification disabled Sep 4 00:52:28.914015 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM) Sep 4 00:52:28.914022 kernel: ACPI: XSDT 0x000000008C5460C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013) Sep 4 00:52:28.914027 kernel: ACPI: FACP 0x000000008C582670 000114 (v06 01072009 AMI 00010013) Sep 4 00:52:28.914033 kernel: ACPI: DSDT 0x000000008C546268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527) Sep 4 00:52:28.914038 kernel: ACPI: FACS 0x000000008C664F80 000040 Sep 4 00:52:28.914043 kernel: ACPI: APIC 0x000000008C582788 00012C (v04 01072009 AMI 00010013) Sep 4 00:52:28.914048 kernel: ACPI: FPDT 0x000000008C5828B8 000044 (v01 01072009 AMI 00010013) Sep 4 00:52:28.914053 kernel: ACPI: FIDT 0x000000008C582900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013) Sep 4 00:52:28.914058 kernel: ACPI: MCFG 0x000000008C5829A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097) Sep 4 00:52:28.914063 kernel: ACPI: SPMI 0x000000008C5829E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 
00000000) Sep 4 00:52:28.914068 kernel: ACPI: SSDT 0x000000008C582A28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527) Sep 4 00:52:28.914074 kernel: ACPI: SSDT 0x000000008C584548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527) Sep 4 00:52:28.914080 kernel: ACPI: SSDT 0x000000008C587710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527) Sep 4 00:52:28.914085 kernel: ACPI: HPET 0x000000008C589A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 4 00:52:28.914090 kernel: ACPI: SSDT 0x000000008C589A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527) Sep 4 00:52:28.914095 kernel: ACPI: SSDT 0x000000008C58AA28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527) Sep 4 00:52:28.914100 kernel: ACPI: UEFI 0x000000008C58B320 000042 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 4 00:52:28.914105 kernel: ACPI: LPIT 0x000000008C58B368 000094 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 4 00:52:28.914114 kernel: ACPI: SSDT 0x000000008C58B400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527) Sep 4 00:52:28.914120 kernel: ACPI: SSDT 0x000000008C58DBE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527) Sep 4 00:52:28.914147 kernel: ACPI: DBGP 0x000000008C58F0C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013) Sep 4 00:52:28.914168 kernel: ACPI: DBG2 0x000000008C58F100 000054 (v00 SUPERM SMCI--MB 00000002 01000013) Sep 4 00:52:28.914173 kernel: ACPI: SSDT 0x000000008C58F158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527) Sep 4 00:52:28.914194 kernel: ACPI: DMAR 0x000000008C590CC0 000070 (v01 INTEL EDK2 00000002 01000013) Sep 4 00:52:28.914199 kernel: ACPI: SSDT 0x000000008C590D30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527) Sep 4 00:52:28.914204 kernel: ACPI: TPM2 0x000000008C590E78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000) Sep 4 00:52:28.914209 kernel: ACPI: SSDT 0x000000008C590EB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527) Sep 4 00:52:28.914214 kernel: ACPI: WSMT 0x000000008C591C40 000028 (v01 SUPERM 01072009 AMI 00010013) Sep 4 00:52:28.914220 kernel: ACPI: EINJ 0x000000008C591C68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000) Sep 4 00:52:28.914225 kernel: ACPI: ERST 0x000000008C591D98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000) Sep 4 00:52:28.914230 kernel: ACPI: BERT 0x000000008C591FC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000) Sep 4 00:52:28.914235 kernel: ACPI: HEST 0x000000008C591FF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 
00000000) Sep 4 00:52:28.914240 kernel: ACPI: SSDT 0x000000008C592278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221) Sep 4 00:52:28.914245 kernel: ACPI: Reserving FACP table memory at [mem 0x8c582670-0x8c582783] Sep 4 00:52:28.914250 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c546268-0x8c58266b] Sep 4 00:52:28.914255 kernel: ACPI: Reserving FACS table memory at [mem 0x8c664f80-0x8c664fbf] Sep 4 00:52:28.914261 kernel: ACPI: Reserving APIC table memory at [mem 0x8c582788-0x8c5828b3] Sep 4 00:52:28.914266 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c5828b8-0x8c5828fb] Sep 4 00:52:28.914271 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c582900-0x8c58299b] Sep 4 00:52:28.914276 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c5829a0-0x8c5829db] Sep 4 00:52:28.914281 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c5829e0-0x8c582a20] Sep 4 00:52:28.914286 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c582a28-0x8c584543] Sep 4 00:52:28.914291 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c584548-0x8c58770d] Sep 4 00:52:28.914296 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c587710-0x8c589a3a] Sep 4 00:52:28.914301 kernel: ACPI: Reserving HPET table memory at [mem 0x8c589a40-0x8c589a77] Sep 4 00:52:28.914306 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c589a78-0x8c58aa25] Sep 4 00:52:28.914311 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58aa28-0x8c58b31b] Sep 4 00:52:28.914317 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c58b320-0x8c58b361] Sep 4 00:52:28.914322 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c58b368-0x8c58b3fb] Sep 4 00:52:28.914326 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58b400-0x8c58dbdd] Sep 4 00:52:28.914331 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58dbe0-0x8c58f0c1] Sep 4 00:52:28.914336 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c58f0c8-0x8c58f0fb] Sep 4 00:52:28.914341 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c58f100-0x8c58f153] Sep 4 00:52:28.914346 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58f158-0x8c590cbe] Sep 4 00:52:28.914351 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c590cc0-0x8c590d2f] Sep 4 00:52:28.914357 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c590d30-0x8c590e73] Sep 4 00:52:28.914362 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c590e78-0x8c590eab] Sep 4 00:52:28.914367 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c590eb0-0x8c591c3e] Sep 4 00:52:28.914372 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c591c40-0x8c591c67] Sep 4 00:52:28.914377 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c591c68-0x8c591d97] Sep 4 00:52:28.914382 kernel: ACPI: Reserving ERST table memory at [mem 0x8c591d98-0x8c591fc7] Sep 4 00:52:28.914387 kernel: ACPI: Reserving BERT table memory at [mem 0x8c591fc8-0x8c591ff7] Sep 4 00:52:28.914392 kernel: ACPI: Reserving HEST table memory at [mem 0x8c591ff8-0x8c592273] Sep 4 00:52:28.914397 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c592278-0x8c5923d9] Sep 4 00:52:28.914402 kernel: No NUMA configuration found Sep 4 00:52:28.914408 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff] Sep 4 00:52:28.914412 kernel: NODE_DATA(0) allocated [mem 0x86eff8dc0-0x86effffff] Sep 4 00:52:28.914418 kernel: Zone ranges: Sep 4 00:52:28.914423 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 4 00:52:28.914427 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Sep 4 00:52:28.914432 kernel: Normal [mem 
0x0000000100000000-0x000000086effffff] Sep 4 00:52:28.914437 kernel: Device empty Sep 4 00:52:28.914442 kernel: Movable zone start for each node Sep 4 00:52:28.914447 kernel: Early memory node ranges Sep 4 00:52:28.914453 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff] Sep 4 00:52:28.914458 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff] Sep 4 00:52:28.914463 kernel: node 0: [mem 0x0000000040400000-0x0000000081b17fff] Sep 4 00:52:28.914468 kernel: node 0: [mem 0x0000000081b1a000-0x000000008afc4fff] Sep 4 00:52:28.914473 kernel: node 0: [mem 0x000000008c0aa000-0x000000008c232fff] Sep 4 00:52:28.914481 kernel: node 0: [mem 0x000000008eeff000-0x000000008eefffff] Sep 4 00:52:28.914487 kernel: node 0: [mem 0x0000000100000000-0x000000086effffff] Sep 4 00:52:28.914493 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff] Sep 4 00:52:28.914498 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 4 00:52:28.914504 kernel: On node 0, zone DMA: 103 pages in unavailable ranges Sep 4 00:52:28.914510 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges Sep 4 00:52:28.914515 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges Sep 4 00:52:28.914520 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges Sep 4 00:52:28.914525 kernel: On node 0, zone DMA32: 11468 pages in unavailable ranges Sep 4 00:52:28.914531 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges Sep 4 00:52:28.914536 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges Sep 4 00:52:28.914541 kernel: ACPI: PM-Timer IO Port: 0x1808 Sep 4 00:52:28.914547 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Sep 4 00:52:28.914553 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Sep 4 00:52:28.914558 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Sep 4 00:52:28.914563 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Sep 4 00:52:28.914568 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Sep 4 00:52:28.914574 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Sep 4 00:52:28.914579 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Sep 4 00:52:28.914584 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Sep 4 00:52:28.914589 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Sep 4 00:52:28.914596 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Sep 4 00:52:28.914601 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Sep 4 00:52:28.914606 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Sep 4 00:52:28.914612 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Sep 4 00:52:28.914617 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Sep 4 00:52:28.914622 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Sep 4 00:52:28.914627 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Sep 4 00:52:28.914633 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119 Sep 4 00:52:28.914638 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 4 00:52:28.914644 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 4 00:52:28.914650 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 4 00:52:28.914655 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Sep 4 00:52:28.914660 kernel: TSC deadline timer available Sep 4 00:52:28.914666 kernel: CPU topo: Max. logical packages: 1 Sep 4 00:52:28.914671 kernel: CPU topo: Max. 
logical dies: 1 Sep 4 00:52:28.914676 kernel: CPU topo: Max. dies per package: 1 Sep 4 00:52:28.914681 kernel: CPU topo: Max. threads per core: 2 Sep 4 00:52:28.914687 kernel: CPU topo: Num. cores per package: 8 Sep 4 00:52:28.914692 kernel: CPU topo: Num. threads per package: 16 Sep 4 00:52:28.914698 kernel: CPU topo: Allowing 16 present CPUs plus 0 hotplug CPUs Sep 4 00:52:28.914704 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices Sep 4 00:52:28.914709 kernel: Booting paravirtualized kernel on bare hardware Sep 4 00:52:28.914714 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 4 00:52:28.914720 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Sep 4 00:52:28.914725 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Sep 4 00:52:28.914730 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Sep 4 00:52:28.914736 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Sep 4 00:52:28.914742 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=c7fa427551c105672074cbcbe7e23c997f471a6e879d708e8d6cbfad2147666e Sep 4 00:52:28.914748 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 4 00:52:28.914754 kernel: random: crng init done Sep 4 00:52:28.914759 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear) Sep 4 00:52:28.914764 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Sep 4 00:52:28.914770 kernel: Fallback order for Node 0: 0 Sep 4 00:52:28.914775 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8363237 Sep 4 00:52:28.914780 kernel: Policy zone: Normal Sep 4 00:52:28.914785 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 4 00:52:28.914792 kernel: software IO TLB: area num 16. Sep 4 00:52:28.914797 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Sep 4 00:52:28.914802 kernel: ftrace: allocating 40099 entries in 157 pages Sep 4 00:52:28.914808 kernel: ftrace: allocated 157 pages with 5 groups Sep 4 00:52:28.914813 kernel: Dynamic Preempt: voluntary Sep 4 00:52:28.914818 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 4 00:52:28.914824 kernel: rcu: RCU event tracing is enabled. Sep 4 00:52:28.914830 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Sep 4 00:52:28.914835 kernel: Trampoline variant of Tasks RCU enabled. Sep 4 00:52:28.914841 kernel: Rude variant of Tasks RCU enabled. Sep 4 00:52:28.914847 kernel: Tracing variant of Tasks RCU enabled. Sep 4 00:52:28.914852 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 4 00:52:28.914857 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Sep 4 00:52:28.914863 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 4 00:52:28.914868 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 4 00:52:28.914874 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. 
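The kernel command line echoed above (and repeated later by dracut) is a flat string of bare flags and key=value options. A minimal sketch for splitting it and pulling out values such as flatcar.oem.id or verity.usrhash, assuming it is read from /proc/cmdline on the running machine:

import shlex

def parse_cmdline(cmdline: str):
    """Split a kernel command line into bare flags and key=value options."""
    flags, options = [], {}
    for token in shlex.split(cmdline):
        if "=" in token:
            key, _, value = token.partition("=")
            options[key] = value  # later duplicates (e.g. the second console=) win
        else:
            flags.append(token)
    return flags, options

# On the running system the same string is exposed via /proc/cmdline.
with open("/proc/cmdline") as f:
    flags, options = parse_cmdline(f.read())

print("flags:   ", flags)
print("oem id:  ", options.get("flatcar.oem.id"))
print("usr hash:", options.get("verity.usrhash"))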
Sep 4 00:52:28.914879 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16 Sep 4 00:52:28.914884 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 4 00:52:28.914890 kernel: Console: colour VGA+ 80x25 Sep 4 00:52:28.914896 kernel: printk: legacy console [tty0] enabled Sep 4 00:52:28.914901 kernel: printk: legacy console [ttyS1] enabled Sep 4 00:52:28.914906 kernel: ACPI: Core revision 20240827 Sep 4 00:52:28.914912 kernel: hpet: HPET dysfunctional in PC10. Force disabled. Sep 4 00:52:28.914917 kernel: APIC: Switch to symmetric I/O mode setup Sep 4 00:52:28.914922 kernel: DMAR: Host address width 39 Sep 4 00:52:28.914928 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1 Sep 4 00:52:28.914933 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da Sep 4 00:52:28.914939 kernel: DMAR: RMRR base: 0x0000008cf10000 end: 0x0000008d159fff Sep 4 00:52:28.914945 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0 Sep 4 00:52:28.914950 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000 Sep 4 00:52:28.914956 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping. Sep 4 00:52:28.914961 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode Sep 4 00:52:28.914966 kernel: x2apic enabled Sep 4 00:52:28.914971 kernel: APIC: Switched APIC routing to: cluster x2apic Sep 4 00:52:28.914977 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns Sep 4 00:52:28.914982 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906) Sep 4 00:52:28.914988 kernel: CPU0: Thermal monitoring enabled (TM1) Sep 4 00:52:28.914994 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Sep 4 00:52:28.914999 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Sep 4 00:52:28.915004 kernel: process: using mwait in idle threads Sep 4 00:52:28.915009 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 4 00:52:28.915015 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Sep 4 00:52:28.915020 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Sep 4 00:52:28.915025 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Sep 4 00:52:28.915030 kernel: RETBleed: Mitigation: Enhanced IBRS Sep 4 00:52:28.915036 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 4 00:52:28.915041 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 4 00:52:28.915047 kernel: TAA: Mitigation: TSX disabled Sep 4 00:52:28.915052 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers Sep 4 00:52:28.915057 kernel: SRBDS: Mitigation: Microcode Sep 4 00:52:28.915063 kernel: GDS: Vulnerable: No microcode Sep 4 00:52:28.915068 kernel: active return thunk: its_return_thunk Sep 4 00:52:28.915073 kernel: ITS: Mitigation: Aligned branch/return thunks Sep 4 00:52:28.915078 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 4 00:52:28.915083 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 4 00:52:28.915089 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 4 00:52:28.915094 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers' Sep 4 00:52:28.915099 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR' Sep 4 00:52:28.915105 kernel: x86/fpu: xstate_offset[2]: 576, 
xstate_sizes[2]: 256 Sep 4 00:52:28.915119 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64 Sep 4 00:52:28.915125 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64 Sep 4 00:52:28.915155 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format. Sep 4 00:52:28.915161 kernel: Freeing SMP alternatives memory: 32K Sep 4 00:52:28.915166 kernel: pid_max: default: 32768 minimum: 301 Sep 4 00:52:28.915171 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 4 00:52:28.915177 kernel: landlock: Up and running. Sep 4 00:52:28.915196 kernel: SELinux: Initializing. Sep 4 00:52:28.915201 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 4 00:52:28.915207 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 4 00:52:28.915213 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Sep 4 00:52:28.915219 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver. Sep 4 00:52:28.915224 kernel: ... version: 4 Sep 4 00:52:28.915229 kernel: ... bit width: 48 Sep 4 00:52:28.915235 kernel: ... generic registers: 4 Sep 4 00:52:28.915240 kernel: ... value mask: 0000ffffffffffff Sep 4 00:52:28.915245 kernel: ... max period: 00007fffffffffff Sep 4 00:52:28.915250 kernel: ... fixed-purpose events: 3 Sep 4 00:52:28.915256 kernel: ... event mask: 000000070000000f Sep 4 00:52:28.915261 kernel: signal: max sigframe size: 2032 Sep 4 00:52:28.915267 kernel: Estimated ratio of average max frequency by base frequency (times 1024): 1445 Sep 4 00:52:28.915273 kernel: rcu: Hierarchical SRCU implementation. Sep 4 00:52:28.915278 kernel: rcu: Max phase no-delay instances is 400. Sep 4 00:52:28.915283 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level Sep 4 00:52:28.915289 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter. Sep 4 00:52:28.915294 kernel: smp: Bringing up secondary CPUs ... Sep 4 00:52:28.915299 kernel: smpboot: x86: Booting SMP configuration: Sep 4 00:52:28.915305 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10 #11 #12 #13 #14 #15 Sep 4 00:52:28.915310 kernel: Transient Scheduler Attacks: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. 
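The mitigation lines above (Spectre V1/V2, TAA, MMIO Stale Data, SRBDS, GDS, ITS) record what the kernel selected for this Xeon E-2278G. A small sketch, assuming a reasonably recent kernel, that re-reads the same status from sysfs at runtime:

from pathlib import Path

# Each file under this directory reports the current mitigation state for one
# CPU vulnerability, matching the boot-time lines above.
vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
for entry in sorted(vuln_dir.iterdir()):
    print(f"{entry.name:>28}: {entry.read_text().strip()}")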
Sep 4 00:52:28.915317 kernel: smp: Brought up 1 node, 16 CPUs Sep 4 00:52:28.915322 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS) Sep 4 00:52:28.915327 kernel: Memory: 32697172K/33452948K available (14336K kernel code, 2428K rwdata, 9956K rodata, 53832K init, 1088K bss, 730476K reserved, 0K cma-reserved) Sep 4 00:52:28.915333 kernel: devtmpfs: initialized Sep 4 00:52:28.915338 kernel: x86/mm: Memory block size: 128MB Sep 4 00:52:28.915343 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x81b18000-0x81b18fff] (4096 bytes) Sep 4 00:52:28.915349 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c233000-0x8c664fff] (4399104 bytes) Sep 4 00:52:28.915354 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 4 00:52:28.915360 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Sep 4 00:52:28.915366 kernel: pinctrl core: initialized pinctrl subsystem Sep 4 00:52:28.915371 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 4 00:52:28.915376 kernel: audit: initializing netlink subsys (disabled) Sep 4 00:52:28.915382 kernel: audit: type=2000 audit(1756947141.041:1): state=initialized audit_enabled=0 res=1 Sep 4 00:52:28.915387 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 4 00:52:28.915392 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 4 00:52:28.915397 kernel: cpuidle: using governor menu Sep 4 00:52:28.915402 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 4 00:52:28.915409 kernel: dca service started, version 1.12.1 Sep 4 00:52:28.915414 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Sep 4 00:52:28.915419 kernel: PCI: Using configuration type 1 for base access Sep 4 00:52:28.915425 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Sep 4 00:52:28.915430 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 4 00:52:28.915435 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 4 00:52:28.915441 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 4 00:52:28.915446 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 4 00:52:28.915451 kernel: ACPI: Added _OSI(Module Device) Sep 4 00:52:28.915457 kernel: ACPI: Added _OSI(Processor Device) Sep 4 00:52:28.915463 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 4 00:52:28.915468 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded Sep 4 00:52:28.915473 kernel: ACPI: Dynamic OEM Table Load: Sep 4 00:52:28.915478 kernel: ACPI: SSDT 0xFFFF9B76820D5800 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527) Sep 4 00:52:28.915484 kernel: ACPI: Dynamic OEM Table Load: Sep 4 00:52:28.915489 kernel: ACPI: SSDT 0xFFFF9B76821A7000 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527) Sep 4 00:52:28.915494 kernel: ACPI: Dynamic OEM Table Load: Sep 4 00:52:28.915500 kernel: ACPI: SSDT 0xFFFF9B7680246200 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527) Sep 4 00:52:28.915506 kernel: ACPI: Dynamic OEM Table Load: Sep 4 00:52:28.915511 kernel: ACPI: SSDT 0xFFFF9B76821A2800 0005FC (v02 PmRef ApIst 00003000 INTL 20160527) Sep 4 00:52:28.915516 kernel: ACPI: Dynamic OEM Table Load: Sep 4 00:52:28.915522 kernel: ACPI: SSDT 0xFFFF9B76801A4000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527) Sep 4 00:52:28.915527 kernel: ACPI: Dynamic OEM Table Load: Sep 4 00:52:28.915532 kernel: ACPI: SSDT 0xFFFF9B76820D2800 00030A (v02 PmRef ApCst 00003000 INTL 20160527) Sep 4 00:52:28.915537 kernel: ACPI: Interpreter enabled Sep 4 00:52:28.915543 kernel: ACPI: PM: (supports S0 S5) Sep 4 00:52:28.915548 kernel: ACPI: Using IOAPIC for interrupt routing Sep 4 00:52:28.915553 kernel: HEST: Enabling Firmware First mode for corrected errors. Sep 4 00:52:28.915559 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14. Sep 4 00:52:28.915565 kernel: HEST: Table parsing has been initialized. Sep 4 00:52:28.915570 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. 
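The static ACPI tables enumerated earlier and the dynamically loaded SSDTs above are exported by the kernel under /sys/firmware/acpi/tables. A quick sketch that lists their names and sizes (reading the table contents themselves generally requires root):

from pathlib import Path

# List the ACPI tables the kernel exposes; dynamically loaded SSDTs appear
# alongside the firmware-provided ones.
tables_dir = Path("/sys/firmware/acpi/tables")
for table in sorted(tables_dir.iterdir()):
    if table.is_file():
        print(f"{table.name:<12} {table.stat().st_size:>8} bytes")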
Sep 4 00:52:28.915576 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 4 00:52:28.915581 kernel: PCI: Using E820 reservations for host bridge windows Sep 4 00:52:28.915586 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F Sep 4 00:52:28.915592 kernel: ACPI: \_SB_.PCI0.XDCI.USBC: New power resource Sep 4 00:52:28.915597 kernel: ACPI: \_SB_.PCI0.SAT0.VOL0.V0PR: New power resource Sep 4 00:52:28.915602 kernel: ACPI: \_SB_.PCI0.SAT0.VOL1.V1PR: New power resource Sep 4 00:52:28.915609 kernel: ACPI: \_SB_.PCI0.SAT0.VOL2.V2PR: New power resource Sep 4 00:52:28.915614 kernel: ACPI: \_SB_.PCI0.CNVW.WRST: New power resource Sep 4 00:52:28.915620 kernel: ACPI: \_TZ_.FN00: New power resource Sep 4 00:52:28.915625 kernel: ACPI: \_TZ_.FN01: New power resource Sep 4 00:52:28.915630 kernel: ACPI: \_TZ_.FN02: New power resource Sep 4 00:52:28.915635 kernel: ACPI: \_TZ_.FN03: New power resource Sep 4 00:52:28.915641 kernel: ACPI: \_TZ_.FN04: New power resource Sep 4 00:52:28.915646 kernel: ACPI: \PIN_: New power resource Sep 4 00:52:28.915651 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe]) Sep 4 00:52:28.915727 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 4 00:52:28.915778 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER] Sep 4 00:52:28.915825 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR] Sep 4 00:52:28.915833 kernel: PCI host bridge to bus 0000:00 Sep 4 00:52:28.915882 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 4 00:52:28.915925 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 4 00:52:28.915969 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 4 00:52:28.916010 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window] Sep 4 00:52:28.916051 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window] Sep 4 00:52:28.916093 kernel: pci_bus 0000:00: root bus resource [bus 00-fe] Sep 4 00:52:28.916195 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000 conventional PCI endpoint Sep 4 00:52:28.916267 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400 PCIe Root Port Sep 4 00:52:28.916319 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 4 00:52:28.916367 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Sep 4 00:52:28.916414 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Sep 4 00:52:28.916461 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold Sep 4 00:52:28.916515 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000 conventional PCI endpoint Sep 4 00:52:28.916562 kernel: pci 0000:00:08.0: BAR 0 [mem 0x95520000-0x95520fff 64bit] Sep 4 00:52:28.916613 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000 conventional PCI endpoint Sep 4 00:52:28.916662 kernel: pci 0000:00:12.0: BAR 0 [mem 0x9551f000-0x9551ffff 64bit] Sep 4 00:52:28.916714 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330 conventional PCI endpoint Sep 4 00:52:28.916761 kernel: pci 0000:00:14.0: BAR 0 [mem 0x95500000-0x9550ffff 64bit] Sep 4 00:52:28.916807 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold Sep 4 00:52:28.916857 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000 conventional PCI endpoint Sep 4 00:52:28.916904 kernel: pci 0000:00:14.2: BAR 0 [mem 0x95512000-0x95513fff 64bit] Sep 4 00:52:28.916953 kernel: pci 0000:00:14.2: 
BAR 2 [mem 0x9551e000-0x9551efff 64bit] Sep 4 00:52:28.917010 kernel: pci 0000:00:14.5: [8086:a375] type 00 class 0x080501 conventional PCI endpoint Sep 4 00:52:28.917060 kernel: pci 0000:00:14.5: BAR 0 [mem 0x9551d000-0x9551dfff 64bit] Sep 4 00:52:28.917113 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000 conventional PCI endpoint Sep 4 00:52:28.917237 kernel: pci 0000:00:15.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 4 00:52:28.917290 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000 conventional PCI endpoint Sep 4 00:52:28.917339 kernel: pci 0000:00:15.1: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 4 00:52:28.917390 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000 conventional PCI endpoint Sep 4 00:52:28.917436 kernel: pci 0000:00:16.0: BAR 0 [mem 0x9551a000-0x9551afff 64bit] Sep 4 00:52:28.917482 kernel: pci 0000:00:16.0: PME# supported from D3hot Sep 4 00:52:28.917532 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000 conventional PCI endpoint Sep 4 00:52:28.917578 kernel: pci 0000:00:16.1: BAR 0 [mem 0x95519000-0x95519fff 64bit] Sep 4 00:52:28.917627 kernel: pci 0000:00:16.1: PME# supported from D3hot Sep 4 00:52:28.917676 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000 conventional PCI endpoint Sep 4 00:52:28.917723 kernel: pci 0000:00:16.4: BAR 0 [mem 0x95518000-0x95518fff 64bit] Sep 4 00:52:28.917769 kernel: pci 0000:00:16.4: PME# supported from D3hot Sep 4 00:52:28.917819 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601 conventional PCI endpoint Sep 4 00:52:28.917867 kernel: pci 0000:00:17.0: BAR 0 [mem 0x95510000-0x95511fff] Sep 4 00:52:28.917914 kernel: pci 0000:00:17.0: BAR 1 [mem 0x95517000-0x955170ff] Sep 4 00:52:28.917960 kernel: pci 0000:00:17.0: BAR 2 [io 0x6050-0x6057] Sep 4 00:52:28.918007 kernel: pci 0000:00:17.0: BAR 3 [io 0x6040-0x6043] Sep 4 00:52:28.918053 kernel: pci 0000:00:17.0: BAR 4 [io 0x6020-0x603f] Sep 4 00:52:28.918099 kernel: pci 0000:00:17.0: BAR 5 [mem 0x95516000-0x955167ff] Sep 4 00:52:28.918205 kernel: pci 0000:00:17.0: PME# supported from D3hot Sep 4 00:52:28.918259 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400 PCIe Root Port Sep 4 00:52:28.918307 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Sep 4 00:52:28.918354 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold Sep 4 00:52:28.918407 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400 PCIe Root Port Sep 4 00:52:28.918455 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Sep 4 00:52:28.918502 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Sep 4 00:52:28.918551 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Sep 4 00:52:28.918598 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold Sep 4 00:52:28.918652 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400 PCIe Root Port Sep 4 00:52:28.918700 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Sep 4 00:52:28.918747 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Sep 4 00:52:28.918794 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Sep 4 00:52:28.918840 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold Sep 4 00:52:28.918892 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400 PCIe Root Port Sep 4 00:52:28.918941 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Sep 4 00:52:28.918989 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold Sep 4 00:52:28.919042 kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400 PCIe Root Port Sep 4 
00:52:28.919089 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Sep 4 00:52:28.919168 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Sep 4 00:52:28.919248 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Sep 4 00:52:28.919296 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold Sep 4 00:52:28.919348 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000 conventional PCI endpoint Sep 4 00:52:28.919395 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Sep 4 00:52:28.919446 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100 conventional PCI endpoint Sep 4 00:52:28.919497 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500 conventional PCI endpoint Sep 4 00:52:28.919544 kernel: pci 0000:00:1f.4: BAR 0 [mem 0x95514000-0x955140ff 64bit] Sep 4 00:52:28.919590 kernel: pci 0000:00:1f.4: BAR 4 [io 0xefa0-0xefbf] Sep 4 00:52:28.919645 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000 conventional PCI endpoint Sep 4 00:52:28.919692 kernel: pci 0000:00:1f.5: BAR 0 [mem 0xfe010000-0xfe010fff] Sep 4 00:52:28.919746 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Sep 4 00:52:28.919794 kernel: pci 0000:01:00.0: BAR 0 [mem 0x92000000-0x93ffffff 64bit pref] Sep 4 00:52:28.919843 kernel: pci 0000:01:00.0: ROM [mem 0x95200000-0x952fffff pref] Sep 4 00:52:28.919890 kernel: pci 0000:01:00.0: PME# supported from D3cold Sep 4 00:52:28.919937 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Sep 4 00:52:28.919987 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Sep 4 00:52:28.920040 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Sep 4 00:52:28.920089 kernel: pci 0000:01:00.1: BAR 0 [mem 0x90000000-0x91ffffff 64bit pref] Sep 4 00:52:28.920200 kernel: pci 0000:01:00.1: ROM [mem 0x95100000-0x951fffff pref] Sep 4 00:52:28.920265 kernel: pci 0000:01:00.1: PME# supported from D3cold Sep 4 00:52:28.920313 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Sep 4 00:52:28.920360 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Sep 4 00:52:28.920411 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 4 00:52:28.920458 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Sep 4 00:52:28.920510 kernel: pci 0000:03:00.0: working around ROM BAR overlap defect Sep 4 00:52:28.920558 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 4 00:52:28.920605 kernel: pci 0000:03:00.0: BAR 0 [mem 0x95400000-0x9547ffff] Sep 4 00:52:28.920653 kernel: pci 0000:03:00.0: BAR 2 [io 0x5000-0x501f] Sep 4 00:52:28.920700 kernel: pci 0000:03:00.0: BAR 3 [mem 0x95480000-0x95483fff] Sep 4 00:52:28.920749 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Sep 4 00:52:28.920797 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Sep 4 00:52:28.920849 kernel: pci 0000:04:00.0: working around ROM BAR overlap defect Sep 4 00:52:28.920897 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 4 00:52:28.920944 kernel: pci 0000:04:00.0: BAR 0 [mem 0x95300000-0x9537ffff] Sep 4 00:52:28.920992 kernel: pci 0000:04:00.0: BAR 2 [io 0x4000-0x401f] Sep 4 00:52:28.921039 kernel: pci 0000:04:00.0: BAR 3 [mem 0x95380000-0x95383fff] Sep 4 00:52:28.921089 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold Sep 4 00:52:28.921201 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Sep 4 00:52:28.921279 kernel: pci 
0000:00:1c.0: PCI bridge to [bus 05] Sep 4 00:52:28.921331 kernel: pci 0000:06:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Sep 4 00:52:28.921380 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Sep 4 00:52:28.921427 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Sep 4 00:52:28.921475 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Sep 4 00:52:28.921525 kernel: pci 0000:06:00.0: enabling Extended Tags Sep 4 00:52:28.921573 kernel: pci 0000:06:00.0: supports D1 D2 Sep 4 00:52:28.921621 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 4 00:52:28.921667 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Sep 4 00:52:28.921721 kernel: pci_bus 0000:07: extended config space not accessible Sep 4 00:52:28.921778 kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint Sep 4 00:52:28.921884 kernel: pci 0000:07:00.0: BAR 0 [mem 0x94000000-0x94ffffff] Sep 4 00:52:28.921937 kernel: pci 0000:07:00.0: BAR 1 [mem 0x95000000-0x9501ffff] Sep 4 00:52:28.921987 kernel: pci 0000:07:00.0: BAR 2 [io 0x3000-0x307f] Sep 4 00:52:28.922037 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 4 00:52:28.922087 kernel: pci 0000:07:00.0: supports D1 D2 Sep 4 00:52:28.922179 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 4 00:52:28.922295 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Sep 4 00:52:28.922306 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Sep 4 00:52:28.922314 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Sep 4 00:52:28.922320 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Sep 4 00:52:28.922340 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Sep 4 00:52:28.922346 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Sep 4 00:52:28.922351 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Sep 4 00:52:28.922357 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Sep 4 00:52:28.922363 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Sep 4 00:52:28.922368 kernel: iommu: Default domain type: Translated Sep 4 00:52:28.922374 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 4 00:52:28.922380 kernel: PCI: Using ACPI for IRQ routing Sep 4 00:52:28.922386 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 4 00:52:28.922393 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Sep 4 00:52:28.922399 kernel: e820: reserve RAM buffer [mem 0x81b18000-0x83ffffff] Sep 4 00:52:28.922405 kernel: e820: reserve RAM buffer [mem 0x8afc5000-0x8bffffff] Sep 4 00:52:28.922434 kernel: e820: reserve RAM buffer [mem 0x8c233000-0x8fffffff] Sep 4 00:52:28.922443 kernel: e820: reserve RAM buffer [mem 0x8ef00000-0x8fffffff] Sep 4 00:52:28.922451 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff] Sep 4 00:52:28.922531 kernel: pci 0000:07:00.0: vgaarb: setting as boot VGA device Sep 4 00:52:28.922621 kernel: pci 0000:07:00.0: vgaarb: bridge control possible Sep 4 00:52:28.922680 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 4 00:52:28.922688 kernel: vgaarb: loaded Sep 4 00:52:28.922695 kernel: clocksource: Switched to clocksource tsc-early Sep 4 00:52:28.922701 kernel: VFS: Disk quotas dquot_6.6.0 Sep 4 00:52:28.922707 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 4 00:52:28.922713 kernel: pnp: PnP ACPI init Sep 4 00:52:28.922779 kernel: system 00:00: [mem 
0x40000000-0x403fffff] has been reserved Sep 4 00:52:28.922830 kernel: pnp 00:02: [dma 0 disabled] Sep 4 00:52:28.922881 kernel: pnp 00:03: [dma 0 disabled] Sep 4 00:52:28.922932 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Sep 4 00:52:28.922979 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Sep 4 00:52:28.923047 kernel: system 00:05: [mem 0xfed10000-0xfed17fff] has been reserved Sep 4 00:52:28.923093 kernel: system 00:05: [mem 0xfed18000-0xfed18fff] has been reserved Sep 4 00:52:28.923161 kernel: system 00:05: [mem 0xfed19000-0xfed19fff] has been reserved Sep 4 00:52:28.923206 kernel: system 00:05: [mem 0xe0000000-0xefffffff] has been reserved Sep 4 00:52:28.923250 kernel: system 00:05: [mem 0xfed20000-0xfed3ffff] has been reserved Sep 4 00:52:28.923293 kernel: system 00:05: [mem 0xfed90000-0xfed93fff] could not be reserved Sep 4 00:52:28.923337 kernel: system 00:05: [mem 0xfed45000-0xfed8ffff] has been reserved Sep 4 00:52:28.923380 kernel: system 00:05: [mem 0xfee00000-0xfeefffff] could not be reserved Sep 4 00:52:28.923499 kernel: system 00:06: [io 0x1800-0x18fe] could not be reserved Sep 4 00:52:28.923642 kernel: system 00:06: [mem 0xfd000000-0xfd69ffff] has been reserved Sep 4 00:52:28.923689 kernel: system 00:06: [mem 0xfd6c0000-0xfd6cffff] has been reserved Sep 4 00:52:28.923738 kernel: system 00:06: [mem 0xfd6f0000-0xfdffffff] has been reserved Sep 4 00:52:28.923843 kernel: system 00:06: [mem 0xfe000000-0xfe01ffff] could not be reserved Sep 4 00:52:28.923905 kernel: system 00:06: [mem 0xfe200000-0xfe7fffff] has been reserved Sep 4 00:52:28.923956 kernel: system 00:06: [mem 0xff000000-0xffffffff] has been reserved Sep 4 00:52:28.924005 kernel: system 00:07: [io 0x2000-0x20fe] has been reserved Sep 4 00:52:28.924016 kernel: pnp: PnP ACPI: found 9 devices Sep 4 00:52:28.924022 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 4 00:52:28.924028 kernel: NET: Registered PF_INET protocol family Sep 4 00:52:28.924034 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 4 00:52:28.924040 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Sep 4 00:52:28.924046 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 4 00:52:28.924052 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 4 00:52:28.924059 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Sep 4 00:52:28.924065 kernel: TCP: Hash tables configured (established 262144 bind 65536) Sep 4 00:52:28.924072 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 4 00:52:28.924078 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 4 00:52:28.924084 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 4 00:52:28.924090 kernel: NET: Registered PF_XDP protocol family Sep 4 00:52:28.924171 kernel: pci 0000:00:15.0: BAR 0 [mem 0x95515000-0x95515fff 64bit]: assigned Sep 4 00:52:28.924223 kernel: pci 0000:00:15.1: BAR 0 [mem 0x9551b000-0x9551bfff 64bit]: assigned Sep 4 00:52:28.924275 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x9551c000-0x9551cfff 64bit]: assigned Sep 4 00:52:28.924329 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Sep 4 00:52:28.924381 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 4 00:52:28.924433 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 
64bit pref]: can't assign; no space Sep 4 00:52:28.924551 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 4 00:52:28.924680 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 4 00:52:28.924731 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Sep 4 00:52:28.924784 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Sep 4 00:52:28.924834 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Sep 4 00:52:28.924898 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Sep 4 00:52:28.924946 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Sep 4 00:52:28.924995 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Sep 4 00:52:28.925043 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Sep 4 00:52:28.925092 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Sep 4 00:52:28.925193 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Sep 4 00:52:28.925240 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Sep 4 00:52:28.925288 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Sep 4 00:52:28.925338 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Sep 4 00:52:28.925387 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Sep 4 00:52:28.925434 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Sep 4 00:52:28.925481 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Sep 4 00:52:28.925527 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Sep 4 00:52:28.925572 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Sep 4 00:52:28.925613 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 4 00:52:28.925654 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 4 00:52:28.925695 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 4 00:52:28.925738 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window] Sep 4 00:52:28.925779 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Sep 4 00:52:28.925826 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff] Sep 4 00:52:28.925870 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Sep 4 00:52:28.925916 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff] Sep 4 00:52:28.925960 kernel: pci_bus 0000:03: resource 1 [mem 0x95400000-0x954fffff] Sep 4 00:52:28.926008 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Sep 4 00:52:28.926051 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff] Sep 4 00:52:28.926100 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff] Sep 4 00:52:28.926167 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff] Sep 4 00:52:28.926240 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Sep 4 00:52:28.926284 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff] Sep 4 00:52:28.926292 kernel: PCI: CLS 64 bytes, default 64 Sep 4 00:52:28.926300 kernel: DMAR: No ATSR found Sep 4 00:52:28.926305 kernel: DMAR: No SATC found Sep 4 00:52:28.926311 kernel: DMAR: dmar0: Using Queued invalidation Sep 4 00:52:28.926359 kernel: pci 0000:00:00.0: Adding to iommu group 0 Sep 4 00:52:28.926407 kernel: pci 0000:00:01.0: Adding to iommu group 1 Sep 4 00:52:28.926485 kernel: pci 0000:00:08.0: Adding to iommu group 2 Sep 4 00:52:28.926532 kernel: pci 0000:00:12.0: Adding to iommu group 3 Sep 4 00:52:28.926579 kernel: pci 0000:00:14.0: Adding to iommu group 4 Sep 4 00:52:28.926627 kernel: pci 
0000:00:14.2: Adding to iommu group 4 Sep 4 00:52:28.926675 kernel: pci 0000:00:14.5: Adding to iommu group 4 Sep 4 00:52:28.926721 kernel: pci 0000:00:15.0: Adding to iommu group 5 Sep 4 00:52:28.926768 kernel: pci 0000:00:15.1: Adding to iommu group 5 Sep 4 00:52:28.926814 kernel: pci 0000:00:16.0: Adding to iommu group 6 Sep 4 00:52:28.926861 kernel: pci 0000:00:16.1: Adding to iommu group 6 Sep 4 00:52:28.926907 kernel: pci 0000:00:16.4: Adding to iommu group 6 Sep 4 00:52:28.926953 kernel: pci 0000:00:17.0: Adding to iommu group 7 Sep 4 00:52:28.927000 kernel: pci 0000:00:1b.0: Adding to iommu group 8 Sep 4 00:52:28.927049 kernel: pci 0000:00:1b.4: Adding to iommu group 9 Sep 4 00:52:28.927096 kernel: pci 0000:00:1b.5: Adding to iommu group 10 Sep 4 00:52:28.927179 kernel: pci 0000:00:1c.0: Adding to iommu group 11 Sep 4 00:52:28.927269 kernel: pci 0000:00:1c.3: Adding to iommu group 12 Sep 4 00:52:28.927316 kernel: pci 0000:00:1e.0: Adding to iommu group 13 Sep 4 00:52:28.927363 kernel: pci 0000:00:1f.0: Adding to iommu group 14 Sep 4 00:52:28.927410 kernel: pci 0000:00:1f.4: Adding to iommu group 14 Sep 4 00:52:28.927458 kernel: pci 0000:00:1f.5: Adding to iommu group 14 Sep 4 00:52:28.927507 kernel: pci 0000:01:00.0: Adding to iommu group 1 Sep 4 00:52:28.927555 kernel: pci 0000:01:00.1: Adding to iommu group 1 Sep 4 00:52:28.927603 kernel: pci 0000:03:00.0: Adding to iommu group 15 Sep 4 00:52:28.927651 kernel: pci 0000:04:00.0: Adding to iommu group 16 Sep 4 00:52:28.927699 kernel: pci 0000:06:00.0: Adding to iommu group 17 Sep 4 00:52:28.927749 kernel: pci 0000:07:00.0: Adding to iommu group 17 Sep 4 00:52:28.927758 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Sep 4 00:52:28.927764 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Sep 4 00:52:28.927771 kernel: software IO TLB: mapped [mem 0x0000000086fc5000-0x000000008afc5000] (64MB) Sep 4 00:52:28.927777 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer Sep 4 00:52:28.927782 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Sep 4 00:52:28.927788 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Sep 4 00:52:28.927794 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Sep 4 00:52:28.927873 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Sep 4 00:52:28.927882 kernel: Initialise system trusted keyrings Sep 4 00:52:28.927888 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Sep 4 00:52:28.927895 kernel: Key type asymmetric registered Sep 4 00:52:28.927901 kernel: Asymmetric key parser 'x509' registered Sep 4 00:52:28.927906 kernel: tsc: Refined TSC clocksource calibration: 3407.999 MHz Sep 4 00:52:28.927912 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd336761, max_idle_ns: 440795243819 ns Sep 4 00:52:28.927917 kernel: clocksource: Switched to clocksource tsc Sep 4 00:52:28.927923 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 4 00:52:28.927929 kernel: io scheduler mq-deadline registered Sep 4 00:52:28.927934 kernel: io scheduler kyber registered Sep 4 00:52:28.927940 kernel: io scheduler bfq registered Sep 4 00:52:28.927987 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121 Sep 4 00:52:28.928035 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122 Sep 4 00:52:28.928082 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123 Sep 4 00:52:28.928158 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124 Sep 4 00:52:28.928252 
kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 125 Sep 4 00:52:28.928299 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126 Sep 4 00:52:28.928351 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Sep 4 00:52:28.928360 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C) Sep 4 00:52:28.928368 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Sep 4 00:52:28.928373 kernel: pstore: Using crash dump compression: deflate Sep 4 00:52:28.928379 kernel: pstore: Registered erst as persistent store backend Sep 4 00:52:28.928385 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 4 00:52:28.928390 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 4 00:52:28.928396 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 4 00:52:28.928401 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Sep 4 00:52:28.928407 kernel: hpet_acpi_add: no address or irqs in _CRS Sep 4 00:52:28.928455 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Sep 4 00:52:28.928466 kernel: i8042: PNP: No PS/2 controller found. Sep 4 00:52:28.928508 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Sep 4 00:52:28.928552 kernel: rtc_cmos rtc_cmos: registered as rtc0 Sep 4 00:52:28.928594 kernel: rtc_cmos rtc_cmos: setting system clock to 2025-09-04T00:52:27 UTC (1756947147) Sep 4 00:52:28.928637 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Sep 4 00:52:28.928645 kernel: intel_pstate: Intel P-state driver initializing Sep 4 00:52:28.928651 kernel: intel_pstate: Disabling energy efficiency optimization Sep 4 00:52:28.928658 kernel: intel_pstate: HWP enabled Sep 4 00:52:28.928664 kernel: NET: Registered PF_INET6 protocol family Sep 4 00:52:28.928670 kernel: Segment Routing with IPv6 Sep 4 00:52:28.928675 kernel: In-situ OAM (IOAM) with IPv6 Sep 4 00:52:28.928681 kernel: NET: Registered PF_PACKET protocol family Sep 4 00:52:28.928686 kernel: Key type dns_resolver registered Sep 4 00:52:28.928692 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance' Sep 4 00:52:28.928697 kernel: microcode: Current revision: 0x000000f4 Sep 4 00:52:28.928703 kernel: IPI shorthand broadcast: enabled Sep 4 00:52:28.928709 kernel: sched_clock: Marking stable (3651156048, 1495910475)->(6758645315, -1611578792) Sep 4 00:52:28.928715 kernel: registered taskstats version 1 Sep 4 00:52:28.928721 kernel: Loading compiled-in X.509 certificates Sep 4 00:52:28.928726 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.44-flatcar: 247a8159a15e16f8eb89737aa66cd9cf9bbb3c10' Sep 4 00:52:28.928732 kernel: Demotion targets for Node 0: null Sep 4 00:52:28.928737 kernel: Key type .fscrypt registered Sep 4 00:52:28.928743 kernel: Key type fscrypt-provisioning registered Sep 4 00:52:28.928748 kernel: ima: Allocated hash algorithm: sha1 Sep 4 00:52:28.928754 kernel: ima: No architecture policies found Sep 4 00:52:28.928760 kernel: clk: Disabling unused clocks Sep 4 00:52:28.928766 kernel: Warning: unable to open an initial console. 
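The "Adding to iommu group N" lines earlier show how the DMAR unit grouped the PCI devices. A sketch that reproduces the same mapping from sysfs on the running system, assuming the IOMMU is enabled as it is on this machine:

from pathlib import Path

# Walk /sys/kernel/iommu_groups and print the PCI devices in each group,
# mirroring the "Adding to iommu group N" boot messages above.
groups_dir = Path("/sys/kernel/iommu_groups")
for group in sorted(groups_dir.iterdir(), key=lambda p: int(p.name)):
    devices = sorted(d.name for d in (group / "devices").iterdir())
    print(f"group {group.name:>3}: {', '.join(devices)}")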
Sep 4 00:52:28.928772 kernel: Freeing unused kernel image (initmem) memory: 53832K Sep 4 00:52:28.928777 kernel: Write protecting the kernel read-only data: 24576k Sep 4 00:52:28.928783 kernel: Freeing unused kernel image (rodata/data gap) memory: 284K Sep 4 00:52:28.928788 kernel: Run /init as init process Sep 4 00:52:28.928794 kernel: with arguments: Sep 4 00:52:28.928800 kernel: /init Sep 4 00:52:28.928805 kernel: with environment: Sep 4 00:52:28.928811 kernel: HOME=/ Sep 4 00:52:28.928817 kernel: TERM=linux Sep 4 00:52:28.928822 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 4 00:52:28.928829 systemd[1]: Successfully made /usr/ read-only. Sep 4 00:52:28.928836 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 4 00:52:28.928843 systemd[1]: Detected architecture x86-64. Sep 4 00:52:28.928848 systemd[1]: Running in initrd. Sep 4 00:52:28.928855 systemd[1]: No hostname configured, using default hostname. Sep 4 00:52:28.928860 systemd[1]: Hostname set to . Sep 4 00:52:28.928866 systemd[1]: Initializing machine ID from random generator. Sep 4 00:52:28.928872 systemd[1]: Queued start job for default target initrd.target. Sep 4 00:52:28.928878 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 00:52:28.928884 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 00:52:28.928890 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 4 00:52:28.928896 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 4 00:52:28.928903 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 4 00:52:28.928909 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 4 00:52:28.928916 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 4 00:52:28.928922 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 4 00:52:28.928928 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 00:52:28.928934 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 4 00:52:28.928939 systemd[1]: Reached target paths.target - Path Units. Sep 4 00:52:28.928946 systemd[1]: Reached target slices.target - Slice Units. Sep 4 00:52:28.928952 systemd[1]: Reached target swap.target - Swaps. Sep 4 00:52:28.928958 systemd[1]: Reached target timers.target - Timer Units. Sep 4 00:52:28.928964 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 4 00:52:28.928970 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 4 00:52:28.928976 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 4 00:52:28.928982 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 4 00:52:28.928988 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 4 00:52:28.928993 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. 
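The "Expecting device ..." units above correspond to concrete /dev paths: the labeled boot partitions and the dm-verity backed /usr. A small sketch that checks for those paths directly, which can be a quick first step when a boot stalls in the initrd waiting on one of them:

from pathlib import Path

# Device paths taken from the systemd "Expecting device" lines above.
expected = [
    "/dev/disk/by-label/EFI-SYSTEM",
    "/dev/disk/by-label/OEM",
    "/dev/disk/by-label/ROOT",
    "/dev/disk/by-partlabel/USR-A",
    "/dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132",
    "/dev/mapper/usr",
]
for dev in expected:
    print(f"{'present' if Path(dev).exists() else 'missing':>7}  {dev}")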
Sep 4 00:52:28.929000 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 00:52:28.929006 systemd[1]: Reached target sockets.target - Socket Units. Sep 4 00:52:28.929012 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 4 00:52:28.929018 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 4 00:52:28.929024 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 4 00:52:28.929030 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 4 00:52:28.929036 systemd[1]: Starting systemd-fsck-usr.service... Sep 4 00:52:28.929041 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 4 00:52:28.929048 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 4 00:52:28.929066 systemd-journald[297]: Collecting audit messages is disabled. Sep 4 00:52:28.929080 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 00:52:28.929087 systemd-journald[297]: Journal started Sep 4 00:52:28.929101 systemd-journald[297]: Runtime Journal (/run/log/journal/b9bb140f30104bfa8c0d01e447e9b8c3) is 8M, max 640.1M, 632.1M free. Sep 4 00:52:28.922670 systemd-modules-load[299]: Inserted module 'overlay' Sep 4 00:52:28.940855 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 4 00:52:28.978540 systemd[1]: Started systemd-journald.service - Journal Service. Sep 4 00:52:28.978553 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 4 00:52:28.978562 kernel: Bridge firewalling registered Sep 4 00:52:28.946472 systemd-modules-load[299]: Inserted module 'br_netfilter' Sep 4 00:52:28.978612 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 00:52:29.012407 systemd[1]: Finished systemd-fsck-usr.service. Sep 4 00:52:29.035513 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 4 00:52:29.035783 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 00:52:29.056714 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 4 00:52:29.072860 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 4 00:52:29.092969 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 4 00:52:29.114889 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 4 00:52:29.129803 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 4 00:52:29.132243 systemd-tmpfiles[321]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 4 00:52:29.132712 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 4 00:52:29.133394 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 4 00:52:29.134014 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 00:52:29.134972 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 4 00:52:29.138757 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Sep 4 00:52:29.144335 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 4 00:52:29.145041 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 4 00:52:29.158867 systemd-resolved[329]: Positive Trust Anchors: Sep 4 00:52:29.158873 systemd-resolved[329]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 4 00:52:29.158902 systemd-resolved[329]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 4 00:52:29.160891 systemd-resolved[329]: Defaulting to hostname 'linux'. Sep 4 00:52:29.165509 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 4 00:52:29.181447 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 4 00:52:29.312594 dracut-cmdline[343]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=c7fa427551c105672074cbcbe7e23c997f471a6e879d708e8d6cbfad2147666e Sep 4 00:52:29.519155 kernel: SCSI subsystem initialized Sep 4 00:52:29.532115 kernel: Loading iSCSI transport class v2.0-870. Sep 4 00:52:29.544144 kernel: iscsi: registered transport (tcp) Sep 4 00:52:29.568247 kernel: iscsi: registered transport (qla4xxx) Sep 4 00:52:29.568264 kernel: QLogic iSCSI HBA Driver Sep 4 00:52:29.578393 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 4 00:52:29.609204 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 00:52:29.610238 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 4 00:52:29.724861 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 4 00:52:29.728365 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 4 00:52:29.856154 kernel: raid6: avx2x4 gen() 15990 MB/s Sep 4 00:52:29.877144 kernel: raid6: avx2x2 gen() 37639 MB/s Sep 4 00:52:29.903186 kernel: raid6: avx2x1 gen() 45295 MB/s Sep 4 00:52:29.903202 kernel: raid6: using algorithm avx2x1 gen() 45295 MB/s Sep 4 00:52:29.930271 kernel: raid6: .... xor() 24467 MB/s, rmw enabled Sep 4 00:52:29.930287 kernel: raid6: using avx2x2 recovery algorithm Sep 4 00:52:29.951145 kernel: xor: automatically using best checksumming function avx Sep 4 00:52:30.055122 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 4 00:52:30.058061 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 4 00:52:30.059051 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 00:52:30.106125 systemd-udevd[554]: Using default interface naming scheme 'v255'. 
Sep 4 00:52:30.109671 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 00:52:30.116976 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 4 00:52:30.169194 dracut-pre-trigger[565]: rd.md=0: removing MD RAID activation Sep 4 00:52:30.182759 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 4 00:52:30.194335 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 4 00:52:30.323288 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 00:52:30.346216 kernel: cryptd: max_cpu_qlen set to 1000 Sep 4 00:52:30.324682 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 4 00:52:30.372220 kernel: pps_core: LinuxPPS API ver. 1 registered Sep 4 00:52:30.372234 kernel: ACPI: bus type USB registered Sep 4 00:52:30.372242 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Sep 4 00:52:30.372249 kernel: usbcore: registered new interface driver usbfs Sep 4 00:52:30.378118 kernel: AES CTR mode by8 optimization enabled Sep 4 00:52:30.384119 kernel: usbcore: registered new interface driver hub Sep 4 00:52:30.384136 kernel: usbcore: registered new device driver usb Sep 4 00:52:30.396987 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 00:52:30.419270 kernel: sdhci: Secure Digital Host Controller Interface driver Sep 4 00:52:30.419284 kernel: libata version 3.00 loaded. Sep 4 00:52:30.419292 kernel: sdhci: Copyright(c) Pierre Ossman Sep 4 00:52:30.419299 kernel: PTP clock support registered Sep 4 00:52:30.397164 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 00:52:30.493529 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Sep 4 00:52:30.493635 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Sep 4 00:52:30.493711 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Sep 4 00:52:30.493789 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Sep 4 00:52:30.493898 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Sep 4 00:52:30.493989 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Sep 4 00:52:30.494074 kernel: hub 1-0:1.0: USB hub found Sep 4 00:52:30.494170 kernel: hub 1-0:1.0: 16 ports detected Sep 4 00:52:30.494262 kernel: hub 2-0:1.0: USB hub found Sep 4 00:52:30.494370 kernel: hub 2-0:1.0: 10 ports detected Sep 4 00:52:30.489071 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 00:52:30.754100 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Sep 4 00:52:30.754121 kernel: ahci 0000:00:17.0: version 3.0 Sep 4 00:52:30.754214 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. 
Sep 4 00:52:30.754223 kernel: ahci 0000:00:17.0: AHCI vers 0001.0301, 32 command slots, 6 Gbps, SATA mode Sep 4 00:52:30.754294 kernel: ahci 0000:00:17.0: 7/7 ports implemented (port mask 0x7f) Sep 4 00:52:30.754380 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:30.754473 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Sep 4 00:52:30.754565 kernel: igb 0000:03:00.0: added PHC on eth0 Sep 4 00:52:30.754659 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection Sep 4 00:52:30.754750 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:70:d2:5e Sep 4 00:52:30.754838 kernel: scsi host0: ahci Sep 4 00:52:30.754929 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000 Sep 4 00:52:30.755020 kernel: scsi host1: ahci Sep 4 00:52:30.755106 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Sep 4 00:52:30.755204 kernel: scsi host2: ahci Sep 4 00:52:30.755290 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:30.755378 kernel: scsi host3: ahci Sep 4 00:52:30.755463 kernel: scsi host4: ahci Sep 4 00:52:30.755544 kernel: scsi host5: ahci Sep 4 00:52:30.755618 kernel: scsi host6: ahci Sep 4 00:52:30.755677 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 140 lpm-pol 0 Sep 4 00:52:30.755685 kernel: igb 0000:04:00.0: added PHC on eth1 Sep 4 00:52:30.755748 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 140 lpm-pol 0 Sep 4 00:52:30.755756 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Sep 4 00:52:30.755817 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 140 lpm-pol 0 Sep 4 00:52:30.755825 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:70:d2:5f Sep 4 00:52:30.755885 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 140 lpm-pol 0 Sep 4 00:52:30.755893 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000 Sep 4 00:52:30.755954 kernel: ata5: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516300 irq 140 lpm-pol 0 Sep 4 00:52:30.755964 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Sep 4 00:52:30.756024 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:30.756083 kernel: igb 0000:04:00.0 eno2: renamed from eth1 Sep 4 00:52:30.756150 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 140 lpm-pol 0 Sep 4 00:52:30.756158 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 140 lpm-pol 0 Sep 4 00:52:30.756165 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:30.756224 kernel: igb 0000:03:00.0 eno1: renamed from eth0 Sep 4 00:52:30.756283 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Sep 4 00:52:30.494074 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 00:52:30.604691 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 4 00:52:30.778122 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 4 00:52:30.879396 kernel: hub 1-14:1.0: USB hub found Sep 4 00:52:30.879497 kernel: hub 1-14:1.0: 4 ports detected Sep 4 00:52:30.880119 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:30.903609 kernel: mlx5_core 0000:01:00.0: PTM is not supported by PCIe Sep 4 00:52:30.903700 kernel: mlx5_core 0000:01:00.0: firmware version: 14.28.2006 Sep 4 00:52:30.912699 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Sep 4 00:52:31.041141 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Sep 4 00:52:31.041158 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 4 00:52:31.047389 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Sep 4 00:52:31.053140 kernel: ata7: SATA link down (SStatus 0 SControl 300) Sep 4 00:52:31.059175 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 4 00:52:31.065168 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 4 00:52:31.071146 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 4 00:52:31.076169 kernel: ata1.00: Model 'Micron_5200_MTFDDAK480TDN', rev ' D1MU020', applying quirks: zeroaftertrim Sep 4 00:52:31.092697 kernel: ata1.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Sep 4 00:52:31.093144 kernel: ata2.00: Model 'Micron_5200_MTFDDAK480TDN', rev ' D1MU020', applying quirks: zeroaftertrim Sep 4 00:52:31.109899 kernel: ata2.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Sep 4 00:52:31.124611 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Sep 4 00:52:31.124629 kernel: ata1.00: Features: NCQ-prio Sep 4 00:52:31.125182 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Sep 4 00:52:31.135878 kernel: ata2.00: Features: NCQ-prio Sep 4 00:52:31.145154 kernel: ata1.00: configured for UDMA/133 Sep 4 00:52:31.145198 kernel: ata2.00: configured for UDMA/133 Sep 4 00:52:31.145206 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Sep 4 00:52:31.158165 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Sep 4 00:52:31.167116 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Sep 4 00:52:31.167140 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Sep 4 00:52:31.191083 kernel: ata2.00: Enabling discard_zeroes_data Sep 4 00:52:31.191101 kernel: ata1.00: Enabling discard_zeroes_data Sep 4 00:52:31.191109 kernel: sd 1:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Sep 4 00:52:31.195829 kernel: mlx5_core 0000:01:00.0: Port module event: module 0, Cable plugged Sep 4 00:52:31.195946 kernel: sd 0:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Sep 4 00:52:31.203284 kernel: sd 1:0:0:0: [sdb] 4096-byte physical blocks Sep 4 00:52:31.210511 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Sep 4 00:52:31.210595 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 4 00:52:31.217985 kernel: sd 1:0:0:0: [sdb] Write Protect is off Sep 4 00:52:31.223272 kernel: sd 0:0:0:0: [sda] Mode Sense: 00 3a 00 00 Sep 4 00:52:31.223366 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 4 00:52:31.228536 kernel: sd 1:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Sep 4 00:52:31.233327 kernel: sd 0:0:0:0: [sda] Preferred minimum I/O size 4096 bytes Sep 4 00:52:31.238154 kernel: sd 1:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 4 00:52:31.270740 kernel: ata1.00: Enabling 
discard_zeroes_data Sep 4 00:52:31.270756 kernel: sd 1:0:0:0: [sdb] Preferred minimum I/O size 4096 bytes Sep 4 00:52:31.277117 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:31.277212 kernel: ata2.00: Enabling discard_zeroes_data Sep 4 00:52:31.297204 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 4 00:52:31.310669 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 4 00:52:31.310754 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:31.323785 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 4 00:52:31.323805 kernel: GPT:9289727 != 937703087 Sep 4 00:52:31.330060 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 4 00:52:31.333929 kernel: GPT:9289727 != 937703087 Sep 4 00:52:31.339483 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 4 00:52:31.344722 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Sep 4 00:52:31.349758 kernel: sd 1:0:0:0: [sdb] Attached SCSI disk Sep 4 00:52:31.357398 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:31.369163 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:31.369266 kernel: usbcore: registered new interface driver usbhid Sep 4 00:52:31.369275 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:31.374158 kernel: usbhid: USB HID core driver Sep 4 00:52:31.391082 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5200_MTFDDAK480TDN OEM. Sep 4 00:52:31.421341 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Sep 4 00:52:31.407376 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Micron_5200_MTFDDAK480TDN EFI-SYSTEM. Sep 4 00:52:31.426943 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Micron_5200_MTFDDAK480TDN USR-A. Sep 4 00:52:31.517160 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Sep 4 00:52:31.517268 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:31.517339 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Sep 4 00:52:31.517347 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Sep 4 00:52:31.517418 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:31.442472 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Micron_5200_MTFDDAK480TDN USR-A. Sep 4 00:52:31.544676 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Micron_5200_MTFDDAK480TDN ROOT. 
Sep 4 00:52:31.609216 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Sep 4 00:52:31.609309 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:31.609377 kernel: mlx5_core 0000:01:00.1: PTM is not supported by PCIe Sep 4 00:52:31.609444 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:31.609503 kernel: mlx5_core 0000:01:00.1: firmware version: 14.28.2006 Sep 4 00:52:31.609564 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Sep 4 00:52:31.599458 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 4 00:52:31.629356 disk-uuid[768]: Primary Header is updated. Sep 4 00:52:31.629356 disk-uuid[768]: Secondary Entries is updated. Sep 4 00:52:31.629356 disk-uuid[768]: Secondary Header is updated. Sep 4 00:52:31.654149 kernel: ata2.00: Enabling discard_zeroes_data Sep 4 00:52:31.660117 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Sep 4 00:52:31.886119 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Sep 4 00:52:31.897351 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged Sep 4 00:52:32.142166 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Sep 4 00:52:32.142297 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:32.158143 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:32.167118 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth1 Sep 4 00:52:32.167255 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: renamed from eth0 Sep 4 00:52:32.180496 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 4 00:52:32.180985 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 4 00:52:32.217214 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 00:52:32.217331 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 4 00:52:32.236284 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 4 00:52:32.288416 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 4 00:52:32.667663 kernel: ata2.00: Enabling discard_zeroes_data Sep 4 00:52:32.681529 disk-uuid[769]: The operation has completed successfully. Sep 4 00:52:32.689240 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Sep 4 00:52:32.723651 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 4 00:52:32.723700 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 4 00:52:32.753467 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 4 00:52:32.784276 sh[807]: Success Sep 4 00:52:32.813343 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 4 00:52:32.813363 kernel: device-mapper: uevent: version 1.0.3 Sep 4 00:52:32.822589 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 4 00:52:32.835118 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 4 00:52:32.882990 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 4 00:52:32.893391 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 4 00:52:32.918914 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Sep 4 00:52:32.967235 kernel: BTRFS: device fsid 8a9c2e34-3d3c-49a9-acce-59bf90003071 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (819) Sep 4 00:52:32.967252 kernel: BTRFS info (device dm-0): first mount of filesystem 8a9c2e34-3d3c-49a9-acce-59bf90003071 Sep 4 00:52:32.967259 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 4 00:52:32.983521 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 4 00:52:32.983537 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 4 00:52:32.989683 kernel: BTRFS info (device dm-0): enabling free space tree Sep 4 00:52:32.991636 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 4 00:52:32.999460 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 4 00:52:33.024349 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 4 00:52:33.024804 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 4 00:52:33.042043 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 4 00:52:33.101189 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sdb6 (8:22) scanned by mount (842) Sep 4 00:52:33.118523 kernel: BTRFS info (device sdb6): first mount of filesystem 75efd3be-3398-4525-8f67-b36cc847539d Sep 4 00:52:33.118541 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Sep 4 00:52:33.127505 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 4 00:52:33.171230 kernel: BTRFS info (device sdb6): enabling ssd optimizations Sep 4 00:52:33.171257 kernel: BTRFS info (device sdb6): turning on async discard Sep 4 00:52:33.171273 kernel: BTRFS info (device sdb6): enabling free space tree Sep 4 00:52:33.171288 kernel: BTRFS info (device sdb6): last unmount of filesystem 75efd3be-3398-4525-8f67-b36cc847539d Sep 4 00:52:33.142993 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 4 00:52:33.182330 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 4 00:52:33.189377 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 4 00:52:33.198428 systemd-networkd[987]: lo: Link UP Sep 4 00:52:33.198430 systemd-networkd[987]: lo: Gained carrier Sep 4 00:52:33.200730 systemd-networkd[987]: Enumeration completed Sep 4 00:52:33.201315 systemd-networkd[987]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 4 00:52:33.203377 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 4 00:52:33.228638 systemd-networkd[987]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 4 00:52:33.229340 systemd[1]: Reached target network.target - Network. Sep 4 00:52:33.255916 systemd-networkd[987]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 4 00:52:33.304622 ignition[990]: Ignition 2.21.0 Sep 4 00:52:33.304627 ignition[990]: Stage: fetch-offline Sep 4 00:52:33.307449 unknown[990]: fetched base config from "system" Sep 4 00:52:33.304647 ignition[990]: no configs at "/usr/lib/ignition/base.d" Sep 4 00:52:33.307455 unknown[990]: fetched user config from "system" Sep 4 00:52:33.304651 ignition[990]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 4 00:52:33.308674 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Sep 4 00:52:33.304704 ignition[990]: parsed url from cmdline: "" Sep 4 00:52:33.310377 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 4 00:52:33.304706 ignition[990]: no config URL provided Sep 4 00:52:33.310838 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 4 00:52:33.304709 ignition[990]: reading system config file "/usr/lib/ignition/user.ign" Sep 4 00:52:33.415341 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Sep 4 00:52:33.415292 systemd-networkd[987]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 4 00:52:33.304731 ignition[990]: parsing config with SHA512: aeec26293140d6449e4d402084fd2bb871a7ef94bd2c9faa7474229b428e8cf955670eaab3adcd4cd1174c8916b8db565bc10a2e9da7b054cf1fbbefeccb04a6 Sep 4 00:52:33.307658 ignition[990]: fetch-offline: fetch-offline passed Sep 4 00:52:33.307662 ignition[990]: POST message to Packet Timeline Sep 4 00:52:33.307666 ignition[990]: POST Status error: resource requires networking Sep 4 00:52:33.307706 ignition[990]: Ignition finished successfully Sep 4 00:52:33.358163 ignition[1007]: Ignition 2.21.0 Sep 4 00:52:33.358170 ignition[1007]: Stage: kargs Sep 4 00:52:33.358306 ignition[1007]: no configs at "/usr/lib/ignition/base.d" Sep 4 00:52:33.358315 ignition[1007]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 4 00:52:33.360010 ignition[1007]: kargs: kargs passed Sep 4 00:52:33.360019 ignition[1007]: POST message to Packet Timeline Sep 4 00:52:33.360049 ignition[1007]: GET https://metadata.packet.net/metadata: attempt #1 Sep 4 00:52:33.360597 ignition[1007]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:41031->[::1]:53: read: connection refused Sep 4 00:52:33.560867 ignition[1007]: GET https://metadata.packet.net/metadata: attempt #2 Sep 4 00:52:33.561123 ignition[1007]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:34435->[::1]:53: read: connection refused Sep 4 00:52:33.596204 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Sep 4 00:52:33.597092 systemd-networkd[987]: eno1: Link UP Sep 4 00:52:33.597260 systemd-networkd[987]: eno2: Link UP Sep 4 00:52:33.597409 systemd-networkd[987]: enp1s0f0np0: Link UP Sep 4 00:52:33.597580 systemd-networkd[987]: enp1s0f0np0: Gained carrier Sep 4 00:52:33.610449 systemd-networkd[987]: enp1s0f1np1: Link UP Sep 4 00:52:33.611151 systemd-networkd[987]: enp1s0f1np1: Gained carrier Sep 4 00:52:33.642327 systemd-networkd[987]: enp1s0f0np0: DHCPv4 address 147.28.180.77/31, gateway 147.28.180.76 acquired from 145.40.83.140 Sep 4 00:52:33.962221 ignition[1007]: GET https://metadata.packet.net/metadata: attempt #3 Sep 4 00:52:33.963381 ignition[1007]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:57853->[::1]:53: read: connection refused Sep 4 00:52:34.756451 systemd-networkd[987]: enp1s0f0np0: Gained IPv6LL Sep 4 00:52:34.757159 systemd-networkd[987]: enp1s0f1np1: Gained IPv6LL Sep 4 00:52:34.763463 ignition[1007]: GET https://metadata.packet.net/metadata: attempt #4 Sep 4 00:52:34.764563 ignition[1007]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:45547->[::1]:53: read: connection refused Sep 4 00:52:36.365335 ignition[1007]: GET 
https://metadata.packet.net/metadata: attempt #5 Sep 4 00:52:36.366507 ignition[1007]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:37119->[::1]:53: read: connection refused Sep 4 00:52:39.569856 ignition[1007]: GET https://metadata.packet.net/metadata: attempt #6 Sep 4 00:52:40.599077 ignition[1007]: GET result: OK Sep 4 00:52:41.398673 ignition[1007]: Ignition finished successfully Sep 4 00:52:41.404799 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 4 00:52:41.414991 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 4 00:52:41.466465 ignition[1027]: Ignition 2.21.0 Sep 4 00:52:41.466470 ignition[1027]: Stage: disks Sep 4 00:52:41.466553 ignition[1027]: no configs at "/usr/lib/ignition/base.d" Sep 4 00:52:41.466558 ignition[1027]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 4 00:52:41.467718 ignition[1027]: disks: disks passed Sep 4 00:52:41.467725 ignition[1027]: POST message to Packet Timeline Sep 4 00:52:41.467769 ignition[1027]: GET https://metadata.packet.net/metadata: attempt #1 Sep 4 00:52:42.445170 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:42.506201 ignition[1027]: GET result: OK Sep 4 00:52:42.928919 ignition[1027]: Ignition finished successfully Sep 4 00:52:42.933657 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 4 00:52:42.945322 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 4 00:52:42.953669 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 4 00:52:42.970690 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 4 00:52:42.999510 systemd[1]: Reached target sysinit.target - System Initialization. Sep 4 00:52:43.008669 systemd[1]: Reached target basic.target - Basic System. Sep 4 00:52:43.035209 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 4 00:52:43.082044 systemd-fsck[1051]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 4 00:52:43.090596 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 4 00:52:43.091380 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 4 00:52:43.209846 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 4 00:52:43.224335 kernel: EXT4-fs (sdb9): mounted filesystem c3518c93-f823-4477-a620-ff9666a59be5 r/w with ordered data mode. Quota mode: none. Sep 4 00:52:43.210150 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 4 00:52:43.225524 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 4 00:52:43.243326 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 4 00:52:43.262798 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 4 00:52:43.293685 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sdb6 (8:22) scanned by mount (1059) Sep 4 00:52:43.293702 kernel: BTRFS info (device sdb6): first mount of filesystem 75efd3be-3398-4525-8f67-b36cc847539d Sep 4 00:52:43.294477 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent... 
Sep 4 00:52:43.332521 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Sep 4 00:52:43.332536 kernel: BTRFS info (device sdb6): enabling ssd optimizations Sep 4 00:52:43.332544 kernel: BTRFS info (device sdb6): turning on async discard Sep 4 00:52:43.332551 kernel: BTRFS info (device sdb6): enabling free space tree Sep 4 00:52:43.332520 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 4 00:52:43.332539 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 4 00:52:43.394352 coreos-metadata[1061]: Sep 04 00:52:43.386 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 4 00:52:43.352210 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 4 00:52:43.414428 coreos-metadata[1062]: Sep 04 00:52:43.386 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 4 00:52:43.377396 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 4 00:52:43.403311 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 4 00:52:43.471165 initrd-setup-root[1091]: cut: /sysroot/etc/passwd: No such file or directory Sep 4 00:52:43.481275 initrd-setup-root[1098]: cut: /sysroot/etc/group: No such file or directory Sep 4 00:52:43.491190 initrd-setup-root[1105]: cut: /sysroot/etc/shadow: No such file or directory Sep 4 00:52:43.500184 initrd-setup-root[1112]: cut: /sysroot/etc/gshadow: No such file or directory Sep 4 00:52:43.538965 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 4 00:52:43.539768 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 4 00:52:43.556923 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 4 00:52:43.602009 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 4 00:52:43.618242 kernel: BTRFS info (device sdb6): last unmount of filesystem 75efd3be-3398-4525-8f67-b36cc847539d Sep 4 00:52:43.626058 ignition[1180]: INFO : Ignition 2.21.0 Sep 4 00:52:43.626058 ignition[1180]: INFO : Stage: mount Sep 4 00:52:43.645397 ignition[1180]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 00:52:43.645397 ignition[1180]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 4 00:52:43.645397 ignition[1180]: INFO : mount: mount passed Sep 4 00:52:43.645397 ignition[1180]: INFO : POST message to Packet Timeline Sep 4 00:52:43.645397 ignition[1180]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 4 00:52:43.626607 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 4 00:52:44.351180 coreos-metadata[1062]: Sep 04 00:52:44.351 INFO Fetch successful Sep 4 00:52:44.385405 systemd[1]: flatcar-static-network.service: Deactivated successfully. Sep 4 00:52:44.385476 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent. Sep 4 00:52:44.452176 coreos-metadata[1061]: Sep 04 00:52:44.452 INFO Fetch successful Sep 4 00:52:44.527334 coreos-metadata[1061]: Sep 04 00:52:44.527 INFO wrote hostname ci-4372.1.0-n-fd36784ab7 to /sysroot/etc/hostname Sep 4 00:52:44.528781 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 4 00:52:44.618530 ignition[1180]: INFO : GET result: OK Sep 4 00:52:45.065093 ignition[1180]: INFO : Ignition finished successfully Sep 4 00:52:45.069325 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
Sep 4 00:52:45.087515 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 4 00:52:45.122022 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 4 00:52:45.166145 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sdb6 (8:22) scanned by mount (1206) Sep 4 00:52:45.183489 kernel: BTRFS info (device sdb6): first mount of filesystem 75efd3be-3398-4525-8f67-b36cc847539d Sep 4 00:52:45.183505 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Sep 4 00:52:45.198775 kernel: BTRFS info (device sdb6): enabling ssd optimizations Sep 4 00:52:45.198792 kernel: BTRFS info (device sdb6): turning on async discard Sep 4 00:52:45.204888 kernel: BTRFS info (device sdb6): enabling free space tree Sep 4 00:52:45.206671 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 4 00:52:45.248172 ignition[1223]: INFO : Ignition 2.21.0 Sep 4 00:52:45.248172 ignition[1223]: INFO : Stage: files Sep 4 00:52:45.261373 ignition[1223]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 00:52:45.261373 ignition[1223]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 4 00:52:45.261373 ignition[1223]: DEBUG : files: compiled without relabeling support, skipping Sep 4 00:52:45.261373 ignition[1223]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 4 00:52:45.261373 ignition[1223]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 4 00:52:45.261373 ignition[1223]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 4 00:52:45.261373 ignition[1223]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 4 00:52:45.261373 ignition[1223]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 4 00:52:45.261373 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 4 00:52:45.261373 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Sep 4 00:52:45.252978 unknown[1223]: wrote ssh authorized keys file for user: core Sep 4 00:52:45.386377 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 4 00:52:45.594381 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 4 00:52:45.594381 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file 
"/sysroot/home/core/nfs-pvc.yaml" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 4 00:52:45.626343 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Sep 4 00:52:46.281501 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 4 00:52:46.934744 ignition[1223]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 4 00:52:46.934744 ignition[1223]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 4 00:52:46.962420 ignition[1223]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 4 00:52:46.962420 ignition[1223]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 4 00:52:46.962420 ignition[1223]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 4 00:52:46.962420 ignition[1223]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 4 00:52:46.962420 ignition[1223]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 4 00:52:46.962420 ignition[1223]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 4 00:52:46.962420 ignition[1223]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 4 00:52:46.962420 ignition[1223]: INFO : files: files passed Sep 4 00:52:46.962420 ignition[1223]: INFO : POST message to Packet Timeline Sep 4 00:52:46.962420 ignition[1223]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 4 00:52:47.977947 ignition[1223]: INFO : GET result: OK Sep 4 00:52:48.426593 ignition[1223]: INFO : Ignition finished successfully Sep 4 00:52:48.430617 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 4 00:52:48.447537 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 4 00:52:48.453751 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 4 00:52:48.485871 systemd[1]: ignition-quench.service: Deactivated successfully. 
Sep 4 00:52:48.485943 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 4 00:52:48.509626 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 4 00:52:48.531390 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 4 00:52:48.551743 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 4 00:52:48.579210 initrd-setup-root-after-ignition[1263]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 00:52:48.579210 initrd-setup-root-after-ignition[1263]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 4 00:52:48.608263 initrd-setup-root-after-ignition[1267]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 00:52:48.639788 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 4 00:52:48.639836 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 4 00:52:48.657474 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 4 00:52:48.668350 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 4 00:52:48.694438 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 4 00:52:48.696215 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 4 00:52:48.777292 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 4 00:52:48.792207 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 4 00:52:48.852189 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 4 00:52:48.863362 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 00:52:48.883439 systemd[1]: Stopped target timers.target - Timer Units. Sep 4 00:52:48.901512 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 4 00:52:48.901686 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 4 00:52:48.926819 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 4 00:52:48.945643 systemd[1]: Stopped target basic.target - Basic System. Sep 4 00:52:48.962846 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 4 00:52:48.979724 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 4 00:52:48.998710 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 4 00:52:49.017717 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 4 00:52:49.036734 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 4 00:52:49.054727 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 4 00:52:49.074789 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 4 00:52:49.093746 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 4 00:52:49.111721 systemd[1]: Stopped target swap.target - Swaps. Sep 4 00:52:49.128640 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 4 00:52:49.129037 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 4 00:52:49.153763 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 4 00:52:49.172748 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Sep 4 00:52:49.191600 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 4 00:52:49.192058 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 00:52:49.211614 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 4 00:52:49.212013 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 4 00:52:49.242739 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 4 00:52:49.243204 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 4 00:52:49.261922 systemd[1]: Stopped target paths.target - Path Units. Sep 4 00:52:49.278647 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 4 00:52:49.279093 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 00:52:49.297737 systemd[1]: Stopped target slices.target - Slice Units. Sep 4 00:52:49.314714 systemd[1]: Stopped target sockets.target - Socket Units. Sep 4 00:52:49.331699 systemd[1]: iscsid.socket: Deactivated successfully. Sep 4 00:52:49.331997 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 4 00:52:49.349781 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 4 00:52:49.350064 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 4 00:52:49.370975 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 4 00:52:49.371421 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 4 00:52:49.387812 systemd[1]: ignition-files.service: Deactivated successfully. Sep 4 00:52:49.388221 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 4 00:52:49.510224 ignition[1288]: INFO : Ignition 2.21.0 Sep 4 00:52:49.510224 ignition[1288]: INFO : Stage: umount Sep 4 00:52:49.510224 ignition[1288]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 00:52:49.510224 ignition[1288]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 4 00:52:49.510224 ignition[1288]: INFO : umount: umount passed Sep 4 00:52:49.510224 ignition[1288]: INFO : POST message to Packet Timeline Sep 4 00:52:49.510224 ignition[1288]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 4 00:52:49.403716 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 4 00:52:49.404083 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 4 00:52:49.423197 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 4 00:52:49.435747 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 4 00:52:49.442441 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 4 00:52:49.442520 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 00:52:49.474375 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 4 00:52:49.474448 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 4 00:52:49.503548 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 4 00:52:49.504283 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 4 00:52:49.504354 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 4 00:52:49.521438 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 4 00:52:49.521529 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
Sep 4 00:52:50.464716 ignition[1288]: INFO : GET result: OK Sep 4 00:52:50.896668 ignition[1288]: INFO : Ignition finished successfully Sep 4 00:52:50.900615 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 4 00:52:50.900901 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 4 00:52:50.914319 systemd[1]: Stopped target network.target - Network. Sep 4 00:52:50.920619 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 4 00:52:50.920787 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 4 00:52:50.943499 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 4 00:52:50.943657 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 4 00:52:50.950643 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 4 00:52:50.950798 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 4 00:52:50.976661 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 4 00:52:50.976827 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 4 00:52:50.993513 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 4 00:52:50.993700 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 4 00:52:51.009860 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 4 00:52:51.027566 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 4 00:52:51.035323 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 4 00:52:51.035600 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 4 00:52:51.064638 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 4 00:52:51.065391 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 4 00:52:51.065671 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 4 00:52:51.071980 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 4 00:52:51.073942 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 4 00:52:51.085686 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 4 00:52:51.085808 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 4 00:52:51.104859 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 4 00:52:51.119454 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 4 00:52:51.119483 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 4 00:52:51.154274 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 4 00:52:51.154349 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 4 00:52:51.163657 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 4 00:52:51.163722 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 4 00:52:51.189503 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 4 00:52:51.189669 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 00:52:51.209794 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 00:52:51.231693 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 4 00:52:51.231893 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. 
Sep 4 00:52:51.232892 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 4 00:52:51.233274 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 00:52:51.241208 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 4 00:52:51.241349 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 4 00:52:51.257636 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 4 00:52:51.257739 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 00:52:51.283540 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 4 00:52:51.283682 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 4 00:52:51.318343 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 4 00:52:51.318576 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 4 00:52:51.355318 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 4 00:52:51.355687 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 4 00:52:51.392620 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 4 00:52:51.399522 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 4 00:52:51.665286 systemd-journald[297]: Received SIGTERM from PID 1 (systemd). Sep 4 00:52:51.399688 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 00:52:51.418853 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 4 00:52:51.418993 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 4 00:52:51.459681 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 4 00:52:51.459892 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 4 00:52:51.470832 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 4 00:52:51.470970 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 00:52:51.508329 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 00:52:51.508355 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 00:52:51.529068 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 4 00:52:51.529251 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 4 00:52:51.529370 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 4 00:52:51.529573 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 4 00:52:51.530733 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 4 00:52:51.531081 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 4 00:52:51.543991 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 4 00:52:51.544283 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 4 00:52:51.564743 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 4 00:52:51.575666 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 4 00:52:51.620433 systemd[1]: Switching root. 
Sep 4 00:52:51.792400 systemd-journald[297]: Journal stopped Sep 4 00:52:53.486157 kernel: SELinux: policy capability network_peer_controls=1 Sep 4 00:52:53.486194 kernel: SELinux: policy capability open_perms=1 Sep 4 00:52:53.486202 kernel: SELinux: policy capability extended_socket_class=1 Sep 4 00:52:53.486207 kernel: SELinux: policy capability always_check_network=0 Sep 4 00:52:53.486212 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 4 00:52:53.486217 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 4 00:52:53.486223 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 4 00:52:53.486229 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 4 00:52:53.486234 kernel: SELinux: policy capability userspace_initial_context=0 Sep 4 00:52:53.486240 kernel: audit: type=1403 audit(1756947171.911:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 4 00:52:53.486247 systemd[1]: Successfully loaded SELinux policy in 86.093ms. Sep 4 00:52:53.486254 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.314ms. Sep 4 00:52:53.486260 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 4 00:52:53.486266 systemd[1]: Detected architecture x86-64. Sep 4 00:52:53.486274 systemd[1]: Detected first boot. Sep 4 00:52:53.486280 systemd[1]: Hostname set to . Sep 4 00:52:53.486286 systemd[1]: Initializing machine ID from random generator. Sep 4 00:52:53.486292 zram_generator::config[1343]: No configuration found. Sep 4 00:52:53.486299 systemd[1]: Populated /etc with preset unit settings. Sep 4 00:52:53.486305 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 4 00:52:53.486312 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 4 00:52:53.486319 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 4 00:52:53.486325 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 4 00:52:53.486331 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 4 00:52:53.486337 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 4 00:52:53.486343 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 4 00:52:53.486349 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 4 00:52:53.486357 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 4 00:52:53.486363 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 4 00:52:53.486369 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 4 00:52:53.486376 systemd[1]: Created slice user.slice - User and Session Slice. Sep 4 00:52:53.486383 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 00:52:53.486390 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 00:52:53.486397 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 4 00:52:53.486403 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
Sep 4 00:52:53.486410 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 4 00:52:53.486417 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 4 00:52:53.486423 systemd[1]: Expecting device dev-ttyS1.device - /dev/ttyS1... Sep 4 00:52:53.486429 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 00:52:53.486436 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 4 00:52:53.486444 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 4 00:52:53.486450 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 4 00:52:53.486456 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 4 00:52:53.486464 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 4 00:52:53.486470 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 00:52:53.486477 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 4 00:52:53.486483 systemd[1]: Reached target slices.target - Slice Units. Sep 4 00:52:53.486489 systemd[1]: Reached target swap.target - Swaps. Sep 4 00:52:53.486496 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 4 00:52:53.486502 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 4 00:52:53.486508 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 4 00:52:53.486518 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 4 00:52:53.486524 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 4 00:52:53.486531 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 00:52:53.486537 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 4 00:52:53.486544 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 4 00:52:53.486551 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 4 00:52:53.486558 systemd[1]: Mounting media.mount - External Media Directory... Sep 4 00:52:53.486564 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 00:52:53.486571 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 4 00:52:53.486577 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 4 00:52:53.486586 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 4 00:52:53.486593 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 4 00:52:53.486599 systemd[1]: Reached target machines.target - Containers. Sep 4 00:52:53.486607 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 4 00:52:53.486613 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 4 00:52:53.486620 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 4 00:52:53.486626 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 4 00:52:53.486633 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Sep 4 00:52:53.486639 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 4 00:52:53.486646 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 4 00:52:53.486652 kernel: ACPI: bus type drm_connector registered Sep 4 00:52:53.486658 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 4 00:52:53.486665 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 4 00:52:53.486672 kernel: fuse: init (API version 7.41) Sep 4 00:52:53.486677 kernel: loop: module loaded Sep 4 00:52:53.486684 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 4 00:52:53.486690 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 4 00:52:53.486697 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 4 00:52:53.486703 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 4 00:52:53.486709 systemd[1]: Stopped systemd-fsck-usr.service. Sep 4 00:52:53.486717 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 00:52:53.486724 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 4 00:52:53.486730 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 4 00:52:53.486737 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 4 00:52:53.486752 systemd-journald[1446]: Collecting audit messages is disabled. Sep 4 00:52:53.486769 systemd-journald[1446]: Journal started Sep 4 00:52:53.486783 systemd-journald[1446]: Runtime Journal (/run/log/journal/8067215602814bde96cecc954327c492) is 8M, max 640.1M, 632.1M free. Sep 4 00:52:52.344268 systemd[1]: Queued start job for default target multi-user.target. Sep 4 00:52:52.355030 systemd[1]: Unnecessary job was removed for dev-sdb6.device - /dev/sdb6. Sep 4 00:52:52.355300 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 4 00:52:53.507196 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 4 00:52:53.528163 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 4 00:52:53.548189 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 4 00:52:53.569296 systemd[1]: verity-setup.service: Deactivated successfully. Sep 4 00:52:53.569319 systemd[1]: Stopped verity-setup.service. Sep 4 00:52:53.594176 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 00:52:53.602148 systemd[1]: Started systemd-journald.service - Journal Service. Sep 4 00:52:53.610587 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 4 00:52:53.620432 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 4 00:52:53.630406 systemd[1]: Mounted media.mount - External Media Directory. Sep 4 00:52:53.639388 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 4 00:52:53.648395 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 4 00:52:53.657365 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. 
Sep 4 00:52:53.666476 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 4 00:52:53.676511 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 00:52:53.686558 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 4 00:52:53.686764 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 4 00:52:53.696807 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 00:52:53.697086 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 00:52:53.707975 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 4 00:52:53.708445 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 4 00:52:53.717970 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 4 00:52:53.718659 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 4 00:52:53.728982 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 4 00:52:53.729464 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 4 00:52:53.739030 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 4 00:52:53.739526 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 4 00:52:53.750093 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 4 00:52:53.760039 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 00:52:53.771051 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 4 00:52:53.782190 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 4 00:52:53.793057 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 00:52:53.825714 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 4 00:52:53.837519 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 4 00:52:53.859487 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 4 00:52:53.868333 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 4 00:52:53.868353 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 4 00:52:53.878003 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 4 00:52:53.889401 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 4 00:52:53.898711 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 4 00:52:53.914846 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 4 00:52:53.932545 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 4 00:52:53.942275 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 4 00:52:53.943009 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 4 00:52:53.945499 systemd-journald[1446]: Time spent on flushing to /var/log/journal/8067215602814bde96cecc954327c492 is 12.971ms for 1415 entries. Sep 4 00:52:53.945499 systemd-journald[1446]: System Journal (/var/log/journal/8067215602814bde96cecc954327c492) is 8M, max 195.6M, 187.6M free. 
Sep 4 00:52:53.976641 systemd-journald[1446]: Received client request to flush runtime journal. Sep 4 00:52:53.960251 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 4 00:52:53.977672 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 4 00:52:53.992373 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 4 00:52:54.009358 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 4 00:52:54.021168 kernel: loop0: detected capacity change from 0 to 146240 Sep 4 00:52:54.026430 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 4 00:52:54.036282 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 4 00:52:54.046848 systemd-tmpfiles[1485]: ACLs are not supported, ignoring. Sep 4 00:52:54.046877 systemd-tmpfiles[1485]: ACLs are not supported, ignoring. Sep 4 00:52:54.051119 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 4 00:52:54.053051 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 4 00:52:54.063359 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 4 00:52:54.073356 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 4 00:52:54.084153 kernel: loop1: detected capacity change from 0 to 8 Sep 4 00:52:54.087355 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 4 00:52:54.098671 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 4 00:52:54.108965 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 4 00:52:54.121149 kernel: loop2: detected capacity change from 0 to 113872 Sep 4 00:52:54.133364 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 4 00:52:54.143541 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 4 00:52:54.162282 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 4 00:52:54.178151 kernel: loop3: detected capacity change from 0 to 221472 Sep 4 00:52:54.186444 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 4 00:52:54.195949 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 4 00:52:54.228616 systemd-tmpfiles[1503]: ACLs are not supported, ignoring. Sep 4 00:52:54.228627 systemd-tmpfiles[1503]: ACLs are not supported, ignoring. Sep 4 00:52:54.230947 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 4 00:52:54.232164 kernel: loop4: detected capacity change from 0 to 146240 Sep 4 00:52:54.263642 kernel: loop5: detected capacity change from 0 to 8 Sep 4 00:52:54.263686 kernel: loop6: detected capacity change from 0 to 113872 Sep 4 00:52:54.280152 kernel: loop7: detected capacity change from 0 to 221472 Sep 4 00:52:54.298826 (sd-merge)[1506]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'. Sep 4 00:52:54.299089 (sd-merge)[1506]: Merged extensions into '/usr'. Sep 4 00:52:54.302006 systemd[1]: Reload requested from client PID 1482 ('systemd-sysext') (unit systemd-sysext.service)... Sep 4 00:52:54.302014 systemd[1]: Reloading... 
Sep 4 00:52:54.312950 ldconfig[1476]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 4 00:52:54.327125 zram_generator::config[1532]: No configuration found. Sep 4 00:52:54.385783 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 4 00:52:54.446633 systemd[1]: Reloading finished in 144 ms. Sep 4 00:52:54.473054 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 4 00:52:54.482758 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 4 00:52:54.493701 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 4 00:52:54.528106 systemd[1]: Starting ensure-sysext.service... Sep 4 00:52:54.535928 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 4 00:52:54.548069 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 00:52:54.555510 systemd-tmpfiles[1591]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 4 00:52:54.555530 systemd-tmpfiles[1591]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 4 00:52:54.555695 systemd-tmpfiles[1591]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 4 00:52:54.555865 systemd-tmpfiles[1591]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 4 00:52:54.556402 systemd-tmpfiles[1591]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 4 00:52:54.556587 systemd-tmpfiles[1591]: ACLs are not supported, ignoring. Sep 4 00:52:54.556626 systemd-tmpfiles[1591]: ACLs are not supported, ignoring. Sep 4 00:52:54.559122 systemd-tmpfiles[1591]: Detected autofs mount point /boot during canonicalization of boot. Sep 4 00:52:54.559127 systemd-tmpfiles[1591]: Skipping /boot Sep 4 00:52:54.562878 systemd[1]: Reload requested from client PID 1590 ('systemctl') (unit ensure-sysext.service)... Sep 4 00:52:54.562888 systemd[1]: Reloading... Sep 4 00:52:54.565713 systemd-tmpfiles[1591]: Detected autofs mount point /boot during canonicalization of boot. Sep 4 00:52:54.565719 systemd-tmpfiles[1591]: Skipping /boot Sep 4 00:52:54.577340 systemd-udevd[1592]: Using default interface naming scheme 'v255'. Sep 4 00:52:54.591192 zram_generator::config[1619]: No configuration found. Sep 4 00:52:54.651125 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.653582 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2 Sep 4 00:52:54.659117 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface Sep 4 00:52:54.671980 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface Sep 4 00:52:54.674095 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.675305 kernel: mousedev: PS/2 mouse device common for all mice Sep 4 00:52:54.675322 kernel: ACPI: button: Sleep Button [SLPB] Sep 4 00:52:54.665173 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Sep 4 00:52:54.683121 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.683297 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 4 00:52:54.702130 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.712132 kernel: ACPI: button: Power Button [PWRF] Sep 4 00:52:54.722121 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.730119 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.753226 kernel: IPMI message handler: version 39.2 Sep 4 00:52:54.776742 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set Sep 4 00:52:54.777448 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt Sep 4 00:52:54.777643 kernel: ipmi device interface Sep 4 00:52:54.777669 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.796149 kernel: iTCO_vendor_support: vendor-support=0 Sep 4 00:52:54.796195 kernel: MACsec IEEE 802.1AE Sep 4 00:52:54.799117 kernel: ipmi_si: IPMI System Interface driver Sep 4 00:52:54.799732 systemd[1]: Condition check resulted in dev-ttyS1.device - /dev/ttyS1 being skipped. Sep 4 00:52:54.800220 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5200_MTFDDAK480TDN OEM. Sep 4 00:52:54.810297 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS Sep 4 00:52:54.818927 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0 Sep 4 00:52:54.826209 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine Sep 4 00:52:54.833579 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI Sep 4 00:52:54.833733 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.833828 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0 Sep 4 00:52:54.858930 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI Sep 4 00:52:54.866011 kernel: ipmi_si: Adding ACPI-specified kcs state machine Sep 4 00:52:54.874731 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.874830 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Sep 4 00:52:54.886289 systemd[1]: Reloading finished in 323 ms. Sep 4 00:52:54.907407 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400) Sep 4 00:52:54.907635 kernel: iTCO_wdt iTCO_wdt: initialized. heartbeat=30 sec (nowayout=0) Sep 4 00:52:54.908120 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.919830 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 00:52:54.950514 kernel: intel_rapl_common: Found RAPL domain package Sep 4 00:52:54.950562 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.950692 kernel: intel_rapl_common: Found RAPL domain core Sep 4 00:52:54.961500 kernel: intel_rapl_common: Found RAPL domain dram Sep 4 00:52:54.961522 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. Sep 4 00:52:54.973117 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.975305 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Sep 4 00:52:54.982119 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:54.990115 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20) Sep 4 00:52:55.008156 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:55.026120 systemd[1]: Finished ensure-sysext.service. Sep 4 00:52:55.051962 systemd[1]: Reached target tpm2.target - Trusted Platform Module. Sep 4 00:52:55.060208 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 00:52:55.060905 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 4 00:52:55.077117 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Sep 4 00:52:55.077533 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 4 00:52:55.086118 kernel: ipmi_ssif: IPMI SSIF Interface driver Sep 4 00:52:55.092317 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 4 00:52:55.097000 augenrules[1813]: No rules Sep 4 00:52:55.109655 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 4 00:52:55.118762 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 4 00:52:55.127738 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 4 00:52:55.137699 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 4 00:52:55.146231 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 4 00:52:55.146747 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 4 00:52:55.156186 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 00:52:55.156757 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 4 00:52:55.167092 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 4 00:52:55.168053 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 4 00:52:55.168905 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 4 00:52:55.191880 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 4 00:52:55.209422 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 00:52:55.218222 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 00:52:55.219197 systemd[1]: audit-rules.service: Deactivated successfully. Sep 4 00:52:55.227341 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 4 00:52:55.240141 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 4 00:52:55.250837 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 00:52:55.251296 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 00:52:55.252082 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 4 00:52:55.252497 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
Sep 4 00:52:55.253545 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 4 00:52:55.253979 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 4 00:52:55.254677 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 4 00:52:55.255154 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 4 00:52:55.255897 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 4 00:52:55.256273 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 4 00:52:55.260373 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 4 00:52:55.260459 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 4 00:52:55.261240 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 4 00:52:55.262093 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 4 00:52:55.262221 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 4 00:52:55.262427 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 4 00:52:55.277677 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 4 00:52:55.294590 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 4 00:52:55.337138 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 4 00:52:55.339844 systemd-resolved[1826]: Positive Trust Anchors: Sep 4 00:52:55.339850 systemd-resolved[1826]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 4 00:52:55.339876 systemd-resolved[1826]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 4 00:52:55.342372 systemd-resolved[1826]: Using system hostname 'ci-4372.1.0-n-fd36784ab7'. Sep 4 00:52:55.344781 systemd-networkd[1825]: lo: Link UP Sep 4 00:52:55.344784 systemd-networkd[1825]: lo: Gained carrier Sep 4 00:52:55.347276 systemd-networkd[1825]: bond0: netdev ready Sep 4 00:52:55.348318 systemd-networkd[1825]: Enumeration completed Sep 4 00:52:55.352922 systemd-networkd[1825]: enp1s0f0np0: Configuring with /etc/systemd/network/10-0c:42:a1:97:f8:2c.network. Sep 4 00:52:55.354538 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 4 00:52:55.363191 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 4 00:52:55.373332 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 00:52:55.384273 systemd[1]: Reached target network.target - Network. Sep 4 00:52:55.391208 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Sep 4 00:52:55.401218 systemd[1]: Reached target sysinit.target - System Initialization. Sep 4 00:52:55.410329 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 4 00:52:55.420302 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 4 00:52:55.430322 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 4 00:52:55.440394 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 4 00:52:55.450346 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 4 00:52:55.450452 systemd[1]: Reached target paths.target - Path Units. Sep 4 00:52:55.459366 systemd[1]: Reached target time-set.target - System Time Set. Sep 4 00:52:55.468780 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 4 00:52:55.477607 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 4 00:52:55.487189 systemd[1]: Reached target timers.target - Timer Units. Sep 4 00:52:55.494932 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 4 00:52:55.505908 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 4 00:52:55.514365 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 4 00:52:55.525294 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 4 00:52:55.534406 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 4 00:52:55.545071 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 4 00:52:55.556048 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 4 00:52:55.566713 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 4 00:52:55.576422 systemd[1]: Reached target sockets.target - Socket Units. Sep 4 00:52:55.585621 systemd[1]: Reached target basic.target - Basic System. Sep 4 00:52:55.593320 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 4 00:52:55.593341 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 4 00:52:55.594057 systemd[1]: Starting containerd.service - containerd container runtime... Sep 4 00:52:55.603301 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 4 00:52:55.613127 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 4 00:52:55.626159 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Sep 4 00:52:55.626164 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 4 00:52:55.639163 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link Sep 4 00:52:55.640522 systemd-networkd[1825]: enp1s0f1np1: Configuring with /etc/systemd/network/10-0c:42:a1:97:f8:2d.network. Sep 4 00:52:55.641554 coreos-metadata[1865]: Sep 04 00:52:55.641 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 4 00:52:55.643333 coreos-metadata[1865]: Sep 04 00:52:55.643 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata) Sep 4 00:52:55.652468 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... 
Sep 4 00:52:55.675646 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 4 00:52:55.680740 jq[1871]: false Sep 4 00:52:55.684217 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 4 00:52:55.684855 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 4 00:52:55.688978 extend-filesystems[1872]: Found /dev/sdb6 Sep 4 00:52:55.705719 extend-filesystems[1872]: Found /dev/sdb9 Sep 4 00:52:55.705719 extend-filesystems[1872]: Checking size of /dev/sdb9 Sep 4 00:52:55.705719 extend-filesystems[1872]: Resized partition /dev/sdb9 Sep 4 00:52:55.718347 kernel: EXT4-fs (sdb9): resizing filesystem from 553472 to 116605649 blocks Sep 4 00:52:55.696832 oslogin_cache_refresh[1873]: Refreshing passwd entry cache Sep 4 00:52:55.693870 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 4 00:52:55.718516 extend-filesystems[1884]: resize2fs 1.47.2 (1-Jan-2025) Sep 4 00:52:55.738374 google_oslogin_nss_cache[1873]: oslogin_cache_refresh[1873]: Refreshing passwd entry cache Sep 4 00:52:55.738374 google_oslogin_nss_cache[1873]: oslogin_cache_refresh[1873]: Failure getting users, quitting Sep 4 00:52:55.738374 google_oslogin_nss_cache[1873]: oslogin_cache_refresh[1873]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 4 00:52:55.738374 google_oslogin_nss_cache[1873]: oslogin_cache_refresh[1873]: Refreshing group entry cache Sep 4 00:52:55.738374 google_oslogin_nss_cache[1873]: oslogin_cache_refresh[1873]: Failure getting groups, quitting Sep 4 00:52:55.738374 google_oslogin_nss_cache[1873]: oslogin_cache_refresh[1873]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 4 00:52:55.697971 oslogin_cache_refresh[1873]: Failure getting users, quitting Sep 4 00:52:55.706858 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 4 00:52:55.697977 oslogin_cache_refresh[1873]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 4 00:52:55.718995 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 4 00:52:55.697994 oslogin_cache_refresh[1873]: Refreshing group entry cache Sep 4 00:52:55.698312 oslogin_cache_refresh[1873]: Failure getting groups, quitting Sep 4 00:52:55.698316 oslogin_cache_refresh[1873]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 4 00:52:55.750807 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 4 00:52:55.762672 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 4 00:52:55.772231 systemd[1]: Starting tcsd.service - TCG Core Services Daemon... Sep 4 00:52:55.785119 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Sep 4 00:52:55.790304 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 4 00:52:55.790659 systemd[1]: Starting update-engine.service - Update Engine... Sep 4 00:52:55.795836 systemd-networkd[1825]: bond0: Configuring with /etc/systemd/network/05-bond0.network. Sep 4 00:52:55.796117 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link Sep 4 00:52:55.796350 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... 
Sep 4 00:52:55.796966 systemd-networkd[1825]: enp1s0f0np0: Link UP Sep 4 00:52:55.797121 systemd-networkd[1825]: enp1s0f0np0: Gained carrier Sep 4 00:52:55.806377 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Sep 4 00:52:55.812281 systemd-networkd[1825]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-0c:42:a1:97:f8:2c.network. Sep 4 00:52:55.812430 systemd-networkd[1825]: enp1s0f1np1: Link UP Sep 4 00:52:55.812564 systemd-networkd[1825]: enp1s0f1np1: Gained carrier Sep 4 00:52:55.822253 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 4 00:52:55.823941 jq[1904]: true Sep 4 00:52:55.824223 systemd-networkd[1825]: bond0: Link UP Sep 4 00:52:55.824367 systemd-networkd[1825]: bond0: Gained carrier Sep 4 00:52:55.824462 systemd-timesyncd[1827]: Network configuration changed, trying to establish connection. Sep 4 00:52:55.824752 systemd-timesyncd[1827]: Network configuration changed, trying to establish connection. Sep 4 00:52:55.824909 systemd-timesyncd[1827]: Network configuration changed, trying to establish connection. Sep 4 00:52:55.824992 systemd-timesyncd[1827]: Network configuration changed, trying to establish connection. Sep 4 00:52:55.829304 update_engine[1903]: I20250904 00:52:55.829236 1903 main.cc:92] Flatcar Update Engine starting Sep 4 00:52:55.830066 systemd-logind[1901]: Watching system buttons on /dev/input/event3 (Power Button) Sep 4 00:52:55.830077 systemd-logind[1901]: Watching system buttons on /dev/input/event2 (Sleep Button) Sep 4 00:52:55.830087 systemd-logind[1901]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Sep 4 00:52:55.830292 systemd-logind[1901]: New seat seat0. Sep 4 00:52:55.832422 systemd[1]: Started systemd-logind.service - User Login Management. Sep 4 00:52:55.841724 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 4 00:52:55.851330 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 4 00:52:55.851437 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 4 00:52:55.851583 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 4 00:52:55.859253 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 4 00:52:55.868378 systemd[1]: motdgen.service: Deactivated successfully. Sep 4 00:52:55.868490 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 4 00:52:55.877709 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 4 00:52:55.877819 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 4 00:52:55.907397 sshd_keygen[1900]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 4 00:52:55.913485 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 10000 Mbps full duplex Sep 4 00:52:55.913507 kernel: bond0: active interface up! Sep 4 00:52:55.916167 jq[1909]: true Sep 4 00:52:55.916583 (ntainerd)[1910]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 4 00:52:55.921516 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 4 00:52:55.935782 tar[1907]: linux-amd64/helm Sep 4 00:52:55.944266 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. Sep 4 00:52:55.944428 systemd[1]: Condition check resulted in tcsd.service - TCG Core Services Daemon being skipped. 
Sep 4 00:52:55.947355 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 4 00:52:55.959970 systemd[1]: issuegen.service: Deactivated successfully. Sep 4 00:52:55.960081 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 4 00:52:55.966737 bash[1943]: Updated "/home/core/.ssh/authorized_keys" Sep 4 00:52:55.968447 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 4 00:52:55.968822 dbus-daemon[1866]: [system] SELinux support is enabled Sep 4 00:52:55.970648 update_engine[1903]: I20250904 00:52:55.970592 1903 update_check_scheduler.cc:74] Next update check in 3m0s Sep 4 00:52:55.978241 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 4 00:52:55.990173 dbus-daemon[1866]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 4 00:52:55.990555 systemd[1]: Starting sshkeys.service... Sep 4 00:52:55.997199 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 4 00:52:55.997219 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 4 00:52:56.012770 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 4 00:52:56.021228 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 4 00:52:56.021246 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 4 00:52:56.032119 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 10000 Mbps full duplex Sep 4 00:52:56.043086 systemd[1]: Started update-engine.service - Update Engine. Sep 4 00:52:56.051568 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 4 00:52:56.064269 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 4 00:52:56.075237 containerd[1910]: time="2025-09-04T00:52:56Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 4 00:52:56.075231 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Sep 4 00:52:56.076244 containerd[1910]: time="2025-09-04T00:52:56.076200215Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Sep 4 00:52:56.081248 containerd[1910]: time="2025-09-04T00:52:56.081226826Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="5.571µs" Sep 4 00:52:56.081248 containerd[1910]: time="2025-09-04T00:52:56.081244639Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 4 00:52:56.081326 containerd[1910]: time="2025-09-04T00:52:56.081255172Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 4 00:52:56.081351 containerd[1910]: time="2025-09-04T00:52:56.081331334Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 4 00:52:56.081351 containerd[1910]: time="2025-09-04T00:52:56.081340362Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 4 00:52:56.081392 containerd[1910]: time="2025-09-04T00:52:56.081353006Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 4 00:52:56.081392 containerd[1910]: time="2025-09-04T00:52:56.081385454Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 4 00:52:56.081440 containerd[1910]: time="2025-09-04T00:52:56.081392770Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 4 00:52:56.081526 containerd[1910]: time="2025-09-04T00:52:56.081515361Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 4 00:52:56.081526 containerd[1910]: time="2025-09-04T00:52:56.081523760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 4 00:52:56.081576 containerd[1910]: time="2025-09-04T00:52:56.081529509Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 4 00:52:56.081576 containerd[1910]: time="2025-09-04T00:52:56.081535847Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 4 00:52:56.081625 containerd[1910]: time="2025-09-04T00:52:56.081574854Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 4 00:52:56.081720 containerd[1910]: time="2025-09-04T00:52:56.081710767Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 4 00:52:56.081745 containerd[1910]: time="2025-09-04T00:52:56.081726432Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 4 00:52:56.081745 containerd[1910]: time="2025-09-04T00:52:56.081732405Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 4 00:52:56.081787 containerd[1910]: 
time="2025-09-04T00:52:56.081749155Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 4 00:52:56.081878 containerd[1910]: time="2025-09-04T00:52:56.081870615Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 4 00:52:56.081909 containerd[1910]: time="2025-09-04T00:52:56.081902245Z" level=info msg="metadata content store policy set" policy=shared Sep 4 00:52:56.091950 containerd[1910]: time="2025-09-04T00:52:56.091935583Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 4 00:52:56.091984 containerd[1910]: time="2025-09-04T00:52:56.091961548Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 4 00:52:56.092011 containerd[1910]: time="2025-09-04T00:52:56.091984291Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 4 00:52:56.092011 containerd[1910]: time="2025-09-04T00:52:56.091995825Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 4 00:52:56.092059 containerd[1910]: time="2025-09-04T00:52:56.092007819Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 4 00:52:56.092059 containerd[1910]: time="2025-09-04T00:52:56.092026602Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 4 00:52:56.092059 containerd[1910]: time="2025-09-04T00:52:56.092039309Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 4 00:52:56.092059 containerd[1910]: time="2025-09-04T00:52:56.092050673Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 4 00:52:56.092163 containerd[1910]: time="2025-09-04T00:52:56.092066488Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 4 00:52:56.092163 containerd[1910]: time="2025-09-04T00:52:56.092078203Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 4 00:52:56.092163 containerd[1910]: time="2025-09-04T00:52:56.092089668Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 4 00:52:56.092163 containerd[1910]: time="2025-09-04T00:52:56.092101838Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 4 00:52:56.092250 containerd[1910]: time="2025-09-04T00:52:56.092202034Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 4 00:52:56.092250 containerd[1910]: time="2025-09-04T00:52:56.092219140Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 4 00:52:56.092250 containerd[1910]: time="2025-09-04T00:52:56.092236561Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 4 00:52:56.092321 containerd[1910]: time="2025-09-04T00:52:56.092254811Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 4 00:52:56.092321 containerd[1910]: time="2025-09-04T00:52:56.092265723Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 4 00:52:56.092321 containerd[1910]: 
time="2025-09-04T00:52:56.092275612Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 4 00:52:56.092321 containerd[1910]: time="2025-09-04T00:52:56.092292669Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 4 00:52:56.092321 containerd[1910]: time="2025-09-04T00:52:56.092303926Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 4 00:52:56.092321 containerd[1910]: time="2025-09-04T00:52:56.092314666Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 4 00:52:56.092453 containerd[1910]: time="2025-09-04T00:52:56.092324321Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 4 00:52:56.092453 containerd[1910]: time="2025-09-04T00:52:56.092340234Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 4 00:52:56.092453 containerd[1910]: time="2025-09-04T00:52:56.092391739Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 4 00:52:56.092453 containerd[1910]: time="2025-09-04T00:52:56.092404173Z" level=info msg="Start snapshots syncer" Sep 4 00:52:56.092453 containerd[1910]: time="2025-09-04T00:52:56.092428450Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 4 00:52:56.092647 containerd[1910]: time="2025-09-04T00:52:56.092625110Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 4 00:52:56.092734 containerd[1910]: time="2025-09-04T00:52:56.092664927Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 4 00:52:56.092734 
containerd[1910]: time="2025-09-04T00:52:56.092726387Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 4 00:52:56.092815 containerd[1910]: time="2025-09-04T00:52:56.092804907Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 4 00:52:56.092843 containerd[1910]: time="2025-09-04T00:52:56.092821131Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 4 00:52:56.092843 containerd[1910]: time="2025-09-04T00:52:56.092832484Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 4 00:52:56.092892 containerd[1910]: time="2025-09-04T00:52:56.092850037Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 4 00:52:56.092892 containerd[1910]: time="2025-09-04T00:52:56.092861834Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 4 00:52:56.092892 containerd[1910]: time="2025-09-04T00:52:56.092872165Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 4 00:52:56.092963 containerd[1910]: time="2025-09-04T00:52:56.092889134Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 4 00:52:56.092963 containerd[1910]: time="2025-09-04T00:52:56.092913746Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 4 00:52:56.092963 containerd[1910]: time="2025-09-04T00:52:56.092925490Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 4 00:52:56.092963 containerd[1910]: time="2025-09-04T00:52:56.092942526Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 4 00:52:56.093051 containerd[1910]: time="2025-09-04T00:52:56.092968104Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 4 00:52:56.093051 containerd[1910]: time="2025-09-04T00:52:56.092987770Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 4 00:52:56.093051 containerd[1910]: time="2025-09-04T00:52:56.092996645Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 4 00:52:56.093051 containerd[1910]: time="2025-09-04T00:52:56.093005563Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 4 00:52:56.093051 containerd[1910]: time="2025-09-04T00:52:56.093012851Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 4 00:52:56.093051 containerd[1910]: time="2025-09-04T00:52:56.093028464Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 4 00:52:56.093051 containerd[1910]: time="2025-09-04T00:52:56.093043682Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 4 00:52:56.093212 containerd[1910]: time="2025-09-04T00:52:56.093057548Z" level=info msg="runtime interface created" Sep 4 00:52:56.093212 containerd[1910]: time="2025-09-04T00:52:56.093062967Z" level=info msg="created NRI interface" Sep 4 
00:52:56.093212 containerd[1910]: time="2025-09-04T00:52:56.093077589Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 4 00:52:56.093212 containerd[1910]: time="2025-09-04T00:52:56.093088237Z" level=info msg="Connect containerd service" Sep 4 00:52:56.093212 containerd[1910]: time="2025-09-04T00:52:56.093107975Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 4 00:52:56.093694 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 4 00:52:56.094378 containerd[1910]: time="2025-09-04T00:52:56.094358033Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 4 00:52:56.102031 systemd[1]: Started serial-getty@ttyS1.service - Serial Getty on ttyS1. Sep 4 00:52:56.104224 coreos-metadata[1970]: Sep 04 00:52:56.104 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 4 00:52:56.112425 systemd[1]: Reached target getty.target - Login Prompts. Sep 4 00:52:56.122290 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 4 00:52:56.167343 locksmithd[1986]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 4 00:52:56.185353 tar[1907]: linux-amd64/LICENSE Sep 4 00:52:56.185417 tar[1907]: linux-amd64/README.md Sep 4 00:52:56.205933 containerd[1910]: time="2025-09-04T00:52:56.205908420Z" level=info msg="Start subscribing containerd event" Sep 4 00:52:56.205991 containerd[1910]: time="2025-09-04T00:52:56.205941059Z" level=info msg="Start recovering state" Sep 4 00:52:56.205991 containerd[1910]: time="2025-09-04T00:52:56.205970353Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 4 00:52:56.206022 containerd[1910]: time="2025-09-04T00:52:56.205995912Z" level=info msg="Start event monitor" Sep 4 00:52:56.206022 containerd[1910]: time="2025-09-04T00:52:56.205997361Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 4 00:52:56.206022 containerd[1910]: time="2025-09-04T00:52:56.206006431Z" level=info msg="Start cni network conf syncer for default" Sep 4 00:52:56.206022 containerd[1910]: time="2025-09-04T00:52:56.206010591Z" level=info msg="Start streaming server" Sep 4 00:52:56.206022 containerd[1910]: time="2025-09-04T00:52:56.206018725Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 4 00:52:56.206101 containerd[1910]: time="2025-09-04T00:52:56.206023097Z" level=info msg="runtime interface starting up..." Sep 4 00:52:56.206101 containerd[1910]: time="2025-09-04T00:52:56.206026399Z" level=info msg="starting plugins..." Sep 4 00:52:56.206101 containerd[1910]: time="2025-09-04T00:52:56.206034844Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 4 00:52:56.206101 containerd[1910]: time="2025-09-04T00:52:56.206098831Z" level=info msg="containerd successfully booted in 0.131089s" Sep 4 00:52:56.206145 systemd[1]: Started containerd.service - containerd container runtime. Sep 4 00:52:56.215808 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
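The "failed to load cni during init" error above is expected at this point in the boot: the CRI plugin's confDir (/etc/cni/net.d, maxConfNum 1 in the config dump above) is still empty, so pod networking stays uninitialized until a network add-on installs a config there. Purely as a sketch, a minimal bridge config of the kind the CNI conf syncer would accept could be written like this; the file name, the 10.88.0.0/16 subnet, and the choice of the stock bridge/host-local/portmap plugins under /opt/cni/bin are illustrative assumptions, since the cluster's real network add-on installs its own file:

    # Hypothetical standalone CNI config -- not what this host will actually use.
    mkdir -p /etc/cni/net.d
    cat <<'EOF' >/etc/cni/net.d/10-containerd-net.conflist
    {
      "cniVersion": "1.0.0",
      "name": "containerd-net",
      "plugins": [
        {
          "type": "bridge",
          "bridge": "cni0",
          "isGateway": true,
          "ipMasq": true,
          "ipam": {
            "type": "host-local",
            "ranges": [[{ "subnet": "10.88.0.0/16" }]],
            "routes": [{ "dst": "0.0.0.0/0" }]
          }
        },
        { "type": "portmap", "capabilities": { "portMappings": true } }
      ]
    }
    EOF

Once a matching file appears in the confDir, the "Start cni network conf syncer for default" step logged above picks it up on the fly, so the error clears without restarting containerd.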
Sep 4 00:52:56.254116 kernel: EXT4-fs (sdb9): resized filesystem to 116605649 Sep 4 00:52:56.279483 extend-filesystems[1884]: Filesystem at /dev/sdb9 is mounted on /; on-line resizing required Sep 4 00:52:56.279483 extend-filesystems[1884]: old_desc_blocks = 1, new_desc_blocks = 56 Sep 4 00:52:56.279483 extend-filesystems[1884]: The filesystem on /dev/sdb9 is now 116605649 (4k) blocks long. Sep 4 00:52:56.306306 extend-filesystems[1872]: Resized filesystem in /dev/sdb9 Sep 4 00:52:56.279956 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 4 00:52:56.280070 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 4 00:52:56.643477 coreos-metadata[1865]: Sep 04 00:52:56.643 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Sep 4 00:52:57.284502 systemd-timesyncd[1827]: Network configuration changed, trying to establish connection. Sep 4 00:52:57.348164 systemd-networkd[1825]: bond0: Gained IPv6LL Sep 4 00:52:57.348356 systemd-timesyncd[1827]: Network configuration changed, trying to establish connection. Sep 4 00:52:57.349348 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 4 00:52:57.359984 systemd[1]: Reached target network-online.target - Network is Online. Sep 4 00:52:57.369444 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 00:52:57.387468 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 4 00:52:57.425384 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 4 00:52:58.124523 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 00:52:58.134643 (kubelet)[2023]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 00:52:58.458328 kernel: mlx5_core 0000:01:00.0: lag map: port 1:1 port 2:2 Sep 4 00:52:58.458513 kernel: mlx5_core 0000:01:00.0: shared_fdb:0 mode:queue_affinity Sep 4 00:52:58.531119 kernel: sdhci-pci 0000:00:14.5: SDHCI controller found [8086:a375] (rev 10) Sep 4 00:52:58.569017 kubelet[2023]: E0904 00:52:58.568974 2023 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 00:52:58.570438 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 00:52:58.570526 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 00:52:58.570710 systemd[1]: kubelet.service: Consumed 597ms CPU time, 270.4M memory peak. Sep 4 00:52:59.808474 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 4 00:52:59.818869 systemd[1]: Started sshd@0-147.28.180.77:22-147.75.109.163:55690.service - OpenSSH per-connection server daemon (147.75.109.163:55690). Sep 4 00:52:59.901832 sshd[2044]: Accepted publickey for core from 147.75.109.163 port 55690 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:52:59.902716 sshd-session[2044]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:52:59.909679 systemd-logind[1901]: New session 1 of user core. Sep 4 00:52:59.910431 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 4 00:52:59.920039 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... 
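The extend-filesystems messages at the top of this stretch are Flatcar's wrapper around an ordinary ext4 online grow of the root filesystem on /dev/sdb9. Roughly the same operation done by hand would look like the following sketch, assuming the partition itself has already been enlarged (as it is here):

    # Identify the device backing / and its size, then grow the mounted ext4
    # filesystem to fill the partition; resize2fs supports online growth.
    findmnt -no SOURCE /      # expected: /dev/sdb9
    lsblk /dev/sdb9
    resize2fs /dev/sdb9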
Sep 4 00:52:59.944971 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 4 00:52:59.957868 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 4 00:52:59.982931 (systemd)[2048]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 4 00:52:59.990210 systemd-logind[1901]: New session c1 of user core. Sep 4 00:53:00.140920 systemd[2048]: Queued start job for default target default.target. Sep 4 00:53:00.152781 systemd[2048]: Created slice app.slice - User Application Slice. Sep 4 00:53:00.152814 systemd[2048]: Reached target paths.target - Paths. Sep 4 00:53:00.152841 systemd[2048]: Reached target timers.target - Timers. Sep 4 00:53:00.153481 systemd[2048]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 4 00:53:00.158680 systemd[2048]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 4 00:53:00.158737 systemd[2048]: Reached target sockets.target - Sockets. Sep 4 00:53:00.158759 systemd[2048]: Reached target basic.target - Basic System. Sep 4 00:53:00.158780 systemd[2048]: Reached target default.target - Main User Target. Sep 4 00:53:00.158794 systemd[2048]: Startup finished in 151ms. Sep 4 00:53:00.158852 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 4 00:53:00.168306 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 4 00:53:00.233864 systemd[1]: Started sshd@1-147.28.180.77:22-147.75.109.163:43122.service - OpenSSH per-connection server daemon (147.75.109.163:43122). Sep 4 00:53:00.280563 sshd[2059]: Accepted publickey for core from 147.75.109.163 port 43122 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:53:00.281184 sshd-session[2059]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:53:00.283891 systemd-logind[1901]: New session 2 of user core. Sep 4 00:53:00.298947 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 4 00:53:00.367417 sshd[2061]: Connection closed by 147.75.109.163 port 43122 Sep 4 00:53:00.367556 sshd-session[2059]: pam_unix(sshd:session): session closed for user core Sep 4 00:53:00.380302 systemd[1]: sshd@1-147.28.180.77:22-147.75.109.163:43122.service: Deactivated successfully. Sep 4 00:53:00.381137 systemd[1]: session-2.scope: Deactivated successfully. Sep 4 00:53:00.381714 systemd-logind[1901]: Session 2 logged out. Waiting for processes to exit. Sep 4 00:53:00.382839 systemd[1]: Started sshd@2-147.28.180.77:22-147.75.109.163:43124.service - OpenSSH per-connection server daemon (147.75.109.163:43124). Sep 4 00:53:00.394299 systemd-logind[1901]: Removed session 2. Sep 4 00:53:00.447336 sshd[2067]: Accepted publickey for core from 147.75.109.163 port 43124 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:53:00.450575 sshd-session[2067]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:53:00.463038 systemd-logind[1901]: New session 3 of user core. Sep 4 00:53:00.478699 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 4 00:53:00.546421 sshd[2069]: Connection closed by 147.75.109.163 port 43124 Sep 4 00:53:00.546565 sshd-session[2067]: pam_unix(sshd:session): session closed for user core Sep 4 00:53:00.548001 systemd[1]: sshd@2-147.28.180.77:22-147.75.109.163:43124.service: Deactivated successfully. Sep 4 00:53:00.548915 systemd[1]: session-3.scope: Deactivated successfully. Sep 4 00:53:00.549835 systemd-logind[1901]: Session 3 logged out. Waiting for processes to exit. 
Sep 4 00:53:00.550512 systemd-logind[1901]: Removed session 3. Sep 4 00:53:01.134311 login[1980]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 4 00:53:01.138723 systemd-logind[1901]: New session 4 of user core. Sep 4 00:53:01.139591 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 4 00:53:01.140062 login[1975]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 4 00:53:01.142703 systemd-logind[1901]: New session 5 of user core. Sep 4 00:53:01.143470 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 4 00:53:02.313337 coreos-metadata[1865]: Sep 04 00:53:02.313 INFO Fetch successful Sep 4 00:53:02.368021 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 4 00:53:02.369293 systemd[1]: Starting packet-phone-home.service - Report Success to Packet... Sep 4 00:53:02.369876 systemd[1]: Started sshd@3-147.28.180.77:22-183.23.62.16:30582.service - OpenSSH per-connection server daemon (183.23.62.16:30582). Sep 4 00:53:02.488622 systemd-timesyncd[1827]: Network configuration changed, trying to establish connection. Sep 4 00:53:02.685825 coreos-metadata[1970]: Sep 04 00:53:02.685 INFO Fetch successful Sep 4 00:53:02.772101 unknown[1970]: wrote ssh authorized keys file for user: core Sep 4 00:53:02.795552 update-ssh-keys[2108]: Updated "/home/core/.ssh/authorized_keys" Sep 4 00:53:02.795927 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 4 00:53:02.796592 systemd[1]: Finished sshkeys.service. Sep 4 00:53:02.935683 systemd[1]: Finished packet-phone-home.service - Report Success to Packet. Sep 4 00:53:02.936975 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 4 00:53:02.937360 systemd[1]: Startup finished in 4.294s (kernel) + 23.607s (initrd) + 11.111s (userspace) = 39.013s. Sep 4 00:53:08.823104 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 4 00:53:08.824457 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 00:53:09.115061 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 00:53:09.117468 (kubelet)[2121]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 00:53:09.136952 kubelet[2121]: E0904 00:53:09.136928 2121 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 00:53:09.138965 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 00:53:09.139049 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 00:53:09.139225 systemd[1]: kubelet.service: Consumed 142ms CPU time, 111.6M memory peak. Sep 4 00:53:10.566755 systemd[1]: Started sshd@4-147.28.180.77:22-147.75.109.163:46102.service - OpenSSH per-connection server daemon (147.75.109.163:46102). Sep 4 00:53:10.612223 sshd[2136]: Accepted publickey for core from 147.75.109.163 port 46102 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:53:10.612939 sshd-session[2136]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:53:10.616075 systemd-logind[1901]: New session 6 of user core. 
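The kubelet exit above (like the earlier one at 00:52:58) is the normal "not yet bootstrapped" state: the unit keeps restarting until /var/lib/kubelet/config.yaml exists, a file normally generated by the cluster bootstrap tooling (kubeadm writes it during init/join) rather than by hand. Purely as a sketch of what eventually satisfies the kubelet, using values consistent with its later successful start-up further down (systemd cgroup driver, static pods from /etc/kubernetes/manifests, and the containerd socket that the deprecated --container-runtime-endpoint flag points at); a generated file contains many more fields:

    # Hypothetical minimal KubeletConfiguration; on a kubeadm-managed node this
    # file is written for you during init/join.
    cat <<'EOF' >/var/lib/kubelet/config.yaml
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd                      # matches SystemdCgroup=true in the containerd config above
    staticPodPath: /etc/kubernetes/manifests   # where the control-plane pod manifests are placed
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
    EOF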
Sep 4 00:53:10.626377 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 4 00:53:10.679233 sshd[2138]: Connection closed by 147.75.109.163 port 46102 Sep 4 00:53:10.679423 sshd-session[2136]: pam_unix(sshd:session): session closed for user core Sep 4 00:53:10.691238 systemd[1]: sshd@4-147.28.180.77:22-147.75.109.163:46102.service: Deactivated successfully. Sep 4 00:53:10.692102 systemd[1]: session-6.scope: Deactivated successfully. Sep 4 00:53:10.692626 systemd-logind[1901]: Session 6 logged out. Waiting for processes to exit. Sep 4 00:53:10.693738 systemd[1]: Started sshd@5-147.28.180.77:22-147.75.109.163:46116.service - OpenSSH per-connection server daemon (147.75.109.163:46116). Sep 4 00:53:10.694232 systemd-logind[1901]: Removed session 6. Sep 4 00:53:10.725434 sshd[2144]: Accepted publickey for core from 147.75.109.163 port 46116 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:53:10.726150 sshd-session[2144]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:53:10.728968 systemd-logind[1901]: New session 7 of user core. Sep 4 00:53:10.743368 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 4 00:53:10.793122 sshd[2147]: Connection closed by 147.75.109.163 port 46116 Sep 4 00:53:10.793256 sshd-session[2144]: pam_unix(sshd:session): session closed for user core Sep 4 00:53:10.803205 systemd[1]: sshd@5-147.28.180.77:22-147.75.109.163:46116.service: Deactivated successfully. Sep 4 00:53:10.803972 systemd[1]: session-7.scope: Deactivated successfully. Sep 4 00:53:10.804410 systemd-logind[1901]: Session 7 logged out. Waiting for processes to exit. Sep 4 00:53:10.805732 systemd[1]: Started sshd@6-147.28.180.77:22-147.75.109.163:46128.service - OpenSSH per-connection server daemon (147.75.109.163:46128). Sep 4 00:53:10.806090 systemd-logind[1901]: Removed session 7. Sep 4 00:53:10.858130 sshd[2153]: Accepted publickey for core from 147.75.109.163 port 46128 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:53:10.861365 sshd-session[2153]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:53:10.874179 systemd-logind[1901]: New session 8 of user core. Sep 4 00:53:10.887569 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 4 00:53:10.955846 sshd[2155]: Connection closed by 147.75.109.163 port 46128 Sep 4 00:53:10.956524 sshd-session[2153]: pam_unix(sshd:session): session closed for user core Sep 4 00:53:10.980499 systemd[1]: sshd@6-147.28.180.77:22-147.75.109.163:46128.service: Deactivated successfully. Sep 4 00:53:10.984208 systemd[1]: session-8.scope: Deactivated successfully. Sep 4 00:53:10.986421 systemd-logind[1901]: Session 8 logged out. Waiting for processes to exit. Sep 4 00:53:10.992196 systemd[1]: Started sshd@7-147.28.180.77:22-147.75.109.163:46142.service - OpenSSH per-connection server daemon (147.75.109.163:46142). Sep 4 00:53:10.993906 systemd-logind[1901]: Removed session 8. Sep 4 00:53:11.075121 sshd[2161]: Accepted publickey for core from 147.75.109.163 port 46142 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:53:11.076244 sshd-session[2161]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:53:11.080805 systemd-logind[1901]: New session 9 of user core. Sep 4 00:53:11.092608 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 4 00:53:11.161353 sudo[2165]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 4 00:53:11.161523 sudo[2165]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 00:53:11.176726 sudo[2165]: pam_unix(sudo:session): session closed for user root Sep 4 00:53:11.177552 sshd[2164]: Connection closed by 147.75.109.163 port 46142 Sep 4 00:53:11.177714 sshd-session[2161]: pam_unix(sshd:session): session closed for user core Sep 4 00:53:11.189871 systemd[1]: sshd@7-147.28.180.77:22-147.75.109.163:46142.service: Deactivated successfully. Sep 4 00:53:11.190925 systemd[1]: session-9.scope: Deactivated successfully. Sep 4 00:53:11.191570 systemd-logind[1901]: Session 9 logged out. Waiting for processes to exit. Sep 4 00:53:11.193143 systemd[1]: Started sshd@8-147.28.180.77:22-147.75.109.163:46146.service - OpenSSH per-connection server daemon (147.75.109.163:46146). Sep 4 00:53:11.193993 systemd-logind[1901]: Removed session 9. Sep 4 00:53:11.240251 sshd[2171]: Accepted publickey for core from 147.75.109.163 port 46146 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:53:11.240938 sshd-session[2171]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:53:11.243707 systemd-logind[1901]: New session 10 of user core. Sep 4 00:53:11.251391 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 4 00:53:11.306737 sudo[2176]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 4 00:53:11.306878 sudo[2176]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 00:53:11.309680 sudo[2176]: pam_unix(sudo:session): session closed for user root Sep 4 00:53:11.312251 sudo[2175]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 4 00:53:11.312386 sudo[2175]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 00:53:11.317926 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 4 00:53:11.352409 augenrules[2198]: No rules Sep 4 00:53:11.353157 systemd[1]: audit-rules.service: Deactivated successfully. Sep 4 00:53:11.353399 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 4 00:53:11.354324 sudo[2175]: pam_unix(sudo:session): session closed for user root Sep 4 00:53:11.355643 sshd[2174]: Connection closed by 147.75.109.163 port 46146 Sep 4 00:53:11.355992 sshd-session[2171]: pam_unix(sshd:session): session closed for user core Sep 4 00:53:11.374901 systemd[1]: sshd@8-147.28.180.77:22-147.75.109.163:46146.service: Deactivated successfully. Sep 4 00:53:11.378299 systemd[1]: session-10.scope: Deactivated successfully. Sep 4 00:53:11.380343 systemd-logind[1901]: Session 10 logged out. Waiting for processes to exit. Sep 4 00:53:11.385921 systemd[1]: Started sshd@9-147.28.180.77:22-147.75.109.163:46154.service - OpenSSH per-connection server daemon (147.75.109.163:46154). Sep 4 00:53:11.387515 systemd-logind[1901]: Removed session 10. Sep 4 00:53:11.471845 sshd[2207]: Accepted publickey for core from 147.75.109.163 port 46154 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 00:53:11.472888 sshd-session[2207]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 00:53:11.477239 systemd-logind[1901]: New session 11 of user core. Sep 4 00:53:11.489378 systemd[1]: Started session-11.scope - Session 11 of User core. 
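The audit-rules restart above ends with augenrules reporting "No rules", which is consistent with the two rule files having just been removed by the preceding sudo command. A quick way to double-check what the kernel audit subsystem is actually enforcing afterwards (sketch, assuming the standard auditd userspace tools are present):

    augenrules --load   # recompile whatever remains in /etc/audit/rules.d/ (now nothing)
    auditctl -l         # prints "No rules" when the in-kernel rule list is empty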
Sep 4 00:53:11.551942 sudo[2210]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 4 00:53:11.552725 sudo[2210]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 00:53:11.920922 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 4 00:53:11.938494 (dockerd)[2236]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 4 00:53:12.126652 dockerd[2236]: time="2025-09-04T00:53:12.126616157Z" level=info msg="Starting up" Sep 4 00:53:12.127424 dockerd[2236]: time="2025-09-04T00:53:12.127413478Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 4 00:53:12.153561 dockerd[2236]: time="2025-09-04T00:53:12.153515617Z" level=info msg="Loading containers: start." Sep 4 00:53:12.165158 kernel: Initializing XFRM netlink socket Sep 4 00:53:12.301417 systemd-timesyncd[1827]: Network configuration changed, trying to establish connection. Sep 4 00:53:12.320709 systemd-networkd[1825]: docker0: Link UP Sep 4 00:53:12.321975 dockerd[2236]: time="2025-09-04T00:53:12.321931761Z" level=info msg="Loading containers: done." Sep 4 00:53:12.328771 dockerd[2236]: time="2025-09-04T00:53:12.328752724Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 4 00:53:12.328842 dockerd[2236]: time="2025-09-04T00:53:12.328792567Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Sep 4 00:53:12.328861 dockerd[2236]: time="2025-09-04T00:53:12.328840354Z" level=info msg="Initializing buildkit" Sep 4 00:53:12.339275 dockerd[2236]: time="2025-09-04T00:53:12.339234753Z" level=info msg="Completed buildkit initialization" Sep 4 00:53:12.342346 dockerd[2236]: time="2025-09-04T00:53:12.342333579Z" level=info msg="Daemon has completed initialization" Sep 4 00:53:12.342383 dockerd[2236]: time="2025-09-04T00:53:12.342365909Z" level=info msg="API listen on /run/docker.sock" Sep 4 00:53:12.342500 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 4 00:53:12.510391 systemd-timesyncd[1827]: Contacted time server [2604:a880:1:20::17:5001]:123 (2.flatcar.pool.ntp.org). Sep 4 00:53:12.510438 systemd-timesyncd[1827]: Initial clock synchronization to Thu 2025-09-04 00:53:12.482283 UTC. Sep 4 00:53:13.161010 containerd[1910]: time="2025-09-04T00:53:13.160906045Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\"" Sep 4 00:53:13.799581 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1953119388.mount: Deactivated successfully. 
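The overlay2 warning in the dockerd start-up above is informational: with CONFIG_OVERLAY_FS_REDIRECT_DIR enabled in this kernel, dockerd avoids the native overlayfs diff path and falls back to the slower naive differ, which only matters when building or committing images on this host. The active storage driver can be confirmed with something like:

    docker info --format 'driver={{.Driver}} version={{.ServerVersion}}'
    docker info | grep -A3 'Storage Driver'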
Sep 4 00:53:14.538122 containerd[1910]: time="2025-09-04T00:53:14.538089325Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:14.538335 containerd[1910]: time="2025-09-04T00:53:14.538278870Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=28079631" Sep 4 00:53:14.538630 containerd[1910]: time="2025-09-04T00:53:14.538616419Z" level=info msg="ImageCreate event name:\"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:14.539868 containerd[1910]: time="2025-09-04T00:53:14.539855315Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:14.540386 containerd[1910]: time="2025-09-04T00:53:14.540373405Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"28076431\" in 1.379395511s" Sep 4 00:53:14.540413 containerd[1910]: time="2025-09-04T00:53:14.540391579Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\"" Sep 4 00:53:14.540689 containerd[1910]: time="2025-09-04T00:53:14.540680120Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\"" Sep 4 00:53:15.533695 containerd[1910]: time="2025-09-04T00:53:15.533668556Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:15.533870 containerd[1910]: time="2025-09-04T00:53:15.533854990Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=24714681" Sep 4 00:53:15.534207 containerd[1910]: time="2025-09-04T00:53:15.534193649Z" level=info msg="ImageCreate event name:\"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:15.535813 containerd[1910]: time="2025-09-04T00:53:15.535770638Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:15.536232 containerd[1910]: time="2025-09-04T00:53:15.536188924Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"26317875\" in 995.493692ms" Sep 4 00:53:15.536232 containerd[1910]: time="2025-09-04T00:53:15.536205717Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\"" Sep 4 00:53:15.536472 
containerd[1910]: time="2025-09-04T00:53:15.536426606Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\"" Sep 4 00:53:16.496190 containerd[1910]: time="2025-09-04T00:53:16.496147406Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:16.496398 containerd[1910]: time="2025-09-04T00:53:16.496366083Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=18782427" Sep 4 00:53:16.496629 containerd[1910]: time="2025-09-04T00:53:16.496620193Z" level=info msg="ImageCreate event name:\"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:16.497887 containerd[1910]: time="2025-09-04T00:53:16.497875765Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:16.498459 containerd[1910]: time="2025-09-04T00:53:16.498446963Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"20385639\" in 962.00552ms" Sep 4 00:53:16.498499 containerd[1910]: time="2025-09-04T00:53:16.498461258Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\"" Sep 4 00:53:16.498779 containerd[1910]: time="2025-09-04T00:53:16.498767528Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\"" Sep 4 00:53:17.398009 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3794309059.mount: Deactivated successfully. 
Sep 4 00:53:17.589325 containerd[1910]: time="2025-09-04T00:53:17.589295305Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:17.589531 containerd[1910]: time="2025-09-04T00:53:17.589442726Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=30384255" Sep 4 00:53:17.589751 containerd[1910]: time="2025-09-04T00:53:17.589738629Z" level=info msg="ImageCreate event name:\"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:17.590754 containerd[1910]: time="2025-09-04T00:53:17.590719740Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:17.590932 containerd[1910]: time="2025-09-04T00:53:17.590896740Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"30383274\" in 1.09211291s" Sep 4 00:53:17.590932 containerd[1910]: time="2025-09-04T00:53:17.590912182Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\"" Sep 4 00:53:17.591143 containerd[1910]: time="2025-09-04T00:53:17.591134186Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 4 00:53:18.132536 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3515984397.mount: Deactivated successfully. 
Sep 4 00:53:18.650288 containerd[1910]: time="2025-09-04T00:53:18.650264829Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:18.650530 containerd[1910]: time="2025-09-04T00:53:18.650517673Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Sep 4 00:53:18.650929 containerd[1910]: time="2025-09-04T00:53:18.650915260Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:18.652275 containerd[1910]: time="2025-09-04T00:53:18.652261282Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:18.653220 containerd[1910]: time="2025-09-04T00:53:18.653204953Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.062055903s" Sep 4 00:53:18.653253 containerd[1910]: time="2025-09-04T00:53:18.653219843Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 4 00:53:18.653463 containerd[1910]: time="2025-09-04T00:53:18.653453635Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 4 00:53:19.190465 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 4 00:53:19.191494 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 00:53:19.192663 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4200559562.mount: Deactivated successfully. 
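The PullImage sequence above (kube-apiserver, kube-controller-manager, kube-scheduler and kube-proxy at v1.31.12, coredns v1.11.3, with pause 3.10 and etcd following) is the control-plane image pre-pull going through containerd's CRI. The same images can be listed or fetched by hand over the socket from the config dump earlier; for example, assuming crictl is installed:

    # Point crictl at containerd's CRI socket, then inspect or pull images.
    export CONTAINER_RUNTIME_ENDPOINT=unix:///run/containerd/containerd.sock
    crictl images
    crictl pull registry.k8s.io/pause:3.10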
Sep 4 00:53:19.194090 containerd[1910]: time="2025-09-04T00:53:19.194069837Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 4 00:53:19.194348 containerd[1910]: time="2025-09-04T00:53:19.194336359Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 4 00:53:19.194722 containerd[1910]: time="2025-09-04T00:53:19.194708856Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 4 00:53:19.195662 containerd[1910]: time="2025-09-04T00:53:19.195650117Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 4 00:53:19.196048 containerd[1910]: time="2025-09-04T00:53:19.196036832Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 542.570073ms" Sep 4 00:53:19.196070 containerd[1910]: time="2025-09-04T00:53:19.196053337Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 4 00:53:19.196455 containerd[1910]: time="2025-09-04T00:53:19.196427688Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 4 00:53:19.476591 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 00:53:19.478538 (kubelet)[2597]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 00:53:19.498928 kubelet[2597]: E0904 00:53:19.498906 2597 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 00:53:19.500058 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 00:53:19.500148 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 00:53:19.500318 systemd[1]: kubelet.service: Consumed 129ms CPU time, 117M memory peak. Sep 4 00:53:19.696645 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1450190450.mount: Deactivated successfully. 
Sep 4 00:53:20.812280 containerd[1910]: time="2025-09-04T00:53:20.812256766Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:20.812489 containerd[1910]: time="2025-09-04T00:53:20.812430631Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910709" Sep 4 00:53:20.812813 containerd[1910]: time="2025-09-04T00:53:20.812801208Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:20.814103 containerd[1910]: time="2025-09-04T00:53:20.814091837Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:20.814645 containerd[1910]: time="2025-09-04T00:53:20.814631950Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 1.618190115s" Sep 4 00:53:20.814668 containerd[1910]: time="2025-09-04T00:53:20.814647722Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Sep 4 00:53:23.123387 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 00:53:23.123498 systemd[1]: kubelet.service: Consumed 129ms CPU time, 117M memory peak. Sep 4 00:53:23.124758 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 00:53:23.137219 systemd[1]: Reload requested from client PID 2722 ('systemctl') (unit session-11.scope)... Sep 4 00:53:23.137230 systemd[1]: Reloading... Sep 4 00:53:23.186214 zram_generator::config[2770]: No configuration found. Sep 4 00:53:23.243860 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 4 00:53:23.334062 systemd[1]: Reloading finished in 196 ms. Sep 4 00:53:23.364410 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 4 00:53:23.364602 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 4 00:53:23.365128 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 00:53:23.365232 systemd[1]: kubelet.service: Consumed 48ms CPU time, 87.6M memory peak. Sep 4 00:53:23.370075 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 00:53:23.693605 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 00:53:23.695626 (kubelet)[2834]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 4 00:53:23.714848 kubelet[2834]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 00:53:23.714848 kubelet[2834]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. 
Image garbage collector will get sandbox image information from CRI. Sep 4 00:53:23.714848 kubelet[2834]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 00:53:23.715073 kubelet[2834]: I0904 00:53:23.714874 2834 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 4 00:53:23.903209 kubelet[2834]: I0904 00:53:23.903153 2834 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 4 00:53:23.903209 kubelet[2834]: I0904 00:53:23.903178 2834 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 4 00:53:23.903328 kubelet[2834]: I0904 00:53:23.903291 2834 server.go:934] "Client rotation is on, will bootstrap in background" Sep 4 00:53:23.919311 kubelet[2834]: E0904 00:53:23.919270 2834 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://147.28.180.77:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 147.28.180.77:6443: connect: connection refused" logger="UnhandledError" Sep 4 00:53:23.920958 kubelet[2834]: I0904 00:53:23.920925 2834 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 4 00:53:23.927387 kubelet[2834]: I0904 00:53:23.927379 2834 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 4 00:53:23.937525 kubelet[2834]: I0904 00:53:23.937468 2834 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 4 00:53:23.938453 kubelet[2834]: I0904 00:53:23.938416 2834 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 4 00:53:23.938492 kubelet[2834]: I0904 00:53:23.938478 2834 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 4 00:53:23.938615 kubelet[2834]: I0904 00:53:23.938493 2834 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372.1.0-n-fd36784ab7","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 4 00:53:23.938615 kubelet[2834]: I0904 00:53:23.938592 2834 topology_manager.go:138] "Creating topology manager with none policy" Sep 4 00:53:23.938615 kubelet[2834]: I0904 00:53:23.938597 2834 container_manager_linux.go:300] "Creating device plugin manager" Sep 4 00:53:23.938711 kubelet[2834]: I0904 00:53:23.938651 2834 state_mem.go:36] "Initialized new in-memory state store" Sep 4 00:53:23.940979 kubelet[2834]: I0904 00:53:23.940933 2834 kubelet.go:408] "Attempting to sync node with API server" Sep 4 00:53:23.940979 kubelet[2834]: I0904 00:53:23.940947 2834 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 4 00:53:23.940979 kubelet[2834]: I0904 00:53:23.940966 2834 kubelet.go:314] "Adding apiserver pod source" Sep 4 00:53:23.940979 kubelet[2834]: I0904 00:53:23.940976 2834 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 4 00:53:23.943338 kubelet[2834]: I0904 00:53:23.943272 2834 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Sep 4 00:53:23.943689 kubelet[2834]: I0904 00:53:23.943631 2834 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 4 00:53:23.943689 kubelet[2834]: W0904 00:53:23.943624 2834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://147.28.180.77:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": 
dial tcp 147.28.180.77:6443: connect: connection refused Sep 4 00:53:23.943771 kubelet[2834]: E0904 00:53:23.943683 2834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://147.28.180.77:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 147.28.180.77:6443: connect: connection refused" logger="UnhandledError" Sep 4 00:53:23.944136 kubelet[2834]: W0904 00:53:23.944088 2834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://147.28.180.77:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372.1.0-n-fd36784ab7&limit=500&resourceVersion=0": dial tcp 147.28.180.77:6443: connect: connection refused Sep 4 00:53:23.944136 kubelet[2834]: E0904 00:53:23.944117 2834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://147.28.180.77:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372.1.0-n-fd36784ab7&limit=500&resourceVersion=0\": dial tcp 147.28.180.77:6443: connect: connection refused" logger="UnhandledError" Sep 4 00:53:23.944273 kubelet[2834]: W0904 00:53:23.944240 2834 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 4 00:53:23.945820 kubelet[2834]: I0904 00:53:23.945779 2834 server.go:1274] "Started kubelet" Sep 4 00:53:23.945871 kubelet[2834]: I0904 00:53:23.945834 2834 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 4 00:53:23.945889 kubelet[2834]: I0904 00:53:23.945833 2834 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 4 00:53:23.946173 kubelet[2834]: I0904 00:53:23.946162 2834 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 4 00:53:23.946523 kubelet[2834]: I0904 00:53:23.946516 2834 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 4 00:53:23.946591 kubelet[2834]: I0904 00:53:23.946576 2834 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 4 00:53:23.946591 kubelet[2834]: E0904 00:53:23.946584 2834 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-fd36784ab7\" not found" Sep 4 00:53:23.946668 kubelet[2834]: I0904 00:53:23.946597 2834 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 4 00:53:23.948447 kubelet[2834]: I0904 00:53:23.948364 2834 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 4 00:53:23.948517 kubelet[2834]: I0904 00:53:23.948484 2834 reconciler.go:26] "Reconciler: start to sync state" Sep 4 00:53:23.948604 kubelet[2834]: I0904 00:53:23.948594 2834 factory.go:221] Registration of the systemd container factory successfully Sep 4 00:53:23.948634 kubelet[2834]: W0904 00:53:23.948586 2834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://147.28.180.77:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 147.28.180.77:6443: connect: connection refused Sep 4 00:53:23.948669 kubelet[2834]: E0904 00:53:23.948631 2834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://147.28.180.77:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 147.28.180.77:6443: connect: connection refused" logger="UnhandledError" Sep 4 00:53:23.948699 kubelet[2834]: I0904 00:53:23.948677 2834 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 4 00:53:23.948699 kubelet[2834]: E0904 00:53:23.948680 2834 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 4 00:53:23.948752 kubelet[2834]: I0904 00:53:23.948728 2834 server.go:449] "Adding debug handlers to kubelet server" Sep 4 00:53:23.948896 kubelet[2834]: E0904 00:53:23.948865 2834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.28.180.77:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-n-fd36784ab7?timeout=10s\": dial tcp 147.28.180.77:6443: connect: connection refused" interval="200ms" Sep 4 00:53:23.949822 kubelet[2834]: I0904 00:53:23.949812 2834 factory.go:221] Registration of the containerd container factory successfully Sep 4 00:53:23.950580 kubelet[2834]: E0904 00:53:23.949647 2834 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://147.28.180.77:6443/api/v1/namespaces/default/events\": dial tcp 147.28.180.77:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372.1.0-n-fd36784ab7.1861ee2c5c4d6685 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372.1.0-n-fd36784ab7,UID:ci-4372.1.0-n-fd36784ab7,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372.1.0-n-fd36784ab7,},FirstTimestamp:2025-09-04 00:53:23.945752197 +0000 UTC m=+0.248258908,LastTimestamp:2025-09-04 00:53:23.945752197 +0000 UTC m=+0.248258908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372.1.0-n-fd36784ab7,}" Sep 4 00:53:23.956109 kubelet[2834]: I0904 00:53:23.956099 2834 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 4 00:53:23.956109 kubelet[2834]: I0904 00:53:23.956107 2834 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 4 00:53:23.956175 kubelet[2834]: I0904 00:53:23.956121 2834 state_mem.go:36] "Initialized new in-memory state store" Sep 4 00:53:23.957104 kubelet[2834]: I0904 00:53:23.957091 2834 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 4 00:53:23.957228 kubelet[2834]: I0904 00:53:23.957223 2834 policy_none.go:49] "None policy: Start" Sep 4 00:53:23.957497 kubelet[2834]: I0904 00:53:23.957489 2834 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 4 00:53:23.957533 kubelet[2834]: I0904 00:53:23.957501 2834 state_mem.go:35] "Initializing new in-memory state store" Sep 4 00:53:23.957734 kubelet[2834]: I0904 00:53:23.957725 2834 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 4 00:53:23.957769 kubelet[2834]: I0904 00:53:23.957738 2834 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 4 00:53:23.957769 kubelet[2834]: I0904 00:53:23.957750 2834 kubelet.go:2321] "Starting kubelet main sync loop" Sep 4 00:53:23.957803 kubelet[2834]: E0904 00:53:23.957775 2834 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 4 00:53:23.957957 kubelet[2834]: W0904 00:53:23.957943 2834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://147.28.180.77:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 147.28.180.77:6443: connect: connection refused Sep 4 00:53:23.957988 kubelet[2834]: E0904 00:53:23.957968 2834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://147.28.180.77:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 147.28.180.77:6443: connect: connection refused" logger="UnhandledError" Sep 4 00:53:23.960411 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 4 00:53:23.986442 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 4 00:53:23.995105 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 4 00:53:24.008029 kubelet[2834]: I0904 00:53:24.007942 2834 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 4 00:53:24.008428 kubelet[2834]: I0904 00:53:24.008348 2834 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 4 00:53:24.008428 kubelet[2834]: I0904 00:53:24.008378 2834 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 4 00:53:24.008836 kubelet[2834]: I0904 00:53:24.008787 2834 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 4 00:53:24.013368 kubelet[2834]: E0904 00:53:24.013307 2834 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4372.1.0-n-fd36784ab7\" not found" Sep 4 00:53:24.080305 systemd[1]: Created slice kubepods-burstable-pod53a2760a00e81fda420eb5df21abd310.slice - libcontainer container kubepods-burstable-pod53a2760a00e81fda420eb5df21abd310.slice. Sep 4 00:53:24.103948 systemd[1]: Created slice kubepods-burstable-pod3f13667d297a12616f865803f5e850b5.slice - libcontainer container kubepods-burstable-pod3f13667d297a12616f865803f5e850b5.slice. Sep 4 00:53:24.111688 kubelet[2834]: I0904 00:53:24.111618 2834 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.112371 kubelet[2834]: E0904 00:53:24.112297 2834 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://147.28.180.77:6443/api/v1/nodes\": dial tcp 147.28.180.77:6443: connect: connection refused" node="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.113883 systemd[1]: Created slice kubepods-burstable-pod304df835b1d79104583c840a59817ea3.slice - libcontainer container kubepods-burstable-pod304df835b1d79104583c840a59817ea3.slice. 
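Every "connection refused" against https://147.28.180.77:6443 in this stretch is expected: the kubelet is about to run the control plane itself as static pods, so nothing can answer on 6443 until the manifests under /etc/kubernetes/manifests have been started. The kubepods-burstable slices created above correspond to exactly those pods (kube-scheduler, kube-apiserver, kube-controller-manager). A few hedged checks for this phase, reusing the CRI endpoint exported in the earlier sketch:

    ls /etc/kubernetes/manifests/          # manifests for the pods whose slices were just created
    crictl pods                            # sandboxes appear once RunPodSandbox (below) completes
    crictl ps -a                           # containers created inside those sandboxes
    curl -ks https://147.28.180.77:6443/healthz; echo   # starts answering "ok" once kube-apiserver is up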
Sep 4 00:53:24.149995 kubelet[2834]: E0904 00:53:24.149864 2834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.28.180.77:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-n-fd36784ab7?timeout=10s\": dial tcp 147.28.180.77:6443: connect: connection refused" interval="400ms" Sep 4 00:53:24.250217 kubelet[2834]: I0904 00:53:24.249987 2834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/304df835b1d79104583c840a59817ea3-kubeconfig\") pod \"kube-controller-manager-ci-4372.1.0-n-fd36784ab7\" (UID: \"304df835b1d79104583c840a59817ea3\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.250217 kubelet[2834]: I0904 00:53:24.250107 2834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3f13667d297a12616f865803f5e850b5-k8s-certs\") pod \"kube-apiserver-ci-4372.1.0-n-fd36784ab7\" (UID: \"3f13667d297a12616f865803f5e850b5\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.250582 kubelet[2834]: I0904 00:53:24.250225 2834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/304df835b1d79104583c840a59817ea3-flexvolume-dir\") pod \"kube-controller-manager-ci-4372.1.0-n-fd36784ab7\" (UID: \"304df835b1d79104583c840a59817ea3\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.250582 kubelet[2834]: I0904 00:53:24.250279 2834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/304df835b1d79104583c840a59817ea3-k8s-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-fd36784ab7\" (UID: \"304df835b1d79104583c840a59817ea3\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.250582 kubelet[2834]: I0904 00:53:24.250354 2834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/304df835b1d79104583c840a59817ea3-ca-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-fd36784ab7\" (UID: \"304df835b1d79104583c840a59817ea3\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.250582 kubelet[2834]: I0904 00:53:24.250418 2834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/304df835b1d79104583c840a59817ea3-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372.1.0-n-fd36784ab7\" (UID: \"304df835b1d79104583c840a59817ea3\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.250582 kubelet[2834]: I0904 00:53:24.250472 2834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/53a2760a00e81fda420eb5df21abd310-kubeconfig\") pod \"kube-scheduler-ci-4372.1.0-n-fd36784ab7\" (UID: \"53a2760a00e81fda420eb5df21abd310\") " pod="kube-system/kube-scheduler-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.251056 kubelet[2834]: I0904 00:53:24.250566 2834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/3f13667d297a12616f865803f5e850b5-ca-certs\") pod \"kube-apiserver-ci-4372.1.0-n-fd36784ab7\" (UID: \"3f13667d297a12616f865803f5e850b5\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.251056 kubelet[2834]: I0904 00:53:24.250649 2834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3f13667d297a12616f865803f5e850b5-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372.1.0-n-fd36784ab7\" (UID: \"3f13667d297a12616f865803f5e850b5\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.317326 kubelet[2834]: I0904 00:53:24.317258 2834 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.318044 kubelet[2834]: E0904 00:53:24.317978 2834 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://147.28.180.77:6443/api/v1/nodes\": dial tcp 147.28.180.77:6443: connect: connection refused" node="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:24.399747 containerd[1910]: time="2025-09-04T00:53:24.399649254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372.1.0-n-fd36784ab7,Uid:53a2760a00e81fda420eb5df21abd310,Namespace:kube-system,Attempt:0,}" Sep 4 00:53:24.410289 containerd[1910]: time="2025-09-04T00:53:24.410273649Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372.1.0-n-fd36784ab7,Uid:3f13667d297a12616f865803f5e850b5,Namespace:kube-system,Attempt:0,}" Sep 4 00:53:24.416258 containerd[1910]: time="2025-09-04T00:53:24.416233243Z" level=info msg="connecting to shim 0855dc70bec7cac3b7aa3cb83db74377db5c7611db0411b78cfceaaa063aa6a7" address="unix:///run/containerd/s/f26aa14725268a5801d9adb0a277a0231d57a3a483e078a0145d6e89a70914f0" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:53:24.418390 containerd[1910]: time="2025-09-04T00:53:24.418369827Z" level=info msg="connecting to shim 4b6e7b0cb45eb338a95475a1f4cb5770069c1c8b41bc2e5f75d65c80281e3743" address="unix:///run/containerd/s/a7521b678c0f4e7f4efb98b8602e983f1a6a4ec46c7a12f2f16169bbb56c70c0" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:53:24.419927 containerd[1910]: time="2025-09-04T00:53:24.419899745Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372.1.0-n-fd36784ab7,Uid:304df835b1d79104583c840a59817ea3,Namespace:kube-system,Attempt:0,}" Sep 4 00:53:24.426918 containerd[1910]: time="2025-09-04T00:53:24.426887795Z" level=info msg="connecting to shim 765f08fd4ef39f9f559ad7ac57022a983bc3d502e06fe8ace192733335bfa5e8" address="unix:///run/containerd/s/4a6d49f5146a1b48ed416ddad405df1a6d2ff936569f87f6ae758c8518a10a28" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:53:24.445505 systemd[1]: Started cri-containerd-0855dc70bec7cac3b7aa3cb83db74377db5c7611db0411b78cfceaaa063aa6a7.scope - libcontainer container 0855dc70bec7cac3b7aa3cb83db74377db5c7611db0411b78cfceaaa063aa6a7. Sep 4 00:53:24.446434 systemd[1]: Started cri-containerd-4b6e7b0cb45eb338a95475a1f4cb5770069c1c8b41bc2e5f75d65c80281e3743.scope - libcontainer container 4b6e7b0cb45eb338a95475a1f4cb5770069c1c8b41bc2e5f75d65c80281e3743. Sep 4 00:53:24.448654 systemd[1]: Started cri-containerd-765f08fd4ef39f9f559ad7ac57022a983bc3d502e06fe8ace192733335bfa5e8.scope - libcontainer container 765f08fd4ef39f9f559ad7ac57022a983bc3d502e06fe8ace192733335bfa5e8. 
Sep 4 00:53:24.472717 containerd[1910]: time="2025-09-04T00:53:24.472687702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372.1.0-n-fd36784ab7,Uid:53a2760a00e81fda420eb5df21abd310,Namespace:kube-system,Attempt:0,} returns sandbox id \"0855dc70bec7cac3b7aa3cb83db74377db5c7611db0411b78cfceaaa063aa6a7\"" Sep 4 00:53:24.473785 containerd[1910]: time="2025-09-04T00:53:24.473769213Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372.1.0-n-fd36784ab7,Uid:3f13667d297a12616f865803f5e850b5,Namespace:kube-system,Attempt:0,} returns sandbox id \"4b6e7b0cb45eb338a95475a1f4cb5770069c1c8b41bc2e5f75d65c80281e3743\"" Sep 4 00:53:24.474361 containerd[1910]: time="2025-09-04T00:53:24.474346666Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372.1.0-n-fd36784ab7,Uid:304df835b1d79104583c840a59817ea3,Namespace:kube-system,Attempt:0,} returns sandbox id \"765f08fd4ef39f9f559ad7ac57022a983bc3d502e06fe8ace192733335bfa5e8\"" Sep 4 00:53:24.474414 containerd[1910]: time="2025-09-04T00:53:24.474399319Z" level=info msg="CreateContainer within sandbox \"0855dc70bec7cac3b7aa3cb83db74377db5c7611db0411b78cfceaaa063aa6a7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 4 00:53:24.474521 containerd[1910]: time="2025-09-04T00:53:24.474511265Z" level=info msg="CreateContainer within sandbox \"4b6e7b0cb45eb338a95475a1f4cb5770069c1c8b41bc2e5f75d65c80281e3743\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 4 00:53:24.475092 containerd[1910]: time="2025-09-04T00:53:24.475080086Z" level=info msg="CreateContainer within sandbox \"765f08fd4ef39f9f559ad7ac57022a983bc3d502e06fe8ace192733335bfa5e8\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 4 00:53:24.478231 containerd[1910]: time="2025-09-04T00:53:24.478190055Z" level=info msg="Container acbff79358a2fa2d55066b22804c9f9d7434e34c9482a28840443aa13dcd3002: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:53:24.479234 containerd[1910]: time="2025-09-04T00:53:24.479195820Z" level=info msg="Container fde7f4c58a43de1fb6ad21e348b8b99a387e8cf2e3239f05652dbde24d8bebee: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:53:24.479684 containerd[1910]: time="2025-09-04T00:53:24.479647485Z" level=info msg="Container 7f93fb03a407a049cd92e9fae9cdb1eb9ee291c1b641a7b01f99d35e7fe083f7: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:53:24.481035 containerd[1910]: time="2025-09-04T00:53:24.481024783Z" level=info msg="CreateContainer within sandbox \"0855dc70bec7cac3b7aa3cb83db74377db5c7611db0411b78cfceaaa063aa6a7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"acbff79358a2fa2d55066b22804c9f9d7434e34c9482a28840443aa13dcd3002\"" Sep 4 00:53:24.481338 containerd[1910]: time="2025-09-04T00:53:24.481301880Z" level=info msg="StartContainer for \"acbff79358a2fa2d55066b22804c9f9d7434e34c9482a28840443aa13dcd3002\"" Sep 4 00:53:24.481874 containerd[1910]: time="2025-09-04T00:53:24.481837492Z" level=info msg="connecting to shim acbff79358a2fa2d55066b22804c9f9d7434e34c9482a28840443aa13dcd3002" address="unix:///run/containerd/s/f26aa14725268a5801d9adb0a277a0231d57a3a483e078a0145d6e89a70914f0" protocol=ttrpc version=3 Sep 4 00:53:24.482122 containerd[1910]: time="2025-09-04T00:53:24.482107845Z" level=info msg="CreateContainer within sandbox \"765f08fd4ef39f9f559ad7ac57022a983bc3d502e06fe8ace192733335bfa5e8\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container 
id \"7f93fb03a407a049cd92e9fae9cdb1eb9ee291c1b641a7b01f99d35e7fe083f7\"" Sep 4 00:53:24.482258 containerd[1910]: time="2025-09-04T00:53:24.482248910Z" level=info msg="StartContainer for \"7f93fb03a407a049cd92e9fae9cdb1eb9ee291c1b641a7b01f99d35e7fe083f7\"" Sep 4 00:53:24.482464 containerd[1910]: time="2025-09-04T00:53:24.482454530Z" level=info msg="CreateContainer within sandbox \"4b6e7b0cb45eb338a95475a1f4cb5770069c1c8b41bc2e5f75d65c80281e3743\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"fde7f4c58a43de1fb6ad21e348b8b99a387e8cf2e3239f05652dbde24d8bebee\"" Sep 4 00:53:24.482586 containerd[1910]: time="2025-09-04T00:53:24.482576743Z" level=info msg="StartContainer for \"fde7f4c58a43de1fb6ad21e348b8b99a387e8cf2e3239f05652dbde24d8bebee\"" Sep 4 00:53:24.482754 containerd[1910]: time="2025-09-04T00:53:24.482743477Z" level=info msg="connecting to shim 7f93fb03a407a049cd92e9fae9cdb1eb9ee291c1b641a7b01f99d35e7fe083f7" address="unix:///run/containerd/s/4a6d49f5146a1b48ed416ddad405df1a6d2ff936569f87f6ae758c8518a10a28" protocol=ttrpc version=3 Sep 4 00:53:24.483080 containerd[1910]: time="2025-09-04T00:53:24.483067342Z" level=info msg="connecting to shim fde7f4c58a43de1fb6ad21e348b8b99a387e8cf2e3239f05652dbde24d8bebee" address="unix:///run/containerd/s/a7521b678c0f4e7f4efb98b8602e983f1a6a4ec46c7a12f2f16169bbb56c70c0" protocol=ttrpc version=3 Sep 4 00:53:24.506434 systemd[1]: Started cri-containerd-7f93fb03a407a049cd92e9fae9cdb1eb9ee291c1b641a7b01f99d35e7fe083f7.scope - libcontainer container 7f93fb03a407a049cd92e9fae9cdb1eb9ee291c1b641a7b01f99d35e7fe083f7. Sep 4 00:53:24.507039 systemd[1]: Started cri-containerd-acbff79358a2fa2d55066b22804c9f9d7434e34c9482a28840443aa13dcd3002.scope - libcontainer container acbff79358a2fa2d55066b22804c9f9d7434e34c9482a28840443aa13dcd3002. Sep 4 00:53:24.507716 systemd[1]: Started cri-containerd-fde7f4c58a43de1fb6ad21e348b8b99a387e8cf2e3239f05652dbde24d8bebee.scope - libcontainer container fde7f4c58a43de1fb6ad21e348b8b99a387e8cf2e3239f05652dbde24d8bebee. 
Sep 4 00:53:24.539980 containerd[1910]: time="2025-09-04T00:53:24.539950984Z" level=info msg="StartContainer for \"7f93fb03a407a049cd92e9fae9cdb1eb9ee291c1b641a7b01f99d35e7fe083f7\" returns successfully" Sep 4 00:53:24.540067 containerd[1910]: time="2025-09-04T00:53:24.539984465Z" level=info msg="StartContainer for \"acbff79358a2fa2d55066b22804c9f9d7434e34c9482a28840443aa13dcd3002\" returns successfully" Sep 4 00:53:24.541530 containerd[1910]: time="2025-09-04T00:53:24.541514741Z" level=info msg="StartContainer for \"fde7f4c58a43de1fb6ad21e348b8b99a387e8cf2e3239f05652dbde24d8bebee\" returns successfully" Sep 4 00:53:24.550257 kubelet[2834]: E0904 00:53:24.550230 2834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.28.180.77:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-n-fd36784ab7?timeout=10s\": dial tcp 147.28.180.77:6443: connect: connection refused" interval="800ms" Sep 4 00:53:24.720091 kubelet[2834]: I0904 00:53:24.720072 2834 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:25.268276 kubelet[2834]: I0904 00:53:25.268176 2834 kubelet_node_status.go:75] "Successfully registered node" node="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:25.268276 kubelet[2834]: E0904 00:53:25.268224 2834 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-4372.1.0-n-fd36784ab7\": node \"ci-4372.1.0-n-fd36784ab7\" not found" Sep 4 00:53:25.942730 kubelet[2834]: I0904 00:53:25.942632 2834 apiserver.go:52] "Watching apiserver" Sep 4 00:53:25.949419 kubelet[2834]: I0904 00:53:25.949327 2834 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 4 00:53:25.973174 kubelet[2834]: E0904 00:53:25.973068 2834 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4372.1.0-n-fd36784ab7\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:27.638026 systemd[1]: Reload requested from client PID 3147 ('systemctl') (unit session-11.scope)... Sep 4 00:53:27.638033 systemd[1]: Reloading... Sep 4 00:53:27.678169 zram_generator::config[3194]: No configuration found. Sep 4 00:53:27.737884 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 4 00:53:27.835675 systemd[1]: Reloading finished in 197 ms. Sep 4 00:53:27.856444 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 00:53:27.874438 systemd[1]: kubelet.service: Deactivated successfully. Sep 4 00:53:27.875026 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 00:53:27.875172 systemd[1]: kubelet.service: Consumed 784ms CPU time, 137.3M memory peak. Sep 4 00:53:27.879596 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 00:53:28.184000 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 00:53:28.202512 (kubelet)[3260]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 4 00:53:28.250087 kubelet[3260]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 00:53:28.250087 kubelet[3260]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 4 00:53:28.250087 kubelet[3260]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 00:53:28.250490 kubelet[3260]: I0904 00:53:28.250156 3260 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 4 00:53:28.257754 kubelet[3260]: I0904 00:53:28.257702 3260 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 4 00:53:28.257754 kubelet[3260]: I0904 00:53:28.257725 3260 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 4 00:53:28.258019 kubelet[3260]: I0904 00:53:28.257985 3260 server.go:934] "Client rotation is on, will bootstrap in background" Sep 4 00:53:28.259517 kubelet[3260]: I0904 00:53:28.259471 3260 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 4 00:53:28.263081 kubelet[3260]: I0904 00:53:28.263030 3260 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 4 00:53:28.266626 kubelet[3260]: I0904 00:53:28.266580 3260 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 4 00:53:28.279171 kubelet[3260]: I0904 00:53:28.279149 3260 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 4 00:53:28.279303 kubelet[3260]: I0904 00:53:28.279257 3260 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 4 00:53:28.279399 kubelet[3260]: I0904 00:53:28.279370 3260 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 4 00:53:28.279574 kubelet[3260]: I0904 00:53:28.279395 3260 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372.1.0-n-fd36784ab7","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 4 00:53:28.279666 kubelet[3260]: I0904 00:53:28.279587 3260 topology_manager.go:138] "Creating topology manager with none policy" Sep 4 00:53:28.279666 kubelet[3260]: I0904 00:53:28.279600 3260 container_manager_linux.go:300] "Creating device plugin manager" Sep 4 00:53:28.279666 kubelet[3260]: I0904 00:53:28.279626 3260 state_mem.go:36] "Initialized new in-memory state store" Sep 4 00:53:28.279767 kubelet[3260]: I0904 00:53:28.279731 3260 kubelet.go:408] "Attempting to sync node with API server" Sep 4 00:53:28.279767 kubelet[3260]: I0904 00:53:28.279748 3260 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 4 00:53:28.279835 kubelet[3260]: I0904 00:53:28.279777 3260 kubelet.go:314] "Adding apiserver pod source" Sep 4 00:53:28.279835 kubelet[3260]: I0904 00:53:28.279788 3260 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 4 00:53:28.280305 kubelet[3260]: I0904 00:53:28.280277 3260 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Sep 4 00:53:28.280856 kubelet[3260]: I0904 00:53:28.280842 3260 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 4 00:53:28.281302 kubelet[3260]: I0904 00:53:28.281284 3260 server.go:1274] "Started kubelet" Sep 4 00:53:28.281429 kubelet[3260]: I0904 00:53:28.281389 3260 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 4 00:53:28.281506 kubelet[3260]: I0904 
00:53:28.281392 3260 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 4 00:53:28.281719 kubelet[3260]: I0904 00:53:28.281702 3260 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 4 00:53:28.282914 kubelet[3260]: I0904 00:53:28.282897 3260 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 4 00:53:28.282973 kubelet[3260]: I0904 00:53:28.282912 3260 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 4 00:53:28.283033 kubelet[3260]: I0904 00:53:28.282979 3260 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 4 00:53:28.283102 kubelet[3260]: I0904 00:53:28.283056 3260 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 4 00:53:28.283165 kubelet[3260]: E0904 00:53:28.283095 3260 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-fd36784ab7\" not found" Sep 4 00:53:28.283261 kubelet[3260]: I0904 00:53:28.283243 3260 reconciler.go:26] "Reconciler: start to sync state" Sep 4 00:53:28.283404 kubelet[3260]: E0904 00:53:28.283364 3260 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 4 00:53:28.283511 kubelet[3260]: I0904 00:53:28.283495 3260 server.go:449] "Adding debug handlers to kubelet server" Sep 4 00:53:28.286634 kubelet[3260]: I0904 00:53:28.286607 3260 factory.go:221] Registration of the systemd container factory successfully Sep 4 00:53:28.286814 kubelet[3260]: I0904 00:53:28.286766 3260 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 4 00:53:28.288651 kubelet[3260]: I0904 00:53:28.288625 3260 factory.go:221] Registration of the containerd container factory successfully Sep 4 00:53:28.294692 kubelet[3260]: I0904 00:53:28.294655 3260 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 4 00:53:28.295846 kubelet[3260]: I0904 00:53:28.295831 3260 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 4 00:53:28.295919 kubelet[3260]: I0904 00:53:28.295851 3260 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 4 00:53:28.295919 kubelet[3260]: I0904 00:53:28.295872 3260 kubelet.go:2321] "Starting kubelet main sync loop" Sep 4 00:53:28.296010 kubelet[3260]: E0904 00:53:28.295933 3260 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 4 00:53:28.317820 kubelet[3260]: I0904 00:53:28.317767 3260 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 4 00:53:28.317820 kubelet[3260]: I0904 00:53:28.317783 3260 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 4 00:53:28.317820 kubelet[3260]: I0904 00:53:28.317803 3260 state_mem.go:36] "Initialized new in-memory state store" Sep 4 00:53:28.318022 kubelet[3260]: I0904 00:53:28.317965 3260 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 4 00:53:28.318022 kubelet[3260]: I0904 00:53:28.317984 3260 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 4 00:53:28.318022 kubelet[3260]: I0904 00:53:28.318009 3260 policy_none.go:49] "None policy: Start" Sep 4 00:53:28.318623 kubelet[3260]: I0904 00:53:28.318575 3260 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 4 00:53:28.318623 kubelet[3260]: I0904 00:53:28.318601 3260 state_mem.go:35] "Initializing new in-memory state store" Sep 4 00:53:28.318769 kubelet[3260]: I0904 00:53:28.318747 3260 state_mem.go:75] "Updated machine memory state" Sep 4 00:53:28.323368 kubelet[3260]: I0904 00:53:28.323331 3260 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 4 00:53:28.323524 kubelet[3260]: I0904 00:53:28.323507 3260 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 4 00:53:28.323600 kubelet[3260]: I0904 00:53:28.323523 3260 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 4 00:53:28.323726 kubelet[3260]: I0904 00:53:28.323698 3260 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 4 00:53:28.403953 kubelet[3260]: W0904 00:53:28.403878 3260 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 4 00:53:28.403953 kubelet[3260]: W0904 00:53:28.403970 3260 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 4 00:53:28.404729 kubelet[3260]: W0904 00:53:28.404676 3260 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 4 00:53:28.430244 kubelet[3260]: I0904 00:53:28.430151 3260 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:28.456993 kubelet[3260]: I0904 00:53:28.456845 3260 kubelet_node_status.go:111] "Node was previously registered" node="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:28.456993 kubelet[3260]: I0904 00:53:28.456979 3260 kubelet_node_status.go:75] "Successfully registered node" node="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:28.584659 kubelet[3260]: I0904 00:53:28.584516 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/3f13667d297a12616f865803f5e850b5-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372.1.0-n-fd36784ab7\" (UID: \"3f13667d297a12616f865803f5e850b5\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:28.585020 kubelet[3260]: I0904 00:53:28.584652 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/304df835b1d79104583c840a59817ea3-flexvolume-dir\") pod \"kube-controller-manager-ci-4372.1.0-n-fd36784ab7\" (UID: \"304df835b1d79104583c840a59817ea3\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:28.585020 kubelet[3260]: I0904 00:53:28.584756 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/304df835b1d79104583c840a59817ea3-k8s-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-fd36784ab7\" (UID: \"304df835b1d79104583c840a59817ea3\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:28.585020 kubelet[3260]: I0904 00:53:28.584805 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/304df835b1d79104583c840a59817ea3-kubeconfig\") pod \"kube-controller-manager-ci-4372.1.0-n-fd36784ab7\" (UID: \"304df835b1d79104583c840a59817ea3\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:28.585020 kubelet[3260]: I0904 00:53:28.584850 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/53a2760a00e81fda420eb5df21abd310-kubeconfig\") pod \"kube-scheduler-ci-4372.1.0-n-fd36784ab7\" (UID: \"53a2760a00e81fda420eb5df21abd310\") " pod="kube-system/kube-scheduler-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:28.585020 kubelet[3260]: I0904 00:53:28.584893 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3f13667d297a12616f865803f5e850b5-ca-certs\") pod \"kube-apiserver-ci-4372.1.0-n-fd36784ab7\" (UID: \"3f13667d297a12616f865803f5e850b5\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:28.585536 kubelet[3260]: I0904 00:53:28.584934 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3f13667d297a12616f865803f5e850b5-k8s-certs\") pod \"kube-apiserver-ci-4372.1.0-n-fd36784ab7\" (UID: \"3f13667d297a12616f865803f5e850b5\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:28.585536 kubelet[3260]: I0904 00:53:28.585030 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/304df835b1d79104583c840a59817ea3-ca-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-fd36784ab7\" (UID: \"304df835b1d79104583c840a59817ea3\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:28.585536 kubelet[3260]: I0904 00:53:28.585159 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/304df835b1d79104583c840a59817ea3-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372.1.0-n-fd36784ab7\" (UID: 
\"304df835b1d79104583c840a59817ea3\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:29.280463 kubelet[3260]: I0904 00:53:29.280339 3260 apiserver.go:52] "Watching apiserver" Sep 4 00:53:29.316583 kubelet[3260]: W0904 00:53:29.316514 3260 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 4 00:53:29.316814 kubelet[3260]: E0904 00:53:29.316685 3260 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4372.1.0-n-fd36784ab7\" already exists" pod="kube-system/kube-apiserver-ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:29.355177 kubelet[3260]: I0904 00:53:29.355098 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4372.1.0-n-fd36784ab7" podStartSLOduration=1.355077208 podStartE2EDuration="1.355077208s" podCreationTimestamp="2025-09-04 00:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 00:53:29.347202656 +0000 UTC m=+1.141312819" watchObservedRunningTime="2025-09-04 00:53:29.355077208 +0000 UTC m=+1.149187365" Sep 4 00:53:29.363031 kubelet[3260]: I0904 00:53:29.362976 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4372.1.0-n-fd36784ab7" podStartSLOduration=1.362951903 podStartE2EDuration="1.362951903s" podCreationTimestamp="2025-09-04 00:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 00:53:29.3552112 +0000 UTC m=+1.149321367" watchObservedRunningTime="2025-09-04 00:53:29.362951903 +0000 UTC m=+1.157062067" Sep 4 00:53:29.372296 kubelet[3260]: I0904 00:53:29.372205 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-fd36784ab7" podStartSLOduration=1.3721871829999999 podStartE2EDuration="1.372187183s" podCreationTimestamp="2025-09-04 00:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 00:53:29.36317139 +0000 UTC m=+1.157281555" watchObservedRunningTime="2025-09-04 00:53:29.372187183 +0000 UTC m=+1.166297339" Sep 4 00:53:29.384434 kubelet[3260]: I0904 00:53:29.384388 3260 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 4 00:53:32.562169 kubelet[3260]: I0904 00:53:32.562061 3260 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 4 00:53:32.562994 containerd[1910]: time="2025-09-04T00:53:32.562722467Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 4 00:53:32.563710 kubelet[3260]: I0904 00:53:32.563177 3260 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 4 00:53:33.613143 systemd[1]: Created slice kubepods-besteffort-pod31a2940a_d91f_4b88_8d65_4e39558c833e.slice - libcontainer container kubepods-besteffort-pod31a2940a_d91f_4b88_8d65_4e39558c833e.slice. 
Sep 4 00:53:33.618882 kubelet[3260]: I0904 00:53:33.618792 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-997ks\" (UniqueName: \"kubernetes.io/projected/31a2940a-d91f-4b88-8d65-4e39558c833e-kube-api-access-997ks\") pod \"kube-proxy-pm9br\" (UID: \"31a2940a-d91f-4b88-8d65-4e39558c833e\") " pod="kube-system/kube-proxy-pm9br" Sep 4 00:53:33.619996 kubelet[3260]: I0904 00:53:33.618905 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/31a2940a-d91f-4b88-8d65-4e39558c833e-xtables-lock\") pod \"kube-proxy-pm9br\" (UID: \"31a2940a-d91f-4b88-8d65-4e39558c833e\") " pod="kube-system/kube-proxy-pm9br" Sep 4 00:53:33.619996 kubelet[3260]: I0904 00:53:33.618960 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/31a2940a-d91f-4b88-8d65-4e39558c833e-lib-modules\") pod \"kube-proxy-pm9br\" (UID: \"31a2940a-d91f-4b88-8d65-4e39558c833e\") " pod="kube-system/kube-proxy-pm9br" Sep 4 00:53:33.619996 kubelet[3260]: I0904 00:53:33.619012 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/31a2940a-d91f-4b88-8d65-4e39558c833e-kube-proxy\") pod \"kube-proxy-pm9br\" (UID: \"31a2940a-d91f-4b88-8d65-4e39558c833e\") " pod="kube-system/kube-proxy-pm9br" Sep 4 00:53:33.693674 systemd[1]: Created slice kubepods-besteffort-pod59d37f89_bbcf_4238_93f6_c58cc8cef799.slice - libcontainer container kubepods-besteffort-pod59d37f89_bbcf_4238_93f6_c58cc8cef799.slice. Sep 4 00:53:33.719836 kubelet[3260]: I0904 00:53:33.719775 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsmsc\" (UniqueName: \"kubernetes.io/projected/59d37f89-bbcf-4238-93f6-c58cc8cef799-kube-api-access-dsmsc\") pod \"tigera-operator-58fc44c59b-tj4pq\" (UID: \"59d37f89-bbcf-4238-93f6-c58cc8cef799\") " pod="tigera-operator/tigera-operator-58fc44c59b-tj4pq" Sep 4 00:53:33.719994 kubelet[3260]: I0904 00:53:33.719923 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/59d37f89-bbcf-4238-93f6-c58cc8cef799-var-lib-calico\") pod \"tigera-operator-58fc44c59b-tj4pq\" (UID: \"59d37f89-bbcf-4238-93f6-c58cc8cef799\") " pod="tigera-operator/tigera-operator-58fc44c59b-tj4pq" Sep 4 00:53:33.935245 containerd[1910]: time="2025-09-04T00:53:33.935108866Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pm9br,Uid:31a2940a-d91f-4b88-8d65-4e39558c833e,Namespace:kube-system,Attempt:0,}" Sep 4 00:53:33.950634 containerd[1910]: time="2025-09-04T00:53:33.950587672Z" level=info msg="connecting to shim eaab69eef48a8c1283dc5df5c30dbd8f3aedc61c10d392b04d591d99eaabb96e" address="unix:///run/containerd/s/174c052a7ac43c6dd789bd2c8366f6a37e1e3c1dccbacfe48a0131d3424075fb" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:53:33.967269 systemd[1]: Started cri-containerd-eaab69eef48a8c1283dc5df5c30dbd8f3aedc61c10d392b04d591d99eaabb96e.scope - libcontainer container eaab69eef48a8c1283dc5df5c30dbd8f3aedc61c10d392b04d591d99eaabb96e. 
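The "Created slice" entries show how pod cgroups are named under the QoS-class slices: the pod UID has its dashes replaced with underscores and is wrapped as kubepods-<qos>-pod<uid>.slice. A sketch of that visible pattern (string formatting only, not the kubelet's own cgroup code):

```go
// Rebuild the slice names from the "Created slice" entries: the systemd cgroup
// driver replaces "-" in the pod UID with "_" and nests the pod under its
// QoS-class slice. This only formats the pattern visible in the log.
package main

import (
	"fmt"
	"strings"
)

func podSlice(qosClass, podUID string) string {
	escaped := strings.ReplaceAll(podUID, "-", "_")
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, escaped)
}

func main() {
	fmt.Println(podSlice("besteffort", "31a2940a-d91f-4b88-8d65-4e39558c833e")) // kube-proxy-pm9br
	fmt.Println(podSlice("besteffort", "59d37f89-bbcf-4238-93f6-c58cc8cef799")) // tigera-operator-58fc44c59b-tj4pq
}
```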
Sep 4 00:53:33.996335 containerd[1910]: time="2025-09-04T00:53:33.996301039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-tj4pq,Uid:59d37f89-bbcf-4238-93f6-c58cc8cef799,Namespace:tigera-operator,Attempt:0,}" Sep 4 00:53:34.078488 containerd[1910]: time="2025-09-04T00:53:34.078367935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pm9br,Uid:31a2940a-d91f-4b88-8d65-4e39558c833e,Namespace:kube-system,Attempt:0,} returns sandbox id \"eaab69eef48a8c1283dc5df5c30dbd8f3aedc61c10d392b04d591d99eaabb96e\"" Sep 4 00:53:34.083985 containerd[1910]: time="2025-09-04T00:53:34.083882336Z" level=info msg="CreateContainer within sandbox \"eaab69eef48a8c1283dc5df5c30dbd8f3aedc61c10d392b04d591d99eaabb96e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 4 00:53:34.437402 containerd[1910]: time="2025-09-04T00:53:34.437355400Z" level=info msg="Container f93b5ac21c42ac796808befb499e4da395bcd55765e672c12a31128b2a2eddd7: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:53:34.440978 containerd[1910]: time="2025-09-04T00:53:34.440964974Z" level=info msg="CreateContainer within sandbox \"eaab69eef48a8c1283dc5df5c30dbd8f3aedc61c10d392b04d591d99eaabb96e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f93b5ac21c42ac796808befb499e4da395bcd55765e672c12a31128b2a2eddd7\"" Sep 4 00:53:34.441330 containerd[1910]: time="2025-09-04T00:53:34.441316707Z" level=info msg="StartContainer for \"f93b5ac21c42ac796808befb499e4da395bcd55765e672c12a31128b2a2eddd7\"" Sep 4 00:53:34.442034 containerd[1910]: time="2025-09-04T00:53:34.441995412Z" level=info msg="connecting to shim 77705c20a9d0f1e11f8f58126bd7b1b72ccd517cad42947808caccadefb21ea9" address="unix:///run/containerd/s/6a3298a10ed9da21c7ad2d8b2e32f78081fd3585207e602785a29d434be96bf8" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:53:34.442196 containerd[1910]: time="2025-09-04T00:53:34.442160459Z" level=info msg="connecting to shim f93b5ac21c42ac796808befb499e4da395bcd55765e672c12a31128b2a2eddd7" address="unix:///run/containerd/s/174c052a7ac43c6dd789bd2c8366f6a37e1e3c1dccbacfe48a0131d3424075fb" protocol=ttrpc version=3 Sep 4 00:53:34.468401 systemd[1]: Started cri-containerd-f93b5ac21c42ac796808befb499e4da395bcd55765e672c12a31128b2a2eddd7.scope - libcontainer container f93b5ac21c42ac796808befb499e4da395bcd55765e672c12a31128b2a2eddd7. Sep 4 00:53:34.470694 systemd[1]: Started cri-containerd-77705c20a9d0f1e11f8f58126bd7b1b72ccd517cad42947808caccadefb21ea9.scope - libcontainer container 77705c20a9d0f1e11f8f58126bd7b1b72ccd517cad42947808caccadefb21ea9. 
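At this point the kube-proxy sandbox and container and the tigera-operator sandbox exist in containerd. A sketch that lists them by shelling out to crictl, which is assumed here to be installed and configured for the containerd socket (nothing in the log above runs it):

```go
// List the sandboxes and containers created above by shelling out to crictl.
// Assumption: crictl is installed and pointed at the containerd socket.
package main

import (
	"fmt"
	"os/exec"
)

func run(args ...string) {
	out, err := exec.Command("crictl", args...).CombinedOutput()
	if err != nil {
		fmt.Println("crictl", args, "failed:", err)
	}
	fmt.Print(string(out))
}

func main() {
	run("pods")     // pod sandboxes, e.g. kube-proxy-pm9br and tigera-operator-58fc44c59b-tj4pq
	run("ps", "-a") // containers, e.g. kube-proxy
}
```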
Sep 4 00:53:34.489030 containerd[1910]: time="2025-09-04T00:53:34.489009270Z" level=info msg="StartContainer for \"f93b5ac21c42ac796808befb499e4da395bcd55765e672c12a31128b2a2eddd7\" returns successfully" Sep 4 00:53:34.496935 containerd[1910]: time="2025-09-04T00:53:34.496913174Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-tj4pq,Uid:59d37f89-bbcf-4238-93f6-c58cc8cef799,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"77705c20a9d0f1e11f8f58126bd7b1b72ccd517cad42947808caccadefb21ea9\"" Sep 4 00:53:34.497617 containerd[1910]: time="2025-09-04T00:53:34.497604589Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 4 00:53:35.330209 kubelet[3260]: I0904 00:53:35.330157 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-pm9br" podStartSLOduration=2.330135651 podStartE2EDuration="2.330135651s" podCreationTimestamp="2025-09-04 00:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 00:53:35.330047227 +0000 UTC m=+7.124157379" watchObservedRunningTime="2025-09-04 00:53:35.330135651 +0000 UTC m=+7.124245801" Sep 4 00:53:35.740625 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3000758192.mount: Deactivated successfully. Sep 4 00:53:36.083880 containerd[1910]: time="2025-09-04T00:53:36.083814490Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:36.084096 containerd[1910]: time="2025-09-04T00:53:36.083934771Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 4 00:53:36.084287 containerd[1910]: time="2025-09-04T00:53:36.084273936Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:36.085319 containerd[1910]: time="2025-09-04T00:53:36.085304965Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:36.085709 containerd[1910]: time="2025-09-04T00:53:36.085698301Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.588078023s" Sep 4 00:53:36.085730 containerd[1910]: time="2025-09-04T00:53:36.085712765Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 4 00:53:36.086557 containerd[1910]: time="2025-09-04T00:53:36.086545556Z" level=info msg="CreateContainer within sandbox \"77705c20a9d0f1e11f8f58126bd7b1b72ccd517cad42947808caccadefb21ea9\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 4 00:53:36.089103 containerd[1910]: time="2025-09-04T00:53:36.089091062Z" level=info msg="Container ca5eaa748ca638e8e3b99cd795418b474c45e910187cfe78ea4745c09500c5b3: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:53:36.091055 containerd[1910]: time="2025-09-04T00:53:36.091041661Z" level=info msg="CreateContainer within 
sandbox \"77705c20a9d0f1e11f8f58126bd7b1b72ccd517cad42947808caccadefb21ea9\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ca5eaa748ca638e8e3b99cd795418b474c45e910187cfe78ea4745c09500c5b3\"" Sep 4 00:53:36.091273 containerd[1910]: time="2025-09-04T00:53:36.091257823Z" level=info msg="StartContainer for \"ca5eaa748ca638e8e3b99cd795418b474c45e910187cfe78ea4745c09500c5b3\"" Sep 4 00:53:36.091809 containerd[1910]: time="2025-09-04T00:53:36.091770409Z" level=info msg="connecting to shim ca5eaa748ca638e8e3b99cd795418b474c45e910187cfe78ea4745c09500c5b3" address="unix:///run/containerd/s/6a3298a10ed9da21c7ad2d8b2e32f78081fd3585207e602785a29d434be96bf8" protocol=ttrpc version=3 Sep 4 00:53:36.113228 systemd[1]: Started cri-containerd-ca5eaa748ca638e8e3b99cd795418b474c45e910187cfe78ea4745c09500c5b3.scope - libcontainer container ca5eaa748ca638e8e3b99cd795418b474c45e910187cfe78ea4745c09500c5b3. Sep 4 00:53:36.128563 containerd[1910]: time="2025-09-04T00:53:36.128507551Z" level=info msg="StartContainer for \"ca5eaa748ca638e8e3b99cd795418b474c45e910187cfe78ea4745c09500c5b3\" returns successfully" Sep 4 00:53:36.360050 kubelet[3260]: I0904 00:53:36.359809 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-tj4pq" podStartSLOduration=1.771121941 podStartE2EDuration="3.35977057s" podCreationTimestamp="2025-09-04 00:53:33 +0000 UTC" firstStartedPulling="2025-09-04 00:53:34.49739292 +0000 UTC m=+6.291503057" lastFinishedPulling="2025-09-04 00:53:36.086041547 +0000 UTC m=+7.880151686" observedRunningTime="2025-09-04 00:53:36.359521978 +0000 UTC m=+8.153632179" watchObservedRunningTime="2025-09-04 00:53:36.35977057 +0000 UTC m=+8.153880759" Sep 4 00:53:40.453107 sudo[2210]: pam_unix(sudo:session): session closed for user root Sep 4 00:53:40.453835 sshd[2209]: Connection closed by 147.75.109.163 port 46154 Sep 4 00:53:40.453984 sshd-session[2207]: pam_unix(sshd:session): session closed for user core Sep 4 00:53:40.455686 systemd[1]: sshd@9-147.28.180.77:22-147.75.109.163:46154.service: Deactivated successfully. Sep 4 00:53:40.456842 systemd[1]: session-11.scope: Deactivated successfully. Sep 4 00:53:40.456975 systemd[1]: session-11.scope: Consumed 3.762s CPU time, 232.5M memory peak. Sep 4 00:53:40.458364 systemd-logind[1901]: Session 11 logged out. Waiting for processes to exit. Sep 4 00:53:40.458965 systemd-logind[1901]: Removed session 11. Sep 4 00:53:40.924663 update_engine[1903]: I20250904 00:53:40.924626 1903 update_attempter.cc:509] Updating boot flags... Sep 4 00:53:42.673447 systemd[1]: Created slice kubepods-besteffort-pod23fa3699_5293_4e33_8d9e_e2199f46e463.slice - libcontainer container kubepods-besteffort-pod23fa3699_5293_4e33_8d9e_e2199f46e463.slice. 
Sep 4 00:53:42.682080 kubelet[3260]: I0904 00:53:42.682017 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fa3699-5293-4e33-8d9e-e2199f46e463-tigera-ca-bundle\") pod \"calico-typha-8d49f79fb-5gph7\" (UID: \"23fa3699-5293-4e33-8d9e-e2199f46e463\") " pod="calico-system/calico-typha-8d49f79fb-5gph7" Sep 4 00:53:42.682080 kubelet[3260]: I0904 00:53:42.682058 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/23fa3699-5293-4e33-8d9e-e2199f46e463-typha-certs\") pod \"calico-typha-8d49f79fb-5gph7\" (UID: \"23fa3699-5293-4e33-8d9e-e2199f46e463\") " pod="calico-system/calico-typha-8d49f79fb-5gph7" Sep 4 00:53:42.682080 kubelet[3260]: I0904 00:53:42.682074 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwj4l\" (UniqueName: \"kubernetes.io/projected/23fa3699-5293-4e33-8d9e-e2199f46e463-kube-api-access-gwj4l\") pod \"calico-typha-8d49f79fb-5gph7\" (UID: \"23fa3699-5293-4e33-8d9e-e2199f46e463\") " pod="calico-system/calico-typha-8d49f79fb-5gph7" Sep 4 00:53:42.977272 containerd[1910]: time="2025-09-04T00:53:42.977013934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8d49f79fb-5gph7,Uid:23fa3699-5293-4e33-8d9e-e2199f46e463,Namespace:calico-system,Attempt:0,}" Sep 4 00:53:42.986142 containerd[1910]: time="2025-09-04T00:53:42.986105281Z" level=info msg="connecting to shim 81cc13c06073977d99b6a85fd839300264efb6c652b0a9dcdb6080c12e8fa697" address="unix:///run/containerd/s/f1077874028dbd1ad5364290d509cc32a19b85cc3c288416459b1d09dc86f896" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:53:42.993207 systemd[1]: Created slice kubepods-besteffort-podf78cd6b7_0635_46c9_9aa3_1aeb34ad8656.slice - libcontainer container kubepods-besteffort-podf78cd6b7_0635_46c9_9aa3_1aeb34ad8656.slice. Sep 4 00:53:43.006248 systemd[1]: Started cri-containerd-81cc13c06073977d99b6a85fd839300264efb6c652b0a9dcdb6080c12e8fa697.scope - libcontainer container 81cc13c06073977d99b6a85fd839300264efb6c652b0a9dcdb6080c12e8fa697. 
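Matching a sandbox to the scope unit systemd starts right after it is easiest via the shim ID and socket in the "connecting to shim" entries. A sketch that extracts both from the calico-typha line above with a regular expression (plain text processing, nothing queried live):

```go
// Extract the shim ID and socket path from a "connecting to shim" entry, here
// the calico-typha one above; useful for matching a sandbox to the
// cri-containerd-<id>.scope unit systemd starts right afterwards.
package main

import (
	"fmt"
	"regexp"
)

var shimRe = regexp.MustCompile(`connecting to shim ([0-9a-f]{64}).*?address="unix://([^"]+)"`)

func main() {
	line := `time="2025-09-04T00:53:42.986105281Z" level=info msg="connecting to shim 81cc13c06073977d99b6a85fd839300264efb6c652b0a9dcdb6080c12e8fa697" address="unix:///run/containerd/s/f1077874028dbd1ad5364290d509cc32a19b85cc3c288416459b1d09dc86f896" namespace=k8s.io protocol=ttrpc version=3`
	if m := shimRe.FindStringSubmatch(line); m != nil {
		fmt.Println("shim:  ", m[1])
		fmt.Println("socket:", m[2])
	}
}
```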
Sep 4 00:53:43.031232 containerd[1910]: time="2025-09-04T00:53:43.031210297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8d49f79fb-5gph7,Uid:23fa3699-5293-4e33-8d9e-e2199f46e463,Namespace:calico-system,Attempt:0,} returns sandbox id \"81cc13c06073977d99b6a85fd839300264efb6c652b0a9dcdb6080c12e8fa697\"" Sep 4 00:53:43.031875 containerd[1910]: time="2025-09-04T00:53:43.031863178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 4 00:53:43.083978 kubelet[3260]: I0904 00:53:43.083952 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/f78cd6b7-0635-46c9-9aa3-1aeb34ad8656-flexvol-driver-host\") pod \"calico-node-6qrmc\" (UID: \"f78cd6b7-0635-46c9-9aa3-1aeb34ad8656\") " pod="calico-system/calico-node-6qrmc" Sep 4 00:53:43.083978 kubelet[3260]: I0904 00:53:43.083980 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/f78cd6b7-0635-46c9-9aa3-1aeb34ad8656-policysync\") pod \"calico-node-6qrmc\" (UID: \"f78cd6b7-0635-46c9-9aa3-1aeb34ad8656\") " pod="calico-system/calico-node-6qrmc" Sep 4 00:53:43.084122 kubelet[3260]: I0904 00:53:43.083995 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f78cd6b7-0635-46c9-9aa3-1aeb34ad8656-lib-modules\") pod \"calico-node-6qrmc\" (UID: \"f78cd6b7-0635-46c9-9aa3-1aeb34ad8656\") " pod="calico-system/calico-node-6qrmc" Sep 4 00:53:43.084122 kubelet[3260]: I0904 00:53:43.084007 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rmks\" (UniqueName: \"kubernetes.io/projected/f78cd6b7-0635-46c9-9aa3-1aeb34ad8656-kube-api-access-2rmks\") pod \"calico-node-6qrmc\" (UID: \"f78cd6b7-0635-46c9-9aa3-1aeb34ad8656\") " pod="calico-system/calico-node-6qrmc" Sep 4 00:53:43.084122 kubelet[3260]: I0904 00:53:43.084017 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/f78cd6b7-0635-46c9-9aa3-1aeb34ad8656-cni-net-dir\") pod \"calico-node-6qrmc\" (UID: \"f78cd6b7-0635-46c9-9aa3-1aeb34ad8656\") " pod="calico-system/calico-node-6qrmc" Sep 4 00:53:43.084122 kubelet[3260]: I0904 00:53:43.084028 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f78cd6b7-0635-46c9-9aa3-1aeb34ad8656-tigera-ca-bundle\") pod \"calico-node-6qrmc\" (UID: \"f78cd6b7-0635-46c9-9aa3-1aeb34ad8656\") " pod="calico-system/calico-node-6qrmc" Sep 4 00:53:43.084122 kubelet[3260]: I0904 00:53:43.084038 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f78cd6b7-0635-46c9-9aa3-1aeb34ad8656-var-lib-calico\") pod \"calico-node-6qrmc\" (UID: \"f78cd6b7-0635-46c9-9aa3-1aeb34ad8656\") " pod="calico-system/calico-node-6qrmc" Sep 4 00:53:43.084243 kubelet[3260]: I0904 00:53:43.084050 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/f78cd6b7-0635-46c9-9aa3-1aeb34ad8656-cni-log-dir\") pod \"calico-node-6qrmc\" (UID: \"f78cd6b7-0635-46c9-9aa3-1aeb34ad8656\") " 
pod="calico-system/calico-node-6qrmc" Sep 4 00:53:43.084243 kubelet[3260]: I0904 00:53:43.084062 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/f78cd6b7-0635-46c9-9aa3-1aeb34ad8656-var-run-calico\") pod \"calico-node-6qrmc\" (UID: \"f78cd6b7-0635-46c9-9aa3-1aeb34ad8656\") " pod="calico-system/calico-node-6qrmc" Sep 4 00:53:43.084243 kubelet[3260]: I0904 00:53:43.084076 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/f78cd6b7-0635-46c9-9aa3-1aeb34ad8656-node-certs\") pod \"calico-node-6qrmc\" (UID: \"f78cd6b7-0635-46c9-9aa3-1aeb34ad8656\") " pod="calico-system/calico-node-6qrmc" Sep 4 00:53:43.084243 kubelet[3260]: I0904 00:53:43.084089 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f78cd6b7-0635-46c9-9aa3-1aeb34ad8656-xtables-lock\") pod \"calico-node-6qrmc\" (UID: \"f78cd6b7-0635-46c9-9aa3-1aeb34ad8656\") " pod="calico-system/calico-node-6qrmc" Sep 4 00:53:43.084243 kubelet[3260]: I0904 00:53:43.084131 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/f78cd6b7-0635-46c9-9aa3-1aeb34ad8656-cni-bin-dir\") pod \"calico-node-6qrmc\" (UID: \"f78cd6b7-0635-46c9-9aa3-1aeb34ad8656\") " pod="calico-system/calico-node-6qrmc" Sep 4 00:53:43.187750 kubelet[3260]: E0904 00:53:43.187691 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.187750 kubelet[3260]: W0904 00:53:43.187748 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.188077 kubelet[3260]: E0904 00:53:43.187807 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.191902 kubelet[3260]: E0904 00:53:43.191852 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.191902 kubelet[3260]: W0904 00:53:43.191896 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.192328 kubelet[3260]: E0904 00:53:43.191935 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.202045 kubelet[3260]: E0904 00:53:43.202001 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.202045 kubelet[3260]: W0904 00:53:43.202035 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.202482 kubelet[3260]: E0904 00:53:43.202070 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.232846 kubelet[3260]: E0904 00:53:43.232724 3260 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89q69" podUID="df69562f-87c3-42fe-a794-4eb1c96d7d52" Sep 4 00:53:43.282864 kubelet[3260]: E0904 00:53:43.282820 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.282864 kubelet[3260]: W0904 00:53:43.282852 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.283148 kubelet[3260]: E0904 00:53:43.282887 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.283314 kubelet[3260]: E0904 00:53:43.283287 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.283314 kubelet[3260]: W0904 00:53:43.283308 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.283575 kubelet[3260]: E0904 00:53:43.283334 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.284549 kubelet[3260]: E0904 00:53:43.284189 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.284549 kubelet[3260]: W0904 00:53:43.284258 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.284549 kubelet[3260]: E0904 00:53:43.284296 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.285202 kubelet[3260]: E0904 00:53:43.285168 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.285435 kubelet[3260]: W0904 00:53:43.285398 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.285844 kubelet[3260]: E0904 00:53:43.285603 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.286868 kubelet[3260]: E0904 00:53:43.286275 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.286983 kubelet[3260]: W0904 00:53:43.286874 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.286983 kubelet[3260]: E0904 00:53:43.286909 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.287302 kubelet[3260]: E0904 00:53:43.287254 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.287302 kubelet[3260]: W0904 00:53:43.287274 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.287302 kubelet[3260]: E0904 00:53:43.287294 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.287584 kubelet[3260]: E0904 00:53:43.287562 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.287584 kubelet[3260]: W0904 00:53:43.287582 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.287716 kubelet[3260]: E0904 00:53:43.287600 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.287921 kubelet[3260]: E0904 00:53:43.287902 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.287921 kubelet[3260]: W0904 00:53:43.287919 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.288059 kubelet[3260]: E0904 00:53:43.287937 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.288248 kubelet[3260]: E0904 00:53:43.288227 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.288248 kubelet[3260]: W0904 00:53:43.288245 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.288376 kubelet[3260]: E0904 00:53:43.288263 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.288538 kubelet[3260]: E0904 00:53:43.288519 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.288623 kubelet[3260]: W0904 00:53:43.288537 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.288623 kubelet[3260]: E0904 00:53:43.288554 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.288820 kubelet[3260]: E0904 00:53:43.288800 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.288820 kubelet[3260]: W0904 00:53:43.288818 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.288939 kubelet[3260]: E0904 00:53:43.288834 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.289169 kubelet[3260]: E0904 00:53:43.289148 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.289169 kubelet[3260]: W0904 00:53:43.289166 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.289329 kubelet[3260]: E0904 00:53:43.289183 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.289452 kubelet[3260]: E0904 00:53:43.289433 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.289519 kubelet[3260]: W0904 00:53:43.289453 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.289519 kubelet[3260]: E0904 00:53:43.289470 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.289720 kubelet[3260]: E0904 00:53:43.289700 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.289720 kubelet[3260]: W0904 00:53:43.289717 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.289857 kubelet[3260]: E0904 00:53:43.289733 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.289982 kubelet[3260]: E0904 00:53:43.289964 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.289982 kubelet[3260]: W0904 00:53:43.289981 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.290108 kubelet[3260]: E0904 00:53:43.289997 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.290264 kubelet[3260]: E0904 00:53:43.290243 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.290264 kubelet[3260]: W0904 00:53:43.290262 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.290411 kubelet[3260]: E0904 00:53:43.290278 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.290543 kubelet[3260]: E0904 00:53:43.290524 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.290617 kubelet[3260]: W0904 00:53:43.290541 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.290617 kubelet[3260]: E0904 00:53:43.290558 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.290796 kubelet[3260]: E0904 00:53:43.290776 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.290796 kubelet[3260]: W0904 00:53:43.290794 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.290951 kubelet[3260]: E0904 00:53:43.290809 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.291051 kubelet[3260]: E0904 00:53:43.291032 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.291131 kubelet[3260]: W0904 00:53:43.291049 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.291131 kubelet[3260]: E0904 00:53:43.291066 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.291331 kubelet[3260]: E0904 00:53:43.291311 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.291331 kubelet[3260]: W0904 00:53:43.291328 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.291475 kubelet[3260]: E0904 00:53:43.291347 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.291745 kubelet[3260]: E0904 00:53:43.291725 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.291819 kubelet[3260]: W0904 00:53:43.291743 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.291819 kubelet[3260]: E0904 00:53:43.291761 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.291819 kubelet[3260]: I0904 00:53:43.291801 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/df69562f-87c3-42fe-a794-4eb1c96d7d52-socket-dir\") pod \"csi-node-driver-89q69\" (UID: \"df69562f-87c3-42fe-a794-4eb1c96d7d52\") " pod="calico-system/csi-node-driver-89q69" Sep 4 00:53:43.292203 kubelet[3260]: E0904 00:53:43.292171 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.292278 kubelet[3260]: W0904 00:53:43.292204 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.292278 kubelet[3260]: E0904 00:53:43.292235 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.292538 kubelet[3260]: E0904 00:53:43.292517 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.292617 kubelet[3260]: W0904 00:53:43.292539 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.292617 kubelet[3260]: E0904 00:53:43.292564 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.292900 kubelet[3260]: E0904 00:53:43.292878 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.292900 kubelet[3260]: W0904 00:53:43.292898 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.293029 kubelet[3260]: E0904 00:53:43.292917 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.293029 kubelet[3260]: I0904 00:53:43.292960 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/df69562f-87c3-42fe-a794-4eb1c96d7d52-varrun\") pod \"csi-node-driver-89q69\" (UID: \"df69562f-87c3-42fe-a794-4eb1c96d7d52\") " pod="calico-system/csi-node-driver-89q69" Sep 4 00:53:43.293274 kubelet[3260]: E0904 00:53:43.293251 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.293366 kubelet[3260]: W0904 00:53:43.293272 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.293366 kubelet[3260]: E0904 00:53:43.293296 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.293366 kubelet[3260]: I0904 00:53:43.293328 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/df69562f-87c3-42fe-a794-4eb1c96d7d52-registration-dir\") pod \"csi-node-driver-89q69\" (UID: \"df69562f-87c3-42fe-a794-4eb1c96d7d52\") " pod="calico-system/csi-node-driver-89q69" Sep 4 00:53:43.293748 kubelet[3260]: E0904 00:53:43.293721 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.293831 kubelet[3260]: W0904 00:53:43.293749 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.293831 kubelet[3260]: E0904 00:53:43.293780 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.294070 kubelet[3260]: E0904 00:53:43.294049 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.294230 kubelet[3260]: W0904 00:53:43.294070 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.294230 kubelet[3260]: E0904 00:53:43.294093 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.294482 kubelet[3260]: E0904 00:53:43.294428 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.294482 kubelet[3260]: W0904 00:53:43.294457 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.294640 kubelet[3260]: E0904 00:53:43.294486 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.294640 kubelet[3260]: I0904 00:53:43.294530 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df69562f-87c3-42fe-a794-4eb1c96d7d52-kubelet-dir\") pod \"csi-node-driver-89q69\" (UID: \"df69562f-87c3-42fe-a794-4eb1c96d7d52\") " pod="calico-system/csi-node-driver-89q69" Sep 4 00:53:43.294840 kubelet[3260]: E0904 00:53:43.294818 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.294914 kubelet[3260]: W0904 00:53:43.294839 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.294914 kubelet[3260]: E0904 00:53:43.294865 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.295154 kubelet[3260]: E0904 00:53:43.295128 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.295154 kubelet[3260]: W0904 00:53:43.295152 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.295304 kubelet[3260]: E0904 00:53:43.295178 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.295507 kubelet[3260]: E0904 00:53:43.295487 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.295580 kubelet[3260]: W0904 00:53:43.295505 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.295580 kubelet[3260]: E0904 00:53:43.295531 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.295700 kubelet[3260]: I0904 00:53:43.295589 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vck4j\" (UniqueName: \"kubernetes.io/projected/df69562f-87c3-42fe-a794-4eb1c96d7d52-kube-api-access-vck4j\") pod \"csi-node-driver-89q69\" (UID: \"df69562f-87c3-42fe-a794-4eb1c96d7d52\") " pod="calico-system/csi-node-driver-89q69" Sep 4 00:53:43.295920 kubelet[3260]: E0904 00:53:43.295884 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.296041 kubelet[3260]: W0904 00:53:43.295919 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.296041 kubelet[3260]: E0904 00:53:43.295946 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.296237 containerd[1910]: time="2025-09-04T00:53:43.296109612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6qrmc,Uid:f78cd6b7-0635-46c9-9aa3-1aeb34ad8656,Namespace:calico-system,Attempt:0,}" Sep 4 00:53:43.296310 kubelet[3260]: E0904 00:53:43.296229 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.296310 kubelet[3260]: W0904 00:53:43.296248 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.296310 kubelet[3260]: E0904 00:53:43.296274 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.296711 kubelet[3260]: E0904 00:53:43.296653 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.296711 kubelet[3260]: W0904 00:53:43.296672 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.296711 kubelet[3260]: E0904 00:53:43.296699 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.296837 kubelet[3260]: E0904 00:53:43.296831 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.296837 kubelet[3260]: W0904 00:53:43.296837 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.296894 kubelet[3260]: E0904 00:53:43.296842 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.303697 containerd[1910]: time="2025-09-04T00:53:43.303669075Z" level=info msg="connecting to shim 1d048d7d9d95cc09650145adbc34bd2daf998b076912fcbb48dc360ca3840b15" address="unix:///run/containerd/s/e728383f39ee63edb039e5398255863bf36fc982b928585dcdd9ce530fc57c41" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:53:43.330265 systemd[1]: Started cri-containerd-1d048d7d9d95cc09650145adbc34bd2daf998b076912fcbb48dc360ca3840b15.scope - libcontainer container 1d048d7d9d95cc09650145adbc34bd2daf998b076912fcbb48dc360ca3840b15. Sep 4 00:53:43.342336 containerd[1910]: time="2025-09-04T00:53:43.342314608Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6qrmc,Uid:f78cd6b7-0635-46c9-9aa3-1aeb34ad8656,Namespace:calico-system,Attempt:0,} returns sandbox id \"1d048d7d9d95cc09650145adbc34bd2daf998b076912fcbb48dc360ca3840b15\"" Sep 4 00:53:43.396470 kubelet[3260]: E0904 00:53:43.396421 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.396470 kubelet[3260]: W0904 00:53:43.396438 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.396470 kubelet[3260]: E0904 00:53:43.396453 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.396700 kubelet[3260]: E0904 00:53:43.396660 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.396700 kubelet[3260]: W0904 00:53:43.396672 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.396700 kubelet[3260]: E0904 00:53:43.396685 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.396937 kubelet[3260]: E0904 00:53:43.396894 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.396937 kubelet[3260]: W0904 00:53:43.396905 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.396937 kubelet[3260]: E0904 00:53:43.396920 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.397074 kubelet[3260]: E0904 00:53:43.397065 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.397104 kubelet[3260]: W0904 00:53:43.397073 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.397104 kubelet[3260]: E0904 00:53:43.397084 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.397239 kubelet[3260]: E0904 00:53:43.397229 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.397239 kubelet[3260]: W0904 00:53:43.397238 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.397326 kubelet[3260]: E0904 00:53:43.397249 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.397435 kubelet[3260]: E0904 00:53:43.397427 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.397465 kubelet[3260]: W0904 00:53:43.397434 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.397465 kubelet[3260]: E0904 00:53:43.397446 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.397578 kubelet[3260]: E0904 00:53:43.397569 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.397615 kubelet[3260]: W0904 00:53:43.397578 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.397615 kubelet[3260]: E0904 00:53:43.397592 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.397757 kubelet[3260]: E0904 00:53:43.397747 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.397757 kubelet[3260]: W0904 00:53:43.397755 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.397845 kubelet[3260]: E0904 00:53:43.397765 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.397932 kubelet[3260]: E0904 00:53:43.397918 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.397991 kubelet[3260]: W0904 00:53:43.397932 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.397991 kubelet[3260]: E0904 00:53:43.397950 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.398091 kubelet[3260]: E0904 00:53:43.398080 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.398148 kubelet[3260]: W0904 00:53:43.398090 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.398148 kubelet[3260]: E0904 00:53:43.398106 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.398231 kubelet[3260]: E0904 00:53:43.398221 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.398269 kubelet[3260]: W0904 00:53:43.398230 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.398269 kubelet[3260]: E0904 00:53:43.398240 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.398375 kubelet[3260]: E0904 00:53:43.398366 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.398375 kubelet[3260]: W0904 00:53:43.398374 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.398459 kubelet[3260]: E0904 00:53:43.398395 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.398546 kubelet[3260]: E0904 00:53:43.398537 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.398546 kubelet[3260]: W0904 00:53:43.398545 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.398620 kubelet[3260]: E0904 00:53:43.398586 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.398702 kubelet[3260]: E0904 00:53:43.398693 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.398702 kubelet[3260]: W0904 00:53:43.398702 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.398781 kubelet[3260]: E0904 00:53:43.398750 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.398828 kubelet[3260]: E0904 00:53:43.398819 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.398828 kubelet[3260]: W0904 00:53:43.398827 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.398896 kubelet[3260]: E0904 00:53:43.398837 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.399035 kubelet[3260]: E0904 00:53:43.399025 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.399069 kubelet[3260]: W0904 00:53:43.399035 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.399069 kubelet[3260]: E0904 00:53:43.399047 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.399225 kubelet[3260]: E0904 00:53:43.399216 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.399225 kubelet[3260]: W0904 00:53:43.399224 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.399303 kubelet[3260]: E0904 00:53:43.399235 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.399395 kubelet[3260]: E0904 00:53:43.399386 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.399430 kubelet[3260]: W0904 00:53:43.399395 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.399430 kubelet[3260]: E0904 00:53:43.399406 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.399593 kubelet[3260]: E0904 00:53:43.399584 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.399593 kubelet[3260]: W0904 00:53:43.399593 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.399661 kubelet[3260]: E0904 00:53:43.399603 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.399731 kubelet[3260]: E0904 00:53:43.399723 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.399731 kubelet[3260]: W0904 00:53:43.399731 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.399803 kubelet[3260]: E0904 00:53:43.399740 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.399871 kubelet[3260]: E0904 00:53:43.399862 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.399871 kubelet[3260]: W0904 00:53:43.399870 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.399957 kubelet[3260]: E0904 00:53:43.399892 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.399994 kubelet[3260]: E0904 00:53:43.399980 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.399994 kubelet[3260]: W0904 00:53:43.399987 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.400081 kubelet[3260]: E0904 00:53:43.400008 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.400137 kubelet[3260]: E0904 00:53:43.400124 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.400187 kubelet[3260]: W0904 00:53:43.400137 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.400187 kubelet[3260]: E0904 00:53:43.400159 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:43.400278 kubelet[3260]: E0904 00:53:43.400269 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.400278 kubelet[3260]: W0904 00:53:43.400277 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.400384 kubelet[3260]: E0904 00:53:43.400287 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.400474 kubelet[3260]: E0904 00:53:43.400462 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.400474 kubelet[3260]: W0904 00:53:43.400473 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.400570 kubelet[3260]: E0904 00:53:43.400488 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:43.408246 kubelet[3260]: E0904 00:53:43.408222 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:43.408246 kubelet[3260]: W0904 00:53:43.408240 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:43.408377 kubelet[3260]: E0904 00:53:43.408260 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:44.553009 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1298613006.mount: Deactivated successfully. 
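The repeated kubelet entries above all trace back to one FlexVolume probe: the dynamic plugin prober keeps finding a nodeagent~uds directory under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ but no uds executable, so every init call captures empty output and the JSON decode in driver-call.go fails with "unexpected end of JSON input". The following Go sketch is illustrative only (the struct fields are assumptions, not kubelet's exact DriverStatus type); it reproduces that decode error on empty output and shows the shape of a successful init reply a driver would normally print.

package main

import (
	"encoding/json"
	"fmt"
)

// driverStatus mirrors the minimal fields a FlexVolume driver prints as JSON;
// the field set here is illustrative, not kubelet's exact DriverStatus type.
type driverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	// With the uds binary missing from $PATH, the captured driver output is
	// empty; decoding "" yields the exact error logged at driver-call.go:262.
	var st driverStatus
	fmt.Println(json.Unmarshal([]byte(""), &st)) // unexpected end of JSON input

	// A present driver would answer `init` with something like this instead.
	reply, _ := json.Marshal(driverStatus{
		Status:       "Success",
		Capabilities: map[string]bool{"attach": false},
	})
	fmt.Println(string(reply)) // {"status":"Success","capabilities":{"attach":false}}
}

Installing a working uds driver at that path, or removing the empty nodeagent~uds directory, would presumably stop the probe loop seen throughout this section.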
Sep 4 00:53:45.248928 containerd[1910]: time="2025-09-04T00:53:45.248903803Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:45.249163 containerd[1910]: time="2025-09-04T00:53:45.249053739Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 4 00:53:45.249420 containerd[1910]: time="2025-09-04T00:53:45.249408391Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:45.250225 containerd[1910]: time="2025-09-04T00:53:45.250212426Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:45.250617 containerd[1910]: time="2025-09-04T00:53:45.250603936Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.218726203s" Sep 4 00:53:45.250639 containerd[1910]: time="2025-09-04T00:53:45.250621849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 4 00:53:45.251077 containerd[1910]: time="2025-09-04T00:53:45.251060769Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 4 00:53:45.254107 containerd[1910]: time="2025-09-04T00:53:45.254087768Z" level=info msg="CreateContainer within sandbox \"81cc13c06073977d99b6a85fd839300264efb6c652b0a9dcdb6080c12e8fa697\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 4 00:53:45.256759 containerd[1910]: time="2025-09-04T00:53:45.256743819Z" level=info msg="Container 217c720ad862cdff0b964984c0373e9ce2bd6caa4606d5668fae24c8608d9e9b: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:53:45.259280 containerd[1910]: time="2025-09-04T00:53:45.259243418Z" level=info msg="CreateContainer within sandbox \"81cc13c06073977d99b6a85fd839300264efb6c652b0a9dcdb6080c12e8fa697\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"217c720ad862cdff0b964984c0373e9ce2bd6caa4606d5668fae24c8608d9e9b\"" Sep 4 00:53:45.259511 containerd[1910]: time="2025-09-04T00:53:45.259475749Z" level=info msg="StartContainer for \"217c720ad862cdff0b964984c0373e9ce2bd6caa4606d5668fae24c8608d9e9b\"" Sep 4 00:53:45.259978 containerd[1910]: time="2025-09-04T00:53:45.259966726Z" level=info msg="connecting to shim 217c720ad862cdff0b964984c0373e9ce2bd6caa4606d5668fae24c8608d9e9b" address="unix:///run/containerd/s/f1077874028dbd1ad5364290d509cc32a19b85cc3c288416459b1d09dc86f896" protocol=ttrpc version=3 Sep 4 00:53:45.281283 systemd[1]: Started cri-containerd-217c720ad862cdff0b964984c0373e9ce2bd6caa4606d5668fae24c8608d9e9b.scope - libcontainer container 217c720ad862cdff0b964984c0373e9ce2bd6caa4606d5668fae24c8608d9e9b. 
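The pull and container-start entries above report the typha image by tag, image id, and repo digest before the calico-typha container is started in its sandbox. A minimal sketch of how those values could be cross-checked against containerd directly, assuming the containerd 1.x Go client module path and the default socket location; the k8s.io namespace matches the "connecting to shim" entries in this log.

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Socket path is an assumption; adjust if containerd listens elsewhere.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed images live in the k8s.io namespace, as shown in the log.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.GetImage(ctx, "ghcr.io/flatcar/calico/typha:v3.30.3")
	if err != nil {
		log.Fatal(err)
	}
	// Target() is the manifest descriptor; its digest should correspond to
	// the repo digest reported in the "Pulled image" entry above.
	fmt.Println(img.Name(), img.Target().Digest)
}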
Sep 4 00:53:45.296925 kubelet[3260]: E0904 00:53:45.296902 3260 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89q69" podUID="df69562f-87c3-42fe-a794-4eb1c96d7d52" Sep 4 00:53:45.310335 containerd[1910]: time="2025-09-04T00:53:45.310311361Z" level=info msg="StartContainer for \"217c720ad862cdff0b964984c0373e9ce2bd6caa4606d5668fae24c8608d9e9b\" returns successfully" Sep 4 00:53:45.349787 kubelet[3260]: I0904 00:53:45.349743 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8d49f79fb-5gph7" podStartSLOduration=1.130447391 podStartE2EDuration="3.349729186s" podCreationTimestamp="2025-09-04 00:53:42 +0000 UTC" firstStartedPulling="2025-09-04 00:53:43.031727928 +0000 UTC m=+14.825838065" lastFinishedPulling="2025-09-04 00:53:45.251009722 +0000 UTC m=+17.045119860" observedRunningTime="2025-09-04 00:53:45.34967604 +0000 UTC m=+17.143786184" watchObservedRunningTime="2025-09-04 00:53:45.349729186 +0000 UTC m=+17.143839322" Sep 4 00:53:45.405908 kubelet[3260]: E0904 00:53:45.405840 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.405908 kubelet[3260]: W0904 00:53:45.405894 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.406269 kubelet[3260]: E0904 00:53:45.405947 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.406530 kubelet[3260]: E0904 00:53:45.406488 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.406530 kubelet[3260]: W0904 00:53:45.406526 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.406821 kubelet[3260]: E0904 00:53:45.406562 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.407044 kubelet[3260]: E0904 00:53:45.407003 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.407044 kubelet[3260]: W0904 00:53:45.407039 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.407281 kubelet[3260]: E0904 00:53:45.407070 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:45.407603 kubelet[3260]: E0904 00:53:45.407563 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.407603 kubelet[3260]: W0904 00:53:45.407597 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.407841 kubelet[3260]: E0904 00:53:45.407629 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.408099 kubelet[3260]: E0904 00:53:45.408064 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.408099 kubelet[3260]: W0904 00:53:45.408093 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.408341 kubelet[3260]: E0904 00:53:45.408188 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.408586 kubelet[3260]: E0904 00:53:45.408551 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.408586 kubelet[3260]: W0904 00:53:45.408579 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.408826 kubelet[3260]: E0904 00:53:45.408606 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.408949 kubelet[3260]: E0904 00:53:45.408914 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.408949 kubelet[3260]: W0904 00:53:45.408936 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.409241 kubelet[3260]: E0904 00:53:45.408958 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.409366 kubelet[3260]: E0904 00:53:45.409323 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.409366 kubelet[3260]: W0904 00:53:45.409345 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.409586 kubelet[3260]: E0904 00:53:45.409367 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:45.409719 kubelet[3260]: E0904 00:53:45.409694 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.409719 kubelet[3260]: W0904 00:53:45.409719 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.409928 kubelet[3260]: E0904 00:53:45.409741 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.410075 kubelet[3260]: E0904 00:53:45.410046 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.410075 kubelet[3260]: W0904 00:53:45.410069 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.410353 kubelet[3260]: E0904 00:53:45.410090 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.410486 kubelet[3260]: E0904 00:53:45.410453 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.410486 kubelet[3260]: W0904 00:53:45.410478 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.410695 kubelet[3260]: E0904 00:53:45.410501 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.410917 kubelet[3260]: E0904 00:53:45.410891 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.410917 kubelet[3260]: W0904 00:53:45.410913 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.411172 kubelet[3260]: E0904 00:53:45.410934 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.411317 kubelet[3260]: E0904 00:53:45.411290 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.411317 kubelet[3260]: W0904 00:53:45.411313 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.411518 kubelet[3260]: E0904 00:53:45.411334 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:45.411675 kubelet[3260]: E0904 00:53:45.411649 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.411675 kubelet[3260]: W0904 00:53:45.411672 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.411895 kubelet[3260]: E0904 00:53:45.411694 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.412091 kubelet[3260]: E0904 00:53:45.412063 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.412091 kubelet[3260]: W0904 00:53:45.412086 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.412349 kubelet[3260]: E0904 00:53:45.412107 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.412720 kubelet[3260]: E0904 00:53:45.412686 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.412818 kubelet[3260]: W0904 00:53:45.412719 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.412818 kubelet[3260]: E0904 00:53:45.412752 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.413221 kubelet[3260]: E0904 00:53:45.413190 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.413221 kubelet[3260]: W0904 00:53:45.413219 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.413448 kubelet[3260]: E0904 00:53:45.413247 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.413700 kubelet[3260]: E0904 00:53:45.413672 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.413700 kubelet[3260]: W0904 00:53:45.413697 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.413882 kubelet[3260]: E0904 00:53:45.413728 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:45.414334 kubelet[3260]: E0904 00:53:45.414255 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.414334 kubelet[3260]: W0904 00:53:45.414292 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.414334 kubelet[3260]: E0904 00:53:45.414330 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.414783 kubelet[3260]: E0904 00:53:45.414750 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.414783 kubelet[3260]: W0904 00:53:45.414778 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.415068 kubelet[3260]: E0904 00:53:45.414884 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.415192 kubelet[3260]: E0904 00:53:45.415161 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.415192 kubelet[3260]: W0904 00:53:45.415184 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.415409 kubelet[3260]: E0904 00:53:45.415281 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.415647 kubelet[3260]: E0904 00:53:45.415615 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.415790 kubelet[3260]: W0904 00:53:45.415648 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.415790 kubelet[3260]: E0904 00:53:45.415734 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.416087 kubelet[3260]: E0904 00:53:45.416056 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.416087 kubelet[3260]: W0904 00:53:45.416081 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.416310 kubelet[3260]: E0904 00:53:45.416141 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:45.416709 kubelet[3260]: E0904 00:53:45.416653 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.416709 kubelet[3260]: W0904 00:53:45.416698 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.417049 kubelet[3260]: E0904 00:53:45.416749 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.417341 kubelet[3260]: E0904 00:53:45.417299 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.417470 kubelet[3260]: W0904 00:53:45.417343 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.417470 kubelet[3260]: E0904 00:53:45.417401 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.417816 kubelet[3260]: E0904 00:53:45.417784 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.417816 kubelet[3260]: W0904 00:53:45.417809 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.418100 kubelet[3260]: E0904 00:53:45.417881 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.418278 kubelet[3260]: E0904 00:53:45.418242 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.418278 kubelet[3260]: W0904 00:53:45.418275 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.418512 kubelet[3260]: E0904 00:53:45.418346 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.418642 kubelet[3260]: E0904 00:53:45.418617 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.418642 kubelet[3260]: W0904 00:53:45.418638 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.418851 kubelet[3260]: E0904 00:53:45.418718 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:45.419039 kubelet[3260]: E0904 00:53:45.419012 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.419039 kubelet[3260]: W0904 00:53:45.419035 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.419254 kubelet[3260]: E0904 00:53:45.419066 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.419715 kubelet[3260]: E0904 00:53:45.419676 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.419868 kubelet[3260]: W0904 00:53:45.419718 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.419868 kubelet[3260]: E0904 00:53:45.419764 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.420161 kubelet[3260]: E0904 00:53:45.420104 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.420285 kubelet[3260]: W0904 00:53:45.420162 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.420285 kubelet[3260]: E0904 00:53:45.420209 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.420648 kubelet[3260]: E0904 00:53:45.420620 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.420773 kubelet[3260]: W0904 00:53:45.420648 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.420873 kubelet[3260]: E0904 00:53:45.420758 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:45.421030 kubelet[3260]: E0904 00:53:45.421002 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:45.421136 kubelet[3260]: W0904 00:53:45.421030 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:45.421136 kubelet[3260]: E0904 00:53:45.421056 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:46.347743 kubelet[3260]: I0904 00:53:46.347656 3260 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 00:53:46.419657 kubelet[3260]: E0904 00:53:46.419599 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.419657 kubelet[3260]: W0904 00:53:46.419650 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.420008 kubelet[3260]: E0904 00:53:46.419693 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.420419 kubelet[3260]: E0904 00:53:46.420376 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.420419 kubelet[3260]: W0904 00:53:46.420416 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.420686 kubelet[3260]: E0904 00:53:46.420451 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.420941 kubelet[3260]: E0904 00:53:46.420907 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.420941 kubelet[3260]: W0904 00:53:46.420938 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.421184 kubelet[3260]: E0904 00:53:46.420967 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.421433 kubelet[3260]: E0904 00:53:46.421402 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.421560 kubelet[3260]: W0904 00:53:46.421433 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.421560 kubelet[3260]: E0904 00:53:46.421461 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.421923 kubelet[3260]: E0904 00:53:46.421893 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.422025 kubelet[3260]: W0904 00:53:46.421921 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.422025 kubelet[3260]: E0904 00:53:46.421948 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:46.422439 kubelet[3260]: E0904 00:53:46.422390 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.422439 kubelet[3260]: W0904 00:53:46.422418 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.422655 kubelet[3260]: E0904 00:53:46.422443 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.422931 kubelet[3260]: E0904 00:53:46.422875 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.422931 kubelet[3260]: W0904 00:53:46.422902 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.422931 kubelet[3260]: E0904 00:53:46.422926 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.423377 kubelet[3260]: E0904 00:53:46.423326 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.423377 kubelet[3260]: W0904 00:53:46.423355 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.423377 kubelet[3260]: E0904 00:53:46.423379 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.423876 kubelet[3260]: E0904 00:53:46.423827 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.423876 kubelet[3260]: W0904 00:53:46.423853 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.424093 kubelet[3260]: E0904 00:53:46.423878 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.424359 kubelet[3260]: E0904 00:53:46.424299 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.424359 kubelet[3260]: W0904 00:53:46.424329 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.424359 kubelet[3260]: E0904 00:53:46.424358 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:46.424830 kubelet[3260]: E0904 00:53:46.424799 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.424940 kubelet[3260]: W0904 00:53:46.424829 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.424940 kubelet[3260]: E0904 00:53:46.424857 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.425420 kubelet[3260]: E0904 00:53:46.425358 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.425420 kubelet[3260]: W0904 00:53:46.425385 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.425420 kubelet[3260]: E0904 00:53:46.425410 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.426015 kubelet[3260]: E0904 00:53:46.425944 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.426015 kubelet[3260]: W0904 00:53:46.425972 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.426015 kubelet[3260]: E0904 00:53:46.425995 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.426439 kubelet[3260]: E0904 00:53:46.426384 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.426439 kubelet[3260]: W0904 00:53:46.426413 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.426439 kubelet[3260]: E0904 00:53:46.426437 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.426901 kubelet[3260]: E0904 00:53:46.426844 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.426901 kubelet[3260]: W0904 00:53:46.426875 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.426901 kubelet[3260]: E0904 00:53:46.426900 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:46.427542 kubelet[3260]: E0904 00:53:46.427482 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.427542 kubelet[3260]: W0904 00:53:46.427509 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.427542 kubelet[3260]: E0904 00:53:46.427536 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.428012 kubelet[3260]: E0904 00:53:46.427961 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.428012 kubelet[3260]: W0904 00:53:46.427989 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.428303 kubelet[3260]: E0904 00:53:46.428021 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.428672 kubelet[3260]: E0904 00:53:46.428612 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.428672 kubelet[3260]: W0904 00:53:46.428652 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.428926 kubelet[3260]: E0904 00:53:46.428699 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.429198 kubelet[3260]: E0904 00:53:46.429148 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.429198 kubelet[3260]: W0904 00:53:46.429189 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.429474 kubelet[3260]: E0904 00:53:46.429229 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.429728 kubelet[3260]: E0904 00:53:46.429675 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.429728 kubelet[3260]: W0904 00:53:46.429704 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.430024 kubelet[3260]: E0904 00:53:46.429782 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:46.430184 kubelet[3260]: E0904 00:53:46.430067 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.430184 kubelet[3260]: W0904 00:53:46.430089 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.430484 kubelet[3260]: E0904 00:53:46.430188 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.430484 kubelet[3260]: E0904 00:53:46.430468 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.430748 kubelet[3260]: W0904 00:53:46.430490 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.430748 kubelet[3260]: E0904 00:53:46.430568 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.430982 kubelet[3260]: E0904 00:53:46.430909 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.430982 kubelet[3260]: W0904 00:53:46.430932 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.430982 kubelet[3260]: E0904 00:53:46.430964 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.431582 kubelet[3260]: E0904 00:53:46.431524 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.431582 kubelet[3260]: W0904 00:53:46.431555 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.431859 kubelet[3260]: E0904 00:53:46.431595 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.432011 kubelet[3260]: E0904 00:53:46.431976 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.432165 kubelet[3260]: W0904 00:53:46.432009 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.432165 kubelet[3260]: E0904 00:53:46.432059 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:46.432496 kubelet[3260]: E0904 00:53:46.432459 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.432496 kubelet[3260]: W0904 00:53:46.432484 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.432746 kubelet[3260]: E0904 00:53:46.432544 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.432931 kubelet[3260]: E0904 00:53:46.432893 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.432931 kubelet[3260]: W0904 00:53:46.432916 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.433205 kubelet[3260]: E0904 00:53:46.432992 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.433343 kubelet[3260]: E0904 00:53:46.433317 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.433343 kubelet[3260]: W0904 00:53:46.433340 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.433544 kubelet[3260]: E0904 00:53:46.433372 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.433910 kubelet[3260]: E0904 00:53:46.433852 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.433910 kubelet[3260]: W0904 00:53:46.433877 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.433910 kubelet[3260]: E0904 00:53:46.433909 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.434523 kubelet[3260]: E0904 00:53:46.434472 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.434523 kubelet[3260]: W0904 00:53:46.434516 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.434812 kubelet[3260]: E0904 00:53:46.434566 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:46.435022 kubelet[3260]: E0904 00:53:46.434990 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.435022 kubelet[3260]: W0904 00:53:46.435016 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.435287 kubelet[3260]: E0904 00:53:46.435049 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.435642 kubelet[3260]: E0904 00:53:46.435582 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.435642 kubelet[3260]: W0904 00:53:46.435610 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.435642 kubelet[3260]: E0904 00:53:46.435641 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 00:53:46.436510 kubelet[3260]: E0904 00:53:46.436455 3260 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 00:53:46.436510 kubelet[3260]: W0904 00:53:46.436480 3260 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 00:53:46.436510 kubelet[3260]: E0904 00:53:46.436506 3260 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 00:53:46.815778 containerd[1910]: time="2025-09-04T00:53:46.815731840Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:46.815989 containerd[1910]: time="2025-09-04T00:53:46.815908319Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 4 00:53:46.816296 containerd[1910]: time="2025-09-04T00:53:46.816255879Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:46.817089 containerd[1910]: time="2025-09-04T00:53:46.817078593Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:46.817478 containerd[1910]: time="2025-09-04T00:53:46.817436986Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.56636228s" Sep 4 00:53:46.817478 containerd[1910]: time="2025-09-04T00:53:46.817452897Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 4 00:53:46.818344 containerd[1910]: time="2025-09-04T00:53:46.818333801Z" level=info msg="CreateContainer within sandbox \"1d048d7d9d95cc09650145adbc34bd2daf998b076912fcbb48dc360ca3840b15\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 4 00:53:46.821620 containerd[1910]: time="2025-09-04T00:53:46.821580734Z" level=info msg="Container 4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:53:46.824597 containerd[1910]: time="2025-09-04T00:53:46.824559670Z" level=info msg="CreateContainer within sandbox \"1d048d7d9d95cc09650145adbc34bd2daf998b076912fcbb48dc360ca3840b15\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc\"" Sep 4 00:53:46.824813 containerd[1910]: time="2025-09-04T00:53:46.824771923Z" level=info msg="StartContainer for \"4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc\"" Sep 4 00:53:46.825493 containerd[1910]: time="2025-09-04T00:53:46.825456188Z" level=info msg="connecting to shim 4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc" address="unix:///run/containerd/s/e728383f39ee63edb039e5398255863bf36fc982b928585dcdd9ce530fc57c41" protocol=ttrpc version=3 Sep 4 00:53:46.840272 systemd[1]: Started cri-containerd-4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc.scope - libcontainer container 4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc. 
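The long run of kubelet `driver-call.go` / `plugins.go` errors above is the kubelet probing the FlexVolume plugin directory (`/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds`) before Calico's `flexvol-driver` init container, started here from the `pod2daemon-flexvol` image, has installed the `uds` binary. A FlexVolume driver is expected to answer the `init` call by printing a small JSON status document; with the binary missing the output is empty, so the kubelet's JSON unmarshal fails with "unexpected end of JSON input". The following is a minimal, hypothetical driver stub showing the shape of that response (it is not Calico's actual `uds` driver):

```go
// Hypothetical FlexVolume driver stub: shows the JSON "init" response the
// kubelet expects to parse. An absent binary yields empty output, which is
// what produces the "unexpected end of JSON input" errors in the log above.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the documented FlexVolume response fields.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) > 1 && os.Args[1] == "init" {
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out))
		return
	}
	// Real drivers also handle mount/unmount; this stub declines everything else.
	fmt.Println(`{"status":"Not supported"}`)
}
```

Once the init container has copied the real driver into that plugin directory, the periodic probe errors stop appearing.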
Sep 4 00:53:46.857835 containerd[1910]: time="2025-09-04T00:53:46.857811955Z" level=info msg="StartContainer for \"4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc\" returns successfully" Sep 4 00:53:46.861514 systemd[1]: cri-containerd-4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc.scope: Deactivated successfully. Sep 4 00:53:46.862676 containerd[1910]: time="2025-09-04T00:53:46.862657882Z" level=info msg="received exit event container_id:\"4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc\" id:\"4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc\" pid:4117 exited_at:{seconds:1756947226 nanos:862456806}" Sep 4 00:53:46.862740 containerd[1910]: time="2025-09-04T00:53:46.862663743Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc\" id:\"4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc\" pid:4117 exited_at:{seconds:1756947226 nanos:862456806}" Sep 4 00:53:46.873343 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc-rootfs.mount: Deactivated successfully. Sep 4 00:53:47.297076 kubelet[3260]: E0904 00:53:47.296953 3260 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89q69" podUID="df69562f-87c3-42fe-a794-4eb1c96d7d52" Sep 4 00:53:48.365541 containerd[1910]: time="2025-09-04T00:53:48.365470611Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 4 00:53:48.459325 kubelet[3260]: I0904 00:53:48.459258 3260 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 00:53:49.297513 kubelet[3260]: E0904 00:53:49.297424 3260 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89q69" podUID="df69562f-87c3-42fe-a794-4eb1c96d7d52" Sep 4 00:53:51.296903 kubelet[3260]: E0904 00:53:51.296848 3260 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-89q69" podUID="df69562f-87c3-42fe-a794-4eb1c96d7d52" Sep 4 00:53:51.418715 containerd[1910]: time="2025-09-04T00:53:51.418665585Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:51.418919 containerd[1910]: time="2025-09-04T00:53:51.418895775Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 4 00:53:51.419213 containerd[1910]: time="2025-09-04T00:53:51.419201524Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:51.420037 containerd[1910]: time="2025-09-04T00:53:51.420026814Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Sep 4 00:53:51.420391 containerd[1910]: time="2025-09-04T00:53:51.420380770Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.054845216s" Sep 4 00:53:51.420420 containerd[1910]: time="2025-09-04T00:53:51.420394378Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 4 00:53:51.421350 containerd[1910]: time="2025-09-04T00:53:51.421338591Z" level=info msg="CreateContainer within sandbox \"1d048d7d9d95cc09650145adbc34bd2daf998b076912fcbb48dc360ca3840b15\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 4 00:53:51.424675 containerd[1910]: time="2025-09-04T00:53:51.424661604Z" level=info msg="Container dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:53:51.428564 containerd[1910]: time="2025-09-04T00:53:51.428548793Z" level=info msg="CreateContainer within sandbox \"1d048d7d9d95cc09650145adbc34bd2daf998b076912fcbb48dc360ca3840b15\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f\"" Sep 4 00:53:51.428794 containerd[1910]: time="2025-09-04T00:53:51.428779896Z" level=info msg="StartContainer for \"dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f\"" Sep 4 00:53:51.429494 containerd[1910]: time="2025-09-04T00:53:51.429482624Z" level=info msg="connecting to shim dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f" address="unix:///run/containerd/s/e728383f39ee63edb039e5398255863bf36fc982b928585dcdd9ce530fc57c41" protocol=ttrpc version=3 Sep 4 00:53:51.445440 systemd[1]: Started cri-containerd-dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f.scope - libcontainer container dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f. Sep 4 00:53:51.500133 containerd[1910]: time="2025-09-04T00:53:51.500085382Z" level=info msg="StartContainer for \"dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f\" returns successfully" Sep 4 00:53:52.107133 containerd[1910]: time="2025-09-04T00:53:52.107099340Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 4 00:53:52.108505 systemd[1]: cri-containerd-dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f.scope: Deactivated successfully. Sep 4 00:53:52.108709 systemd[1]: cri-containerd-dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f.scope: Consumed 411ms CPU time, 194.6M memory peak, 171.3M written to disk. 
Sep 4 00:53:52.109656 containerd[1910]: time="2025-09-04T00:53:52.109639343Z" level=info msg="received exit event container_id:\"dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f\" id:\"dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f\" pid:4178 exited_at:{seconds:1756947232 nanos:109526529}" Sep 4 00:53:52.109722 containerd[1910]: time="2025-09-04T00:53:52.109704979Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f\" id:\"dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f\" pid:4178 exited_at:{seconds:1756947232 nanos:109526529}" Sep 4 00:53:52.124968 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f-rootfs.mount: Deactivated successfully. Sep 4 00:53:52.207769 kubelet[3260]: I0904 00:53:52.207689 3260 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 4 00:53:52.262128 systemd[1]: Created slice kubepods-burstable-pod415fc8d0_9cb1_41e0_a4cc_84e4984d8f32.slice - libcontainer container kubepods-burstable-pod415fc8d0_9cb1_41e0_a4cc_84e4984d8f32.slice. Sep 4 00:53:52.269831 systemd[1]: Created slice kubepods-burstable-pod3a4824fe_ec50_448d_ad3a_edd8a6c76035.slice - libcontainer container kubepods-burstable-pod3a4824fe_ec50_448d_ad3a_edd8a6c76035.slice. Sep 4 00:53:52.274972 kubelet[3260]: I0904 00:53:52.274790 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7mzv\" (UniqueName: \"kubernetes.io/projected/415fc8d0-9cb1-41e0-a4cc-84e4984d8f32-kube-api-access-q7mzv\") pod \"coredns-7c65d6cfc9-p69xf\" (UID: \"415fc8d0-9cb1-41e0-a4cc-84e4984d8f32\") " pod="kube-system/coredns-7c65d6cfc9-p69xf" Sep 4 00:53:52.274972 kubelet[3260]: I0904 00:53:52.274860 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a4824fe-ec50-448d-ad3a-edd8a6c76035-config-volume\") pod \"coredns-7c65d6cfc9-g7ckk\" (UID: \"3a4824fe-ec50-448d-ad3a-edd8a6c76035\") " pod="kube-system/coredns-7c65d6cfc9-g7ckk" Sep 4 00:53:52.274972 kubelet[3260]: I0904 00:53:52.274896 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6b06bc34-c2ce-453b-b38a-15c95898d6db-calico-apiserver-certs\") pod \"calico-apiserver-6476566667-n6xq7\" (UID: \"6b06bc34-c2ce-453b-b38a-15c95898d6db\") " pod="calico-apiserver/calico-apiserver-6476566667-n6xq7" Sep 4 00:53:52.274972 kubelet[3260]: I0904 00:53:52.274936 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sx9j\" (UniqueName: \"kubernetes.io/projected/15b7b5a8-04c0-4edb-9d50-0a666c7b0278-kube-api-access-5sx9j\") pod \"calico-kube-controllers-57898dccd4-4fmwz\" (UID: \"15b7b5a8-04c0-4edb-9d50-0a666c7b0278\") " pod="calico-system/calico-kube-controllers-57898dccd4-4fmwz" Sep 4 00:53:52.274954 systemd[1]: Created slice kubepods-besteffort-pod15b7b5a8_04c0_4edb_9d50_0a666c7b0278.slice - libcontainer container kubepods-besteffort-pod15b7b5a8_04c0_4edb_9d50_0a666c7b0278.slice. 
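The `Created slice kubepods-...` lines that follow reflect the kubelet's systemd cgroup driver: each pod gets a slice whose name embeds its QoS class and its UID, with the UID's dashes escaped to underscores because `-` is systemd's slice hierarchy separator. A tiny illustrative helper (not the kubelet's actual code) reproducing that naming:

```go
// Illustrative helper (not the kubelet's implementation): derive the systemd
// slice name shown in the log for a pod, given its QoS class and UID.
// Dashes in the UID become underscores because "-" separates slice levels.
package main

import (
	"fmt"
	"strings"
)

func podSliceName(qosClass, podUID string) string {
	escaped := strings.ReplaceAll(podUID, "-", "_")
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, escaped)
}

func main() {
	// Matches "kubepods-burstable-pod415fc8d0_9cb1_41e0_a4cc_84e4984d8f32.slice"
	// created for coredns-7c65d6cfc9-p69xf in the entries below.
	fmt.Println(podSliceName("burstable", "415fc8d0-9cb1-41e0-a4cc-84e4984d8f32"))
}
```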
Sep 4 00:53:52.275427 kubelet[3260]: I0904 00:53:52.274987 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b-whisker-backend-key-pair\") pod \"whisker-7899bcf645-77txp\" (UID: \"7ce45a83-2ee7-41d1-b7bb-989f365f5e5b\") " pod="calico-system/whisker-7899bcf645-77txp" Sep 4 00:53:52.275427 kubelet[3260]: I0904 00:53:52.275020 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92dcc\" (UniqueName: \"kubernetes.io/projected/a488579f-53d3-4d0a-bc28-666d1db626ef-kube-api-access-92dcc\") pod \"goldmane-7988f88666-x6zjt\" (UID: \"a488579f-53d3-4d0a-bc28-666d1db626ef\") " pod="calico-system/goldmane-7988f88666-x6zjt" Sep 4 00:53:52.275427 kubelet[3260]: I0904 00:53:52.275050 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hpv6\" (UniqueName: \"kubernetes.io/projected/31036ac4-9f26-4932-a118-c6aacd0fb4f5-kube-api-access-5hpv6\") pod \"calico-apiserver-6476566667-zmqjl\" (UID: \"31036ac4-9f26-4932-a118-c6aacd0fb4f5\") " pod="calico-apiserver/calico-apiserver-6476566667-zmqjl" Sep 4 00:53:52.275427 kubelet[3260]: I0904 00:53:52.275084 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15b7b5a8-04c0-4edb-9d50-0a666c7b0278-tigera-ca-bundle\") pod \"calico-kube-controllers-57898dccd4-4fmwz\" (UID: \"15b7b5a8-04c0-4edb-9d50-0a666c7b0278\") " pod="calico-system/calico-kube-controllers-57898dccd4-4fmwz" Sep 4 00:53:52.275427 kubelet[3260]: I0904 00:53:52.275125 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5z7s\" (UniqueName: \"kubernetes.io/projected/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b-kube-api-access-d5z7s\") pod \"whisker-7899bcf645-77txp\" (UID: \"7ce45a83-2ee7-41d1-b7bb-989f365f5e5b\") " pod="calico-system/whisker-7899bcf645-77txp" Sep 4 00:53:52.275658 kubelet[3260]: I0904 00:53:52.275156 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a488579f-53d3-4d0a-bc28-666d1db626ef-goldmane-ca-bundle\") pod \"goldmane-7988f88666-x6zjt\" (UID: \"a488579f-53d3-4d0a-bc28-666d1db626ef\") " pod="calico-system/goldmane-7988f88666-x6zjt" Sep 4 00:53:52.275658 kubelet[3260]: I0904 00:53:52.275189 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jpkq\" (UniqueName: \"kubernetes.io/projected/6b06bc34-c2ce-453b-b38a-15c95898d6db-kube-api-access-2jpkq\") pod \"calico-apiserver-6476566667-n6xq7\" (UID: \"6b06bc34-c2ce-453b-b38a-15c95898d6db\") " pod="calico-apiserver/calico-apiserver-6476566667-n6xq7" Sep 4 00:53:52.275658 kubelet[3260]: I0904 00:53:52.275219 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a488579f-53d3-4d0a-bc28-666d1db626ef-config\") pod \"goldmane-7988f88666-x6zjt\" (UID: \"a488579f-53d3-4d0a-bc28-666d1db626ef\") " pod="calico-system/goldmane-7988f88666-x6zjt" Sep 4 00:53:52.275658 kubelet[3260]: I0904 00:53:52.275246 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: 
\"kubernetes.io/secret/a488579f-53d3-4d0a-bc28-666d1db626ef-goldmane-key-pair\") pod \"goldmane-7988f88666-x6zjt\" (UID: \"a488579f-53d3-4d0a-bc28-666d1db626ef\") " pod="calico-system/goldmane-7988f88666-x6zjt" Sep 4 00:53:52.275658 kubelet[3260]: I0904 00:53:52.275278 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/415fc8d0-9cb1-41e0-a4cc-84e4984d8f32-config-volume\") pod \"coredns-7c65d6cfc9-p69xf\" (UID: \"415fc8d0-9cb1-41e0-a4cc-84e4984d8f32\") " pod="kube-system/coredns-7c65d6cfc9-p69xf" Sep 4 00:53:52.275816 kubelet[3260]: I0904 00:53:52.275308 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vcfv\" (UniqueName: \"kubernetes.io/projected/3a4824fe-ec50-448d-ad3a-edd8a6c76035-kube-api-access-6vcfv\") pod \"coredns-7c65d6cfc9-g7ckk\" (UID: \"3a4824fe-ec50-448d-ad3a-edd8a6c76035\") " pod="kube-system/coredns-7c65d6cfc9-g7ckk" Sep 4 00:53:52.275816 kubelet[3260]: I0904 00:53:52.275341 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b-whisker-ca-bundle\") pod \"whisker-7899bcf645-77txp\" (UID: \"7ce45a83-2ee7-41d1-b7bb-989f365f5e5b\") " pod="calico-system/whisker-7899bcf645-77txp" Sep 4 00:53:52.275816 kubelet[3260]: I0904 00:53:52.275358 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/31036ac4-9f26-4932-a118-c6aacd0fb4f5-calico-apiserver-certs\") pod \"calico-apiserver-6476566667-zmqjl\" (UID: \"31036ac4-9f26-4932-a118-c6aacd0fb4f5\") " pod="calico-apiserver/calico-apiserver-6476566667-zmqjl" Sep 4 00:53:52.280832 systemd[1]: Created slice kubepods-besteffort-pod6b06bc34_c2ce_453b_b38a_15c95898d6db.slice - libcontainer container kubepods-besteffort-pod6b06bc34_c2ce_453b_b38a_15c95898d6db.slice. Sep 4 00:53:52.286087 systemd[1]: Created slice kubepods-besteffort-pod31036ac4_9f26_4932_a118_c6aacd0fb4f5.slice - libcontainer container kubepods-besteffort-pod31036ac4_9f26_4932_a118_c6aacd0fb4f5.slice. Sep 4 00:53:52.291274 systemd[1]: Created slice kubepods-besteffort-pod7ce45a83_2ee7_41d1_b7bb_989f365f5e5b.slice - libcontainer container kubepods-besteffort-pod7ce45a83_2ee7_41d1_b7bb_989f365f5e5b.slice. Sep 4 00:53:52.296263 systemd[1]: Created slice kubepods-besteffort-poda488579f_53d3_4d0a_bc28_666d1db626ef.slice - libcontainer container kubepods-besteffort-poda488579f_53d3_4d0a_bc28_666d1db626ef.slice. 
Sep 4 00:53:52.567791 containerd[1910]: time="2025-09-04T00:53:52.567707990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-p69xf,Uid:415fc8d0-9cb1-41e0-a4cc-84e4984d8f32,Namespace:kube-system,Attempt:0,}" Sep 4 00:53:52.572397 containerd[1910]: time="2025-09-04T00:53:52.572379902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-g7ckk,Uid:3a4824fe-ec50-448d-ad3a-edd8a6c76035,Namespace:kube-system,Attempt:0,}" Sep 4 00:53:52.577984 containerd[1910]: time="2025-09-04T00:53:52.577962774Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57898dccd4-4fmwz,Uid:15b7b5a8-04c0-4edb-9d50-0a666c7b0278,Namespace:calico-system,Attempt:0,}" Sep 4 00:53:52.584530 containerd[1910]: time="2025-09-04T00:53:52.584501396Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6476566667-n6xq7,Uid:6b06bc34-c2ce-453b-b38a-15c95898d6db,Namespace:calico-apiserver,Attempt:0,}" Sep 4 00:53:52.589108 containerd[1910]: time="2025-09-04T00:53:52.589082146Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6476566667-zmqjl,Uid:31036ac4-9f26-4932-a118-c6aacd0fb4f5,Namespace:calico-apiserver,Attempt:0,}" Sep 4 00:53:52.594595 containerd[1910]: time="2025-09-04T00:53:52.594557556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7899bcf645-77txp,Uid:7ce45a83-2ee7-41d1-b7bb-989f365f5e5b,Namespace:calico-system,Attempt:0,}" Sep 4 00:53:52.595015 containerd[1910]: time="2025-09-04T00:53:52.594990321Z" level=error msg="Failed to destroy network for sandbox \"6defaea446a531d800cdbd675244717b41504cdc39b8c8a66db94c92b15a0a92\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.595436 containerd[1910]: time="2025-09-04T00:53:52.595419349Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-p69xf,Uid:415fc8d0-9cb1-41e0-a4cc-84e4984d8f32,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6defaea446a531d800cdbd675244717b41504cdc39b8c8a66db94c92b15a0a92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.595566 kubelet[3260]: E0904 00:53:52.595535 3260 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6defaea446a531d800cdbd675244717b41504cdc39b8c8a66db94c92b15a0a92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.595809 kubelet[3260]: E0904 00:53:52.595585 3260 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6defaea446a531d800cdbd675244717b41504cdc39b8c8a66db94c92b15a0a92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-p69xf" Sep 4 00:53:52.595809 kubelet[3260]: E0904 00:53:52.595601 3260 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"6defaea446a531d800cdbd675244717b41504cdc39b8c8a66db94c92b15a0a92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-p69xf" Sep 4 00:53:52.595809 kubelet[3260]: E0904 00:53:52.595629 3260 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-p69xf_kube-system(415fc8d0-9cb1-41e0-a4cc-84e4984d8f32)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-p69xf_kube-system(415fc8d0-9cb1-41e0-a4cc-84e4984d8f32)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6defaea446a531d800cdbd675244717b41504cdc39b8c8a66db94c92b15a0a92\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-p69xf" podUID="415fc8d0-9cb1-41e0-a4cc-84e4984d8f32" Sep 4 00:53:52.597584 containerd[1910]: time="2025-09-04T00:53:52.597555046Z" level=error msg="Failed to destroy network for sandbox \"0f9d944b795d05ec4bf6c73eb231dfc149ccf79ad556fa7f9c1b770a7d15bede\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.598009 containerd[1910]: time="2025-09-04T00:53:52.597980737Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-g7ckk,Uid:3a4824fe-ec50-448d-ad3a-edd8a6c76035,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f9d944b795d05ec4bf6c73eb231dfc149ccf79ad556fa7f9c1b770a7d15bede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.598193 kubelet[3260]: E0904 00:53:52.598166 3260 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f9d944b795d05ec4bf6c73eb231dfc149ccf79ad556fa7f9c1b770a7d15bede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.598238 kubelet[3260]: E0904 00:53:52.598211 3260 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f9d944b795d05ec4bf6c73eb231dfc149ccf79ad556fa7f9c1b770a7d15bede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-g7ckk" Sep 4 00:53:52.598238 kubelet[3260]: E0904 00:53:52.598224 3260 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f9d944b795d05ec4bf6c73eb231dfc149ccf79ad556fa7f9c1b770a7d15bede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-g7ckk" Sep 4 00:53:52.598279 kubelet[3260]: E0904 00:53:52.598255 
3260 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-g7ckk_kube-system(3a4824fe-ec50-448d-ad3a-edd8a6c76035)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-g7ckk_kube-system(3a4824fe-ec50-448d-ad3a-edd8a6c76035)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0f9d944b795d05ec4bf6c73eb231dfc149ccf79ad556fa7f9c1b770a7d15bede\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-g7ckk" podUID="3a4824fe-ec50-448d-ad3a-edd8a6c76035" Sep 4 00:53:52.598617 containerd[1910]: time="2025-09-04T00:53:52.598597376Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-x6zjt,Uid:a488579f-53d3-4d0a-bc28-666d1db626ef,Namespace:calico-system,Attempt:0,}" Sep 4 00:53:52.604910 containerd[1910]: time="2025-09-04T00:53:52.604850240Z" level=error msg="Failed to destroy network for sandbox \"31a0bbe7424b0607b81b3d5d91b3806ad842ef1a0d53dacdaa5786429201c666\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.605480 containerd[1910]: time="2025-09-04T00:53:52.605452275Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57898dccd4-4fmwz,Uid:15b7b5a8-04c0-4edb-9d50-0a666c7b0278,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"31a0bbe7424b0607b81b3d5d91b3806ad842ef1a0d53dacdaa5786429201c666\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.605631 kubelet[3260]: E0904 00:53:52.605609 3260 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31a0bbe7424b0607b81b3d5d91b3806ad842ef1a0d53dacdaa5786429201c666\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.605673 kubelet[3260]: E0904 00:53:52.605649 3260 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31a0bbe7424b0607b81b3d5d91b3806ad842ef1a0d53dacdaa5786429201c666\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-57898dccd4-4fmwz" Sep 4 00:53:52.605673 kubelet[3260]: E0904 00:53:52.605664 3260 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31a0bbe7424b0607b81b3d5d91b3806ad842ef1a0d53dacdaa5786429201c666\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-57898dccd4-4fmwz" Sep 4 00:53:52.605715 kubelet[3260]: E0904 00:53:52.605693 3260 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"calico-kube-controllers-57898dccd4-4fmwz_calico-system(15b7b5a8-04c0-4edb-9d50-0a666c7b0278)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-57898dccd4-4fmwz_calico-system(15b7b5a8-04c0-4edb-9d50-0a666c7b0278)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"31a0bbe7424b0607b81b3d5d91b3806ad842ef1a0d53dacdaa5786429201c666\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-57898dccd4-4fmwz" podUID="15b7b5a8-04c0-4edb-9d50-0a666c7b0278" Sep 4 00:53:52.611338 containerd[1910]: time="2025-09-04T00:53:52.611307586Z" level=error msg="Failed to destroy network for sandbox \"d8b431abb2d2f3e9fe35e7751b213a72804d9d38b06288ba8a3a6e4cf2f768da\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.611783 containerd[1910]: time="2025-09-04T00:53:52.611761348Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6476566667-n6xq7,Uid:6b06bc34-c2ce-453b-b38a-15c95898d6db,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8b431abb2d2f3e9fe35e7751b213a72804d9d38b06288ba8a3a6e4cf2f768da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.611939 kubelet[3260]: E0904 00:53:52.611911 3260 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8b431abb2d2f3e9fe35e7751b213a72804d9d38b06288ba8a3a6e4cf2f768da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.611986 kubelet[3260]: E0904 00:53:52.611957 3260 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8b431abb2d2f3e9fe35e7751b213a72804d9d38b06288ba8a3a6e4cf2f768da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6476566667-n6xq7" Sep 4 00:53:52.611986 kubelet[3260]: E0904 00:53:52.611974 3260 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8b431abb2d2f3e9fe35e7751b213a72804d9d38b06288ba8a3a6e4cf2f768da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6476566667-n6xq7" Sep 4 00:53:52.612067 kubelet[3260]: E0904 00:53:52.612008 3260 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6476566667-n6xq7_calico-apiserver(6b06bc34-c2ce-453b-b38a-15c95898d6db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-6476566667-n6xq7_calico-apiserver(6b06bc34-c2ce-453b-b38a-15c95898d6db)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d8b431abb2d2f3e9fe35e7751b213a72804d9d38b06288ba8a3a6e4cf2f768da\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6476566667-n6xq7" podUID="6b06bc34-c2ce-453b-b38a-15c95898d6db" Sep 4 00:53:52.617077 containerd[1910]: time="2025-09-04T00:53:52.617046051Z" level=error msg="Failed to destroy network for sandbox \"ce856940068877d2bce86fb4a15ccbc84dbfe1dc546429d11928fac06e581165\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.617483 containerd[1910]: time="2025-09-04T00:53:52.617465050Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6476566667-zmqjl,Uid:31036ac4-9f26-4932-a118-c6aacd0fb4f5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce856940068877d2bce86fb4a15ccbc84dbfe1dc546429d11928fac06e581165\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.617623 kubelet[3260]: E0904 00:53:52.617601 3260 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce856940068877d2bce86fb4a15ccbc84dbfe1dc546429d11928fac06e581165\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.617656 kubelet[3260]: E0904 00:53:52.617639 3260 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce856940068877d2bce86fb4a15ccbc84dbfe1dc546429d11928fac06e581165\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6476566667-zmqjl" Sep 4 00:53:52.617656 kubelet[3260]: E0904 00:53:52.617652 3260 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce856940068877d2bce86fb4a15ccbc84dbfe1dc546429d11928fac06e581165\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6476566667-zmqjl" Sep 4 00:53:52.617697 kubelet[3260]: E0904 00:53:52.617676 3260 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6476566667-zmqjl_calico-apiserver(31036ac4-9f26-4932-a118-c6aacd0fb4f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6476566667-zmqjl_calico-apiserver(31036ac4-9f26-4932-a118-c6aacd0fb4f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ce856940068877d2bce86fb4a15ccbc84dbfe1dc546429d11928fac06e581165\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6476566667-zmqjl" podUID="31036ac4-9f26-4932-a118-c6aacd0fb4f5" Sep 4 00:53:52.622500 containerd[1910]: time="2025-09-04T00:53:52.622471126Z" level=error msg="Failed to destroy network for sandbox \"492caad060c5e1f5c5f1adb9e0b7e96a92297c59904b5c5857ba419bf258acb3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.622883 containerd[1910]: time="2025-09-04T00:53:52.622866446Z" level=error msg="Failed to destroy network for sandbox \"ae3dc91290120fda7094dc34d6f3a8dfddadb7494789e0086b216f9eb825680e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.622989 containerd[1910]: time="2025-09-04T00:53:52.622976013Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7899bcf645-77txp,Uid:7ce45a83-2ee7-41d1-b7bb-989f365f5e5b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"492caad060c5e1f5c5f1adb9e0b7e96a92297c59904b5c5857ba419bf258acb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.623114 kubelet[3260]: E0904 00:53:52.623088 3260 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"492caad060c5e1f5c5f1adb9e0b7e96a92297c59904b5c5857ba419bf258acb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.623144 kubelet[3260]: E0904 00:53:52.623136 3260 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"492caad060c5e1f5c5f1adb9e0b7e96a92297c59904b5c5857ba419bf258acb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7899bcf645-77txp" Sep 4 00:53:52.623167 kubelet[3260]: E0904 00:53:52.623149 3260 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"492caad060c5e1f5c5f1adb9e0b7e96a92297c59904b5c5857ba419bf258acb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7899bcf645-77txp" Sep 4 00:53:52.623187 kubelet[3260]: E0904 00:53:52.623176 3260 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7899bcf645-77txp_calico-system(7ce45a83-2ee7-41d1-b7bb-989f365f5e5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7899bcf645-77txp_calico-system(7ce45a83-2ee7-41d1-b7bb-989f365f5e5b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"492caad060c5e1f5c5f1adb9e0b7e96a92297c59904b5c5857ba419bf258acb3\\\": 
plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7899bcf645-77txp" podUID="7ce45a83-2ee7-41d1-b7bb-989f365f5e5b" Sep 4 00:53:52.623256 containerd[1910]: time="2025-09-04T00:53:52.623241603Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-x6zjt,Uid:a488579f-53d3-4d0a-bc28-666d1db626ef,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae3dc91290120fda7094dc34d6f3a8dfddadb7494789e0086b216f9eb825680e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.623324 kubelet[3260]: E0904 00:53:52.623312 3260 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae3dc91290120fda7094dc34d6f3a8dfddadb7494789e0086b216f9eb825680e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:52.623348 kubelet[3260]: E0904 00:53:52.623333 3260 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae3dc91290120fda7094dc34d6f3a8dfddadb7494789e0086b216f9eb825680e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-x6zjt" Sep 4 00:53:52.623348 kubelet[3260]: E0904 00:53:52.623344 3260 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae3dc91290120fda7094dc34d6f3a8dfddadb7494789e0086b216f9eb825680e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-x6zjt" Sep 4 00:53:52.623387 kubelet[3260]: E0904 00:53:52.623361 3260 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-x6zjt_calico-system(a488579f-53d3-4d0a-bc28-666d1db626ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-x6zjt_calico-system(a488579f-53d3-4d0a-bc28-666d1db626ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ae3dc91290120fda7094dc34d6f3a8dfddadb7494789e0086b216f9eb825680e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-x6zjt" podUID="a488579f-53d3-4d0a-bc28-666d1db626ef" Sep 4 00:53:53.312983 systemd[1]: Created slice kubepods-besteffort-poddf69562f_87c3_42fe_a794_4eb1c96d7d52.slice - libcontainer container kubepods-besteffort-poddf69562f_87c3_42fe_a794_4eb1c96d7d52.slice. 
Sep 4 00:53:53.314487 containerd[1910]: time="2025-09-04T00:53:53.314446901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-89q69,Uid:df69562f-87c3-42fe-a794-4eb1c96d7d52,Namespace:calico-system,Attempt:0,}" Sep 4 00:53:53.342273 containerd[1910]: time="2025-09-04T00:53:53.342216044Z" level=error msg="Failed to destroy network for sandbox \"23a9abb0df11b7c91fe3f5d5b12d2024a466668cbe42d1bdf8b21af314075f4b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:53.342951 containerd[1910]: time="2025-09-04T00:53:53.342871926Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-89q69,Uid:df69562f-87c3-42fe-a794-4eb1c96d7d52,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"23a9abb0df11b7c91fe3f5d5b12d2024a466668cbe42d1bdf8b21af314075f4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:53.343056 kubelet[3260]: E0904 00:53:53.343034 3260 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23a9abb0df11b7c91fe3f5d5b12d2024a466668cbe42d1bdf8b21af314075f4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 00:53:53.343087 kubelet[3260]: E0904 00:53:53.343074 3260 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23a9abb0df11b7c91fe3f5d5b12d2024a466668cbe42d1bdf8b21af314075f4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-89q69" Sep 4 00:53:53.343105 kubelet[3260]: E0904 00:53:53.343087 3260 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23a9abb0df11b7c91fe3f5d5b12d2024a466668cbe42d1bdf8b21af314075f4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-89q69" Sep 4 00:53:53.343156 kubelet[3260]: E0904 00:53:53.343120 3260 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-89q69_calico-system(df69562f-87c3-42fe-a794-4eb1c96d7d52)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-89q69_calico-system(df69562f-87c3-42fe-a794-4eb1c96d7d52)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"23a9abb0df11b7c91fe3f5d5b12d2024a466668cbe42d1bdf8b21af314075f4b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-89q69" podUID="df69562f-87c3-42fe-a794-4eb1c96d7d52" Sep 4 00:53:53.394214 containerd[1910]: time="2025-09-04T00:53:53.394141979Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 4 00:53:53.431157 systemd[1]: run-netns-cni\x2d79bcc4a9\x2d6248\x2d0ce8\x2dc9f4\x2d1cf4fb783dfc.mount: Deactivated successfully. Sep 4 00:53:53.431425 systemd[1]: run-netns-cni\x2def668994\x2d9cbd\x2d126a\x2d83e2\x2de13d1e926375.mount: Deactivated successfully. Sep 4 00:53:53.431632 systemd[1]: run-netns-cni\x2d09cbb3a0\x2df73c\x2de645\x2dce49\x2d54934879b873.mount: Deactivated successfully. Sep 4 00:53:53.431815 systemd[1]: run-netns-cni\x2d5dc11dc1\x2d0ee1\x2d2ff5\x2de8e4\x2dd6eda2cf77c6.mount: Deactivated successfully. Sep 4 00:53:58.309916 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount88093471.mount: Deactivated successfully. Sep 4 00:53:58.326977 containerd[1910]: time="2025-09-04T00:53:58.326956727Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:58.327180 containerd[1910]: time="2025-09-04T00:53:58.327138605Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 4 00:53:58.327567 containerd[1910]: time="2025-09-04T00:53:58.327553267Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:58.328263 containerd[1910]: time="2025-09-04T00:53:58.328251290Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:53:58.328602 containerd[1910]: time="2025-09-04T00:53:58.328589380Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 4.934382528s" Sep 4 00:53:58.328633 containerd[1910]: time="2025-09-04T00:53:58.328604833Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 4 00:53:58.332368 containerd[1910]: time="2025-09-04T00:53:58.332348819Z" level=info msg="CreateContainer within sandbox \"1d048d7d9d95cc09650145adbc34bd2daf998b076912fcbb48dc360ca3840b15\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 4 00:53:58.335765 containerd[1910]: time="2025-09-04T00:53:58.335724379Z" level=info msg="Container 4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:53:58.339654 containerd[1910]: time="2025-09-04T00:53:58.339642182Z" level=info msg="CreateContainer within sandbox \"1d048d7d9d95cc09650145adbc34bd2daf998b076912fcbb48dc360ca3840b15\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\"" Sep 4 00:53:58.339904 containerd[1910]: time="2025-09-04T00:53:58.339892981Z" level=info msg="StartContainer for \"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\"" Sep 4 00:53:58.340687 containerd[1910]: time="2025-09-04T00:53:58.340648557Z" level=info msg="connecting to shim 4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad" 
address="unix:///run/containerd/s/e728383f39ee63edb039e5398255863bf36fc982b928585dcdd9ce530fc57c41" protocol=ttrpc version=3 Sep 4 00:53:58.359291 systemd[1]: Started cri-containerd-4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad.scope - libcontainer container 4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad. Sep 4 00:53:58.386896 containerd[1910]: time="2025-09-04T00:53:58.386870802Z" level=info msg="StartContainer for \"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" returns successfully" Sep 4 00:53:58.418121 kubelet[3260]: I0904 00:53:58.418086 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-6qrmc" podStartSLOduration=1.431485818 podStartE2EDuration="16.418075618s" podCreationTimestamp="2025-09-04 00:53:42 +0000 UTC" firstStartedPulling="2025-09-04 00:53:43.342755773 +0000 UTC m=+15.136865909" lastFinishedPulling="2025-09-04 00:53:58.329345567 +0000 UTC m=+30.123455709" observedRunningTime="2025-09-04 00:53:58.417741317 +0000 UTC m=+30.211851456" watchObservedRunningTime="2025-09-04 00:53:58.418075618 +0000 UTC m=+30.212185752" Sep 4 00:53:58.447887 containerd[1910]: time="2025-09-04T00:53:58.447862440Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"403f2ef2805413af8c94d731b4041dc226ea7f747cd75e8eeb85fea261c56111\" pid:4672 exit_status:1 exited_at:{seconds:1756947238 nanos:447674897}" Sep 4 00:53:58.449020 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 4 00:53:58.449053 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 4 00:53:58.619032 kubelet[3260]: I0904 00:53:58.618878 3260 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b-whisker-ca-bundle\") pod \"7ce45a83-2ee7-41d1-b7bb-989f365f5e5b\" (UID: \"7ce45a83-2ee7-41d1-b7bb-989f365f5e5b\") " Sep 4 00:53:58.619032 kubelet[3260]: I0904 00:53:58.618979 3260 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b-whisker-backend-key-pair\") pod \"7ce45a83-2ee7-41d1-b7bb-989f365f5e5b\" (UID: \"7ce45a83-2ee7-41d1-b7bb-989f365f5e5b\") " Sep 4 00:53:58.619406 kubelet[3260]: I0904 00:53:58.619069 3260 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5z7s\" (UniqueName: \"kubernetes.io/projected/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b-kube-api-access-d5z7s\") pod \"7ce45a83-2ee7-41d1-b7bb-989f365f5e5b\" (UID: \"7ce45a83-2ee7-41d1-b7bb-989f365f5e5b\") " Sep 4 00:53:58.619589 kubelet[3260]: I0904 00:53:58.619539 3260 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "7ce45a83-2ee7-41d1-b7bb-989f365f5e5b" (UID: "7ce45a83-2ee7-41d1-b7bb-989f365f5e5b"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 4 00:53:58.623382 kubelet[3260]: I0904 00:53:58.623324 3260 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "7ce45a83-2ee7-41d1-b7bb-989f365f5e5b" (UID: "7ce45a83-2ee7-41d1-b7bb-989f365f5e5b"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 4 00:53:58.623541 kubelet[3260]: I0904 00:53:58.623448 3260 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b-kube-api-access-d5z7s" (OuterVolumeSpecName: "kube-api-access-d5z7s") pod "7ce45a83-2ee7-41d1-b7bb-989f365f5e5b" (UID: "7ce45a83-2ee7-41d1-b7bb-989f365f5e5b"). InnerVolumeSpecName "kube-api-access-d5z7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 4 00:53:58.720004 kubelet[3260]: I0904 00:53:58.719937 3260 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5z7s\" (UniqueName: \"kubernetes.io/projected/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b-kube-api-access-d5z7s\") on node \"ci-4372.1.0-n-fd36784ab7\" DevicePath \"\"" Sep 4 00:53:58.720004 kubelet[3260]: I0904 00:53:58.720002 3260 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b-whisker-ca-bundle\") on node \"ci-4372.1.0-n-fd36784ab7\" DevicePath \"\"" Sep 4 00:53:58.720380 kubelet[3260]: I0904 00:53:58.720031 3260 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b-whisker-backend-key-pair\") on node \"ci-4372.1.0-n-fd36784ab7\" DevicePath \"\"" Sep 4 00:53:59.315046 systemd[1]: var-lib-kubelet-pods-7ce45a83\x2d2ee7\x2d41d1\x2db7bb\x2d989f365f5e5b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dd5z7s.mount: Deactivated successfully. Sep 4 00:53:59.315104 systemd[1]: var-lib-kubelet-pods-7ce45a83\x2d2ee7\x2d41d1\x2db7bb\x2d989f365f5e5b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 4 00:53:59.419442 systemd[1]: Removed slice kubepods-besteffort-pod7ce45a83_2ee7_41d1_b7bb_989f365f5e5b.slice - libcontainer container kubepods-besteffort-pod7ce45a83_2ee7_41d1_b7bb_989f365f5e5b.slice. Sep 4 00:53:59.442928 systemd[1]: Created slice kubepods-besteffort-pod6982fe65_b22b_4b1d_8ce7_5d803771a916.slice - libcontainer container kubepods-besteffort-pod6982fe65_b22b_4b1d_8ce7_5d803771a916.slice. 
Sep 4 00:53:59.464300 containerd[1910]: time="2025-09-04T00:53:59.464275109Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"733efc8cc5602f5e310b97baa4c9f6ab9e49119c944b575cf52344842b670da9\" pid:4744 exit_status:1 exited_at:{seconds:1756947239 nanos:464116223}" Sep 4 00:53:59.626804 kubelet[3260]: I0904 00:53:59.626687 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svwcj\" (UniqueName: \"kubernetes.io/projected/6982fe65-b22b-4b1d-8ce7-5d803771a916-kube-api-access-svwcj\") pod \"whisker-7bff6d4b9b-qgxww\" (UID: \"6982fe65-b22b-4b1d-8ce7-5d803771a916\") " pod="calico-system/whisker-7bff6d4b9b-qgxww" Sep 4 00:53:59.627589 kubelet[3260]: I0904 00:53:59.626866 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6982fe65-b22b-4b1d-8ce7-5d803771a916-whisker-ca-bundle\") pod \"whisker-7bff6d4b9b-qgxww\" (UID: \"6982fe65-b22b-4b1d-8ce7-5d803771a916\") " pod="calico-system/whisker-7bff6d4b9b-qgxww" Sep 4 00:53:59.627589 kubelet[3260]: I0904 00:53:59.626958 3260 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6982fe65-b22b-4b1d-8ce7-5d803771a916-whisker-backend-key-pair\") pod \"whisker-7bff6d4b9b-qgxww\" (UID: \"6982fe65-b22b-4b1d-8ce7-5d803771a916\") " pod="calico-system/whisker-7bff6d4b9b-qgxww" Sep 4 00:53:59.744622 containerd[1910]: time="2025-09-04T00:53:59.744571662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bff6d4b9b-qgxww,Uid:6982fe65-b22b-4b1d-8ce7-5d803771a916,Namespace:calico-system,Attempt:0,}" Sep 4 00:53:59.799729 systemd-networkd[1825]: cali9e0e2fd0442: Link UP Sep 4 00:53:59.799860 systemd-networkd[1825]: cali9e0e2fd0442: Gained carrier Sep 4 00:53:59.806446 containerd[1910]: 2025-09-04 00:53:59.761 [INFO][4939] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0 whisker-7bff6d4b9b- calico-system 6982fe65-b22b-4b1d-8ce7-5d803771a916 861 0 2025-09-04 00:53:59 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7bff6d4b9b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4372.1.0-n-fd36784ab7 whisker-7bff6d4b9b-qgxww eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali9e0e2fd0442 [] [] }} ContainerID="1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" Namespace="calico-system" Pod="whisker-7bff6d4b9b-qgxww" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-" Sep 4 00:53:59.806446 containerd[1910]: 2025-09-04 00:53:59.761 [INFO][4939] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" Namespace="calico-system" Pod="whisker-7bff6d4b9b-qgxww" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0" Sep 4 00:53:59.806446 containerd[1910]: 2025-09-04 00:53:59.774 [INFO][4960] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" 
HandleID="k8s-pod-network.1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" Workload="ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0" Sep 4 00:53:59.806576 containerd[1910]: 2025-09-04 00:53:59.774 [INFO][4960] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" HandleID="k8s-pod-network.1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" Workload="ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001397a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-fd36784ab7", "pod":"whisker-7bff6d4b9b-qgxww", "timestamp":"2025-09-04 00:53:59.774119055 +0000 UTC"}, Hostname:"ci-4372.1.0-n-fd36784ab7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 00:53:59.806576 containerd[1910]: 2025-09-04 00:53:59.774 [INFO][4960] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 00:53:59.806576 containerd[1910]: 2025-09-04 00:53:59.774 [INFO][4960] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 00:53:59.806576 containerd[1910]: 2025-09-04 00:53:59.774 [INFO][4960] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-fd36784ab7' Sep 4 00:53:59.806576 containerd[1910]: 2025-09-04 00:53:59.778 [INFO][4960] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:59.806576 containerd[1910]: 2025-09-04 00:53:59.781 [INFO][4960] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:59.806576 containerd[1910]: 2025-09-04 00:53:59.784 [INFO][4960] ipam/ipam.go 511: Trying affinity for 192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:59.806576 containerd[1910]: 2025-09-04 00:53:59.785 [INFO][4960] ipam/ipam.go 158: Attempting to load block cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:59.806576 containerd[1910]: 2025-09-04 00:53:59.787 [INFO][4960] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:59.806718 containerd[1910]: 2025-09-04 00:53:59.787 [INFO][4960] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.119.192/26 handle="k8s-pod-network.1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:59.806718 containerd[1910]: 2025-09-04 00:53:59.788 [INFO][4960] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635 Sep 4 00:53:59.806718 containerd[1910]: 2025-09-04 00:53:59.790 [INFO][4960] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.119.192/26 handle="k8s-pod-network.1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:59.806718 containerd[1910]: 2025-09-04 00:53:59.793 [INFO][4960] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.119.193/26] block=192.168.119.192/26 handle="k8s-pod-network.1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:59.806718 containerd[1910]: 2025-09-04 00:53:59.793 
[INFO][4960] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.119.193/26] handle="k8s-pod-network.1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:53:59.806718 containerd[1910]: 2025-09-04 00:53:59.793 [INFO][4960] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 00:53:59.806718 containerd[1910]: 2025-09-04 00:53:59.793 [INFO][4960] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.193/26] IPv6=[] ContainerID="1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" HandleID="k8s-pod-network.1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" Workload="ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0" Sep 4 00:53:59.806825 containerd[1910]: 2025-09-04 00:53:59.794 [INFO][4939] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" Namespace="calico-system" Pod="whisker-7bff6d4b9b-qgxww" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0", GenerateName:"whisker-7bff6d4b9b-", Namespace:"calico-system", SelfLink:"", UID:"6982fe65-b22b-4b1d-8ce7-5d803771a916", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bff6d4b9b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"", Pod:"whisker-7bff6d4b9b-qgxww", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.119.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9e0e2fd0442", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:53:59.806825 containerd[1910]: 2025-09-04 00:53:59.795 [INFO][4939] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.193/32] ContainerID="1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" Namespace="calico-system" Pod="whisker-7bff6d4b9b-qgxww" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0" Sep 4 00:53:59.806878 containerd[1910]: 2025-09-04 00:53:59.795 [INFO][4939] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9e0e2fd0442 ContainerID="1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" Namespace="calico-system" Pod="whisker-7bff6d4b9b-qgxww" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0" Sep 4 00:53:59.806878 containerd[1910]: 2025-09-04 00:53:59.799 [INFO][4939] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" Namespace="calico-system" Pod="whisker-7bff6d4b9b-qgxww" 
WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0" Sep 4 00:53:59.806912 containerd[1910]: 2025-09-04 00:53:59.800 [INFO][4939] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" Namespace="calico-system" Pod="whisker-7bff6d4b9b-qgxww" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0", GenerateName:"whisker-7bff6d4b9b-", Namespace:"calico-system", SelfLink:"", UID:"6982fe65-b22b-4b1d-8ce7-5d803771a916", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bff6d4b9b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635", Pod:"whisker-7bff6d4b9b-qgxww", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.119.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9e0e2fd0442", MAC:"92:91:98:fa:a8:90", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:53:59.806949 containerd[1910]: 2025-09-04 00:53:59.805 [INFO][4939] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" Namespace="calico-system" Pod="whisker-7bff6d4b9b-qgxww" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-whisker--7bff6d4b9b--qgxww-eth0" Sep 4 00:53:59.814715 containerd[1910]: time="2025-09-04T00:53:59.814689976Z" level=info msg="connecting to shim 1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635" address="unix:///run/containerd/s/dd47305304f998c945be78cf90b4d8e8ca1bca16c2b71337de816056ab7eb808" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:53:59.843290 systemd[1]: Started cri-containerd-1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635.scope - libcontainer container 1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635. 
Sep 4 00:53:59.869056 containerd[1910]: time="2025-09-04T00:53:59.869035319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bff6d4b9b-qgxww,Uid:6982fe65-b22b-4b1d-8ce7-5d803771a916,Namespace:calico-system,Attempt:0,} returns sandbox id \"1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635\"" Sep 4 00:53:59.869690 containerd[1910]: time="2025-09-04T00:53:59.869677697Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 4 00:53:59.891383 systemd-networkd[1825]: vxlan.calico: Link UP Sep 4 00:53:59.891386 systemd-networkd[1825]: vxlan.calico: Gained carrier Sep 4 00:54:00.299244 kubelet[3260]: I0904 00:54:00.299164 3260 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce45a83-2ee7-41d1-b7bb-989f365f5e5b" path="/var/lib/kubelet/pods/7ce45a83-2ee7-41d1-b7bb-989f365f5e5b/volumes" Sep 4 00:54:01.028507 systemd-networkd[1825]: vxlan.calico: Gained IPv6LL Sep 4 00:54:01.366589 containerd[1910]: time="2025-09-04T00:54:01.366534559Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:01.366828 containerd[1910]: time="2025-09-04T00:54:01.366816213Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 4 00:54:01.367200 containerd[1910]: time="2025-09-04T00:54:01.367190923Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:01.368075 containerd[1910]: time="2025-09-04T00:54:01.368066612Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:01.368497 containerd[1910]: time="2025-09-04T00:54:01.368484478Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.498791271s" Sep 4 00:54:01.368542 containerd[1910]: time="2025-09-04T00:54:01.368501811Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 4 00:54:01.369390 containerd[1910]: time="2025-09-04T00:54:01.369378156Z" level=info msg="CreateContainer within sandbox \"1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 4 00:54:01.371412 containerd[1910]: time="2025-09-04T00:54:01.371399102Z" level=info msg="Container d4347902762c6dd0b0ca62c2c884d35d352eaf74ce2ba798b0ab7b59d3c02ce4: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:54:01.392455 containerd[1910]: time="2025-09-04T00:54:01.392403262Z" level=info msg="CreateContainer within sandbox \"1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"d4347902762c6dd0b0ca62c2c884d35d352eaf74ce2ba798b0ab7b59d3c02ce4\"" Sep 4 00:54:01.392707 containerd[1910]: time="2025-09-04T00:54:01.392643547Z" level=info msg="StartContainer for 
\"d4347902762c6dd0b0ca62c2c884d35d352eaf74ce2ba798b0ab7b59d3c02ce4\"" Sep 4 00:54:01.393333 containerd[1910]: time="2025-09-04T00:54:01.393288959Z" level=info msg="connecting to shim d4347902762c6dd0b0ca62c2c884d35d352eaf74ce2ba798b0ab7b59d3c02ce4" address="unix:///run/containerd/s/dd47305304f998c945be78cf90b4d8e8ca1bca16c2b71337de816056ab7eb808" protocol=ttrpc version=3 Sep 4 00:54:01.413400 systemd[1]: Started cri-containerd-d4347902762c6dd0b0ca62c2c884d35d352eaf74ce2ba798b0ab7b59d3c02ce4.scope - libcontainer container d4347902762c6dd0b0ca62c2c884d35d352eaf74ce2ba798b0ab7b59d3c02ce4. Sep 4 00:54:01.450583 containerd[1910]: time="2025-09-04T00:54:01.450519444Z" level=info msg="StartContainer for \"d4347902762c6dd0b0ca62c2c884d35d352eaf74ce2ba798b0ab7b59d3c02ce4\" returns successfully" Sep 4 00:54:01.451174 containerd[1910]: time="2025-09-04T00:54:01.451156151Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 4 00:54:01.732613 systemd-networkd[1825]: cali9e0e2fd0442: Gained IPv6LL Sep 4 00:54:03.738312 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2556576560.mount: Deactivated successfully. Sep 4 00:54:03.742937 containerd[1910]: time="2025-09-04T00:54:03.742888071Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:03.743084 containerd[1910]: time="2025-09-04T00:54:03.743056815Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 4 00:54:03.743457 containerd[1910]: time="2025-09-04T00:54:03.743416944Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:03.744334 containerd[1910]: time="2025-09-04T00:54:03.744293678Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:03.744713 containerd[1910]: time="2025-09-04T00:54:03.744673048Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.293495802s" Sep 4 00:54:03.744713 containerd[1910]: time="2025-09-04T00:54:03.744687794Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 4 00:54:03.745608 containerd[1910]: time="2025-09-04T00:54:03.745597157Z" level=info msg="CreateContainer within sandbox \"1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 4 00:54:03.748220 containerd[1910]: time="2025-09-04T00:54:03.748207022Z" level=info msg="Container a9c015e949fe233661c3c7eb254b1803b2c29dd2aa8bc52bb7102db3d897dc8f: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:54:03.751710 containerd[1910]: time="2025-09-04T00:54:03.751669287Z" level=info msg="CreateContainer within sandbox \"1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635\" for 
&ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"a9c015e949fe233661c3c7eb254b1803b2c29dd2aa8bc52bb7102db3d897dc8f\"" Sep 4 00:54:03.751947 containerd[1910]: time="2025-09-04T00:54:03.751936472Z" level=info msg="StartContainer for \"a9c015e949fe233661c3c7eb254b1803b2c29dd2aa8bc52bb7102db3d897dc8f\"" Sep 4 00:54:03.752649 containerd[1910]: time="2025-09-04T00:54:03.752611857Z" level=info msg="connecting to shim a9c015e949fe233661c3c7eb254b1803b2c29dd2aa8bc52bb7102db3d897dc8f" address="unix:///run/containerd/s/dd47305304f998c945be78cf90b4d8e8ca1bca16c2b71337de816056ab7eb808" protocol=ttrpc version=3 Sep 4 00:54:03.776594 systemd[1]: Started cri-containerd-a9c015e949fe233661c3c7eb254b1803b2c29dd2aa8bc52bb7102db3d897dc8f.scope - libcontainer container a9c015e949fe233661c3c7eb254b1803b2c29dd2aa8bc52bb7102db3d897dc8f. Sep 4 00:54:03.871847 containerd[1910]: time="2025-09-04T00:54:03.871795768Z" level=info msg="StartContainer for \"a9c015e949fe233661c3c7eb254b1803b2c29dd2aa8bc52bb7102db3d897dc8f\" returns successfully" Sep 4 00:54:04.298580 containerd[1910]: time="2025-09-04T00:54:04.298503106Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-p69xf,Uid:415fc8d0-9cb1-41e0-a4cc-84e4984d8f32,Namespace:kube-system,Attempt:0,}" Sep 4 00:54:04.298840 containerd[1910]: time="2025-09-04T00:54:04.298628269Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-g7ckk,Uid:3a4824fe-ec50-448d-ad3a-edd8a6c76035,Namespace:kube-system,Attempt:0,}" Sep 4 00:54:04.356160 systemd-networkd[1825]: cali43134a46593: Link UP Sep 4 00:54:04.356377 systemd-networkd[1825]: cali43134a46593: Gained carrier Sep 4 00:54:04.361238 containerd[1910]: 2025-09-04 00:54:04.318 [INFO][5239] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0 coredns-7c65d6cfc9- kube-system 3a4824fe-ec50-448d-ad3a-edd8a6c76035 788 0 2025-09-04 00:53:33 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372.1.0-n-fd36784ab7 coredns-7c65d6cfc9-g7ckk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali43134a46593 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-g7ckk" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-" Sep 4 00:54:04.361238 containerd[1910]: 2025-09-04 00:54:04.319 [INFO][5239] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-g7ckk" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0" Sep 4 00:54:04.361238 containerd[1910]: 2025-09-04 00:54:04.333 [INFO][5277] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" HandleID="k8s-pod-network.77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" Workload="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0" Sep 4 00:54:04.361367 containerd[1910]: 2025-09-04 00:54:04.334 [INFO][5277] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" HandleID="k8s-pod-network.77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" Workload="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e3850), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372.1.0-n-fd36784ab7", "pod":"coredns-7c65d6cfc9-g7ckk", "timestamp":"2025-09-04 00:54:04.333926671 +0000 UTC"}, Hostname:"ci-4372.1.0-n-fd36784ab7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 00:54:04.361367 containerd[1910]: 2025-09-04 00:54:04.334 [INFO][5277] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 00:54:04.361367 containerd[1910]: 2025-09-04 00:54:04.334 [INFO][5277] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 00:54:04.361367 containerd[1910]: 2025-09-04 00:54:04.334 [INFO][5277] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-fd36784ab7' Sep 4 00:54:04.361367 containerd[1910]: 2025-09-04 00:54:04.339 [INFO][5277] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.361367 containerd[1910]: 2025-09-04 00:54:04.342 [INFO][5277] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.361367 containerd[1910]: 2025-09-04 00:54:04.345 [INFO][5277] ipam/ipam.go 511: Trying affinity for 192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.361367 containerd[1910]: 2025-09-04 00:54:04.347 [INFO][5277] ipam/ipam.go 158: Attempting to load block cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.361367 containerd[1910]: 2025-09-04 00:54:04.348 [INFO][5277] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.361531 containerd[1910]: 2025-09-04 00:54:04.348 [INFO][5277] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.119.192/26 handle="k8s-pod-network.77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.361531 containerd[1910]: 2025-09-04 00:54:04.349 [INFO][5277] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a Sep 4 00:54:04.361531 containerd[1910]: 2025-09-04 00:54:04.351 [INFO][5277] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.119.192/26 handle="k8s-pod-network.77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.361531 containerd[1910]: 2025-09-04 00:54:04.354 [INFO][5277] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.119.194/26] block=192.168.119.192/26 handle="k8s-pod-network.77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.361531 containerd[1910]: 2025-09-04 00:54:04.354 [INFO][5277] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.119.194/26] handle="k8s-pod-network.77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.361531 containerd[1910]: 2025-09-04 00:54:04.354 [INFO][5277] 
ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 00:54:04.361531 containerd[1910]: 2025-09-04 00:54:04.354 [INFO][5277] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.194/26] IPv6=[] ContainerID="77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" HandleID="k8s-pod-network.77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" Workload="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0" Sep 4 00:54:04.361645 containerd[1910]: 2025-09-04 00:54:04.355 [INFO][5239] cni-plugin/k8s.go 418: Populated endpoint ContainerID="77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-g7ckk" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"3a4824fe-ec50-448d-ad3a-edd8a6c76035", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"", Pod:"coredns-7c65d6cfc9-g7ckk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.119.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali43134a46593", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:04.361645 containerd[1910]: 2025-09-04 00:54:04.355 [INFO][5239] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.194/32] ContainerID="77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-g7ckk" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0" Sep 4 00:54:04.361645 containerd[1910]: 2025-09-04 00:54:04.355 [INFO][5239] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali43134a46593 ContainerID="77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-g7ckk" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0" Sep 4 00:54:04.361645 containerd[1910]: 2025-09-04 00:54:04.356 [INFO][5239] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-g7ckk" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0" Sep 4 00:54:04.361645 containerd[1910]: 2025-09-04 00:54:04.356 [INFO][5239] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-g7ckk" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"3a4824fe-ec50-448d-ad3a-edd8a6c76035", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a", Pod:"coredns-7c65d6cfc9-g7ckk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.119.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali43134a46593", MAC:"f2:a5:a5:30:4c:b6", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:04.361645 containerd[1910]: 2025-09-04 00:54:04.360 [INFO][5239] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-g7ckk" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--g7ckk-eth0" Sep 4 00:54:04.368805 containerd[1910]: time="2025-09-04T00:54:04.368778072Z" level=info msg="connecting to shim 77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a" address="unix:///run/containerd/s/29bd751ff5863e97882dd86c622a0ce397aa9f9ec601a5d53f11f373eed9f34a" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:54:04.385282 systemd[1]: Started cri-containerd-77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a.scope - libcontainer container 77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a. 
Sep 4 00:54:04.414738 containerd[1910]: time="2025-09-04T00:54:04.414717984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-g7ckk,Uid:3a4824fe-ec50-448d-ad3a-edd8a6c76035,Namespace:kube-system,Attempt:0,} returns sandbox id \"77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a\"" Sep 4 00:54:04.415814 containerd[1910]: time="2025-09-04T00:54:04.415801556Z" level=info msg="CreateContainer within sandbox \"77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 00:54:04.418923 containerd[1910]: time="2025-09-04T00:54:04.418912080Z" level=info msg="Container 7e6d1c4ee3e37c27f3010f05a67820e9f09bc69012ee6907436902d5c8386c22: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:54:04.420966 containerd[1910]: time="2025-09-04T00:54:04.420952700Z" level=info msg="CreateContainer within sandbox \"77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7e6d1c4ee3e37c27f3010f05a67820e9f09bc69012ee6907436902d5c8386c22\"" Sep 4 00:54:04.421221 containerd[1910]: time="2025-09-04T00:54:04.421166703Z" level=info msg="StartContainer for \"7e6d1c4ee3e37c27f3010f05a67820e9f09bc69012ee6907436902d5c8386c22\"" Sep 4 00:54:04.421725 containerd[1910]: time="2025-09-04T00:54:04.421685844Z" level=info msg="connecting to shim 7e6d1c4ee3e37c27f3010f05a67820e9f09bc69012ee6907436902d5c8386c22" address="unix:///run/containerd/s/29bd751ff5863e97882dd86c622a0ce397aa9f9ec601a5d53f11f373eed9f34a" protocol=ttrpc version=3 Sep 4 00:54:04.437573 kubelet[3260]: I0904 00:54:04.437544 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7bff6d4b9b-qgxww" podStartSLOduration=1.5620872399999999 podStartE2EDuration="5.43753293s" podCreationTimestamp="2025-09-04 00:53:59 +0000 UTC" firstStartedPulling="2025-09-04 00:53:59.8695734 +0000 UTC m=+31.663683536" lastFinishedPulling="2025-09-04 00:54:03.745019089 +0000 UTC m=+35.539129226" observedRunningTime="2025-09-04 00:54:04.437108476 +0000 UTC m=+36.231218613" watchObservedRunningTime="2025-09-04 00:54:04.43753293 +0000 UTC m=+36.231643064" Sep 4 00:54:04.439288 systemd[1]: Started cri-containerd-7e6d1c4ee3e37c27f3010f05a67820e9f09bc69012ee6907436902d5c8386c22.scope - libcontainer container 7e6d1c4ee3e37c27f3010f05a67820e9f09bc69012ee6907436902d5c8386c22. 
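The pod_startup_latency_tracker entry above reports podStartE2EDuration=5.43753293s and podStartSLOduration≈1.562s for whisker-7bff6d4b9b-qgxww. From the timestamps printed in that same entry, the E2E figure is observedRunningTime minus podCreationTimestamp, and the SLO figure additionally excludes the image-pull window (lastFinishedPulling minus firstStartedPulling); that relationship is inferred from the numbers, so the sketch below is illustrative arithmetic rather than kubelet code, and its results land within a millisecond of the logged durations because kubelet snapshots slightly different internal timestamps.

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// time.Parse also accepts the fractional seconds even though the layout omits them.
	const layout = "2006-01-02 15:04:05 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Timestamps copied from the kubelet entry for whisker-7bff6d4b9b-qgxww above.
	created := parse("2025-09-04 00:53:59 +0000 UTC")
	running := parse("2025-09-04 00:54:04.437108476 +0000 UTC")
	pullStart := parse("2025-09-04 00:53:59.8695734 +0000 UTC")
	pullEnd := parse("2025-09-04 00:54:03.745019089 +0000 UTC")

	e2e := running.Sub(created)         // ~5.437s; log reports podStartE2EDuration=5.43753293s
	slo := e2e - pullEnd.Sub(pullStart) // ~1.562s; log reports podStartSLOduration=1.562087...s
	fmt.Println("E2E:", e2e, " SLO:", slo)
}
```

For the two coredns pods reported further down, firstStartedPulling and lastFinishedPulling are the zero time (0001-01-01 00:00:00), so no pull window is subtracted and the SLO duration equals the E2E duration.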
Sep 4 00:54:04.452775 containerd[1910]: time="2025-09-04T00:54:04.452753798Z" level=info msg="StartContainer for \"7e6d1c4ee3e37c27f3010f05a67820e9f09bc69012ee6907436902d5c8386c22\" returns successfully" Sep 4 00:54:04.455163 systemd-networkd[1825]: cali95b1cfcad4b: Link UP Sep 4 00:54:04.455338 systemd-networkd[1825]: cali95b1cfcad4b: Gained carrier Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.318 [INFO][5232] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0 coredns-7c65d6cfc9- kube-system 415fc8d0-9cb1-41e0-a4cc-84e4984d8f32 784 0 2025-09-04 00:53:33 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372.1.0-n-fd36784ab7 coredns-7c65d6cfc9-p69xf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali95b1cfcad4b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" Namespace="kube-system" Pod="coredns-7c65d6cfc9-p69xf" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-" Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.319 [INFO][5232] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" Namespace="kube-system" Pod="coredns-7c65d6cfc9-p69xf" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0" Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.334 [INFO][5276] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" HandleID="k8s-pod-network.db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" Workload="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0" Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.334 [INFO][5276] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" HandleID="k8s-pod-network.db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" Workload="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d0250), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372.1.0-n-fd36784ab7", "pod":"coredns-7c65d6cfc9-p69xf", "timestamp":"2025-09-04 00:54:04.334388715 +0000 UTC"}, Hostname:"ci-4372.1.0-n-fd36784ab7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.334 [INFO][5276] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.354 [INFO][5276] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.354 [INFO][5276] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-fd36784ab7' Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.439 [INFO][5276] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.442 [INFO][5276] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.446 [INFO][5276] ipam/ipam.go 511: Trying affinity for 192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.446 [INFO][5276] ipam/ipam.go 158: Attempting to load block cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.447 [INFO][5276] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.447 [INFO][5276] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.119.192/26 handle="k8s-pod-network.db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.448 [INFO][5276] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88 Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.450 [INFO][5276] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.119.192/26 handle="k8s-pod-network.db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.453 [INFO][5276] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.119.195/26] block=192.168.119.192/26 handle="k8s-pod-network.db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.453 [INFO][5276] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.119.195/26] handle="k8s-pod-network.db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.453 [INFO][5276] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 00:54:04.460974 containerd[1910]: 2025-09-04 00:54:04.453 [INFO][5276] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.195/26] IPv6=[] ContainerID="db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" HandleID="k8s-pod-network.db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" Workload="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0" Sep 4 00:54:04.461573 containerd[1910]: 2025-09-04 00:54:04.454 [INFO][5232] cni-plugin/k8s.go 418: Populated endpoint ContainerID="db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" Namespace="kube-system" Pod="coredns-7c65d6cfc9-p69xf" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"415fc8d0-9cb1-41e0-a4cc-84e4984d8f32", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"", Pod:"coredns-7c65d6cfc9-p69xf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.119.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali95b1cfcad4b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:04.461573 containerd[1910]: 2025-09-04 00:54:04.454 [INFO][5232] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.195/32] ContainerID="db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" Namespace="kube-system" Pod="coredns-7c65d6cfc9-p69xf" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0" Sep 4 00:54:04.461573 containerd[1910]: 2025-09-04 00:54:04.454 [INFO][5232] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali95b1cfcad4b ContainerID="db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" Namespace="kube-system" Pod="coredns-7c65d6cfc9-p69xf" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0" Sep 4 00:54:04.461573 containerd[1910]: 2025-09-04 00:54:04.455 [INFO][5232] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-p69xf" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0" Sep 4 00:54:04.461573 containerd[1910]: 2025-09-04 00:54:04.455 [INFO][5232] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" Namespace="kube-system" Pod="coredns-7c65d6cfc9-p69xf" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"415fc8d0-9cb1-41e0-a4cc-84e4984d8f32", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88", Pod:"coredns-7c65d6cfc9-p69xf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.119.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali95b1cfcad4b", MAC:"3e:3f:ec:6c:aa:0d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:04.461573 containerd[1910]: 2025-09-04 00:54:04.460 [INFO][5232] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" Namespace="kube-system" Pod="coredns-7c65d6cfc9-p69xf" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-coredns--7c65d6cfc9--p69xf-eth0" Sep 4 00:54:04.468799 containerd[1910]: time="2025-09-04T00:54:04.468764374Z" level=info msg="connecting to shim db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88" address="unix:///run/containerd/s/fa926a9d1eb1f58e6e9c46bed30eba369455236924863bad4e3152f58d6a1768" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:54:04.490273 systemd[1]: Started cri-containerd-db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88.scope - libcontainer container db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88. 
Sep 4 00:54:04.516353 containerd[1910]: time="2025-09-04T00:54:04.516331394Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-p69xf,Uid:415fc8d0-9cb1-41e0-a4cc-84e4984d8f32,Namespace:kube-system,Attempt:0,} returns sandbox id \"db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88\"" Sep 4 00:54:04.517401 containerd[1910]: time="2025-09-04T00:54:04.517389903Z" level=info msg="CreateContainer within sandbox \"db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 00:54:04.520680 containerd[1910]: time="2025-09-04T00:54:04.520665649Z" level=info msg="Container f4b492210bfebd9a3d0e0335a595feda1b3b341e9933cff3bdb68e9772ef93dc: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:54:04.522764 containerd[1910]: time="2025-09-04T00:54:04.522749448Z" level=info msg="CreateContainer within sandbox \"db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f4b492210bfebd9a3d0e0335a595feda1b3b341e9933cff3bdb68e9772ef93dc\"" Sep 4 00:54:04.522928 containerd[1910]: time="2025-09-04T00:54:04.522916242Z" level=info msg="StartContainer for \"f4b492210bfebd9a3d0e0335a595feda1b3b341e9933cff3bdb68e9772ef93dc\"" Sep 4 00:54:04.523325 containerd[1910]: time="2025-09-04T00:54:04.523313252Z" level=info msg="connecting to shim f4b492210bfebd9a3d0e0335a595feda1b3b341e9933cff3bdb68e9772ef93dc" address="unix:///run/containerd/s/fa926a9d1eb1f58e6e9c46bed30eba369455236924863bad4e3152f58d6a1768" protocol=ttrpc version=3 Sep 4 00:54:04.547210 systemd[1]: Started cri-containerd-f4b492210bfebd9a3d0e0335a595feda1b3b341e9933cff3bdb68e9772ef93dc.scope - libcontainer container f4b492210bfebd9a3d0e0335a595feda1b3b341e9933cff3bdb68e9772ef93dc. 
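Each "connecting to shim ... namespace=k8s.io protocol=ttrpc version=3" entry corresponds to containerd wiring up a task for the new container in the k8s.io namespace. A hedged sketch of inspecting those containers with the containerd 1.x Go client; the socket path /run/containerd/containerd.sock is the conventional default and does not appear in this log, and error handling is trimmed:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Conventional containerd socket; the log only shows per-container shim sockets.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Kubernetes-managed containers live in the k8s.io namespace (namespace=k8s.io above).
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	containers, err := client.Containers(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range containers {
		status := "no task"
		if task, err := c.Task(ctx, nil); err == nil {
			if st, err := task.Status(ctx); err == nil {
				status = string(st.Status)
			}
		}
		fmt.Printf("%s\t%s\n", c.ID(), status)
	}
}
```

Run on the node, this should list IDs such as 7e6d1c4ee3e37c27f3010f05a67820e9f09bc69012ee6907436902d5c8386c22 and f4b492210bfebd9a3d0e0335a595feda1b3b341e9933cff3bdb68e9772ef93dc alongside their task status.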
Sep 4 00:54:04.563376 containerd[1910]: time="2025-09-04T00:54:04.563297962Z" level=info msg="StartContainer for \"f4b492210bfebd9a3d0e0335a595feda1b3b341e9933cff3bdb68e9772ef93dc\" returns successfully" Sep 4 00:54:05.298060 containerd[1910]: time="2025-09-04T00:54:05.297960457Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-x6zjt,Uid:a488579f-53d3-4d0a-bc28-666d1db626ef,Namespace:calico-system,Attempt:0,}" Sep 4 00:54:05.298890 containerd[1910]: time="2025-09-04T00:54:05.298095172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-89q69,Uid:df69562f-87c3-42fe-a794-4eb1c96d7d52,Namespace:calico-system,Attempt:0,}" Sep 4 00:54:05.353260 systemd-networkd[1825]: cali62fcd025f6f: Link UP Sep 4 00:54:05.353401 systemd-networkd[1825]: cali62fcd025f6f: Gained carrier Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.318 [INFO][5521] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0 goldmane-7988f88666- calico-system a488579f-53d3-4d0a-bc28-666d1db626ef 791 0 2025-09-04 00:53:42 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4372.1.0-n-fd36784ab7 goldmane-7988f88666-x6zjt eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali62fcd025f6f [] [] }} ContainerID="e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" Namespace="calico-system" Pod="goldmane-7988f88666-x6zjt" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-" Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.318 [INFO][5521] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" Namespace="calico-system" Pod="goldmane-7988f88666-x6zjt" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0" Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.331 [INFO][5566] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" HandleID="k8s-pod-network.e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" Workload="ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0" Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.331 [INFO][5566] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" HandleID="k8s-pod-network.e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" Workload="ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001a57a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-fd36784ab7", "pod":"goldmane-7988f88666-x6zjt", "timestamp":"2025-09-04 00:54:05.331799085 +0000 UTC"}, Hostname:"ci-4372.1.0-n-fd36784ab7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.331 [INFO][5566] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.331 [INFO][5566] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.331 [INFO][5566] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-fd36784ab7' Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.336 [INFO][5566] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.339 [INFO][5566] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.342 [INFO][5566] ipam/ipam.go 511: Trying affinity for 192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.343 [INFO][5566] ipam/ipam.go 158: Attempting to load block cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.345 [INFO][5566] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.345 [INFO][5566] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.119.192/26 handle="k8s-pod-network.e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.346 [INFO][5566] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242 Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.348 [INFO][5566] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.119.192/26 handle="k8s-pod-network.e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.351 [INFO][5566] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.119.196/26] block=192.168.119.192/26 handle="k8s-pod-network.e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.351 [INFO][5566] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.119.196/26] handle="k8s-pod-network.e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.351 [INFO][5566] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 00:54:05.360394 containerd[1910]: 2025-09-04 00:54:05.351 [INFO][5566] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.196/26] IPv6=[] ContainerID="e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" HandleID="k8s-pod-network.e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" Workload="ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0" Sep 4 00:54:05.360829 containerd[1910]: 2025-09-04 00:54:05.352 [INFO][5521] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" Namespace="calico-system" Pod="goldmane-7988f88666-x6zjt" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"a488579f-53d3-4d0a-bc28-666d1db626ef", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"", Pod:"goldmane-7988f88666-x6zjt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.119.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali62fcd025f6f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:05.360829 containerd[1910]: 2025-09-04 00:54:05.352 [INFO][5521] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.196/32] ContainerID="e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" Namespace="calico-system" Pod="goldmane-7988f88666-x6zjt" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0" Sep 4 00:54:05.360829 containerd[1910]: 2025-09-04 00:54:05.352 [INFO][5521] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali62fcd025f6f ContainerID="e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" Namespace="calico-system" Pod="goldmane-7988f88666-x6zjt" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0" Sep 4 00:54:05.360829 containerd[1910]: 2025-09-04 00:54:05.353 [INFO][5521] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" Namespace="calico-system" Pod="goldmane-7988f88666-x6zjt" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0" Sep 4 00:54:05.360829 containerd[1910]: 2025-09-04 00:54:05.353 [INFO][5521] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" 
Namespace="calico-system" Pod="goldmane-7988f88666-x6zjt" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"a488579f-53d3-4d0a-bc28-666d1db626ef", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242", Pod:"goldmane-7988f88666-x6zjt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.119.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali62fcd025f6f", MAC:"42:80:94:de:3c:d6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:05.360829 containerd[1910]: 2025-09-04 00:54:05.359 [INFO][5521] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" Namespace="calico-system" Pod="goldmane-7988f88666-x6zjt" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-goldmane--7988f88666--x6zjt-eth0" Sep 4 00:54:05.368208 containerd[1910]: time="2025-09-04T00:54:05.368180260Z" level=info msg="connecting to shim e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242" address="unix:///run/containerd/s/dd806b8637d0e3a3a14694e0969dc2935240c953e10d3757eaaeab49e2e3621d" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:54:05.394657 systemd[1]: Started cri-containerd-e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242.scope - libcontainer container e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242. 
Sep 4 00:54:05.456008 kubelet[3260]: I0904 00:54:05.455950 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-g7ckk" podStartSLOduration=32.455930157 podStartE2EDuration="32.455930157s" podCreationTimestamp="2025-09-04 00:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 00:54:05.455453211 +0000 UTC m=+37.249563375" watchObservedRunningTime="2025-09-04 00:54:05.455930157 +0000 UTC m=+37.250040305" Sep 4 00:54:05.464405 kubelet[3260]: I0904 00:54:05.464342 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-p69xf" podStartSLOduration=32.464322389 podStartE2EDuration="32.464322389s" podCreationTimestamp="2025-09-04 00:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 00:54:05.463850942 +0000 UTC m=+37.257961092" watchObservedRunningTime="2025-09-04 00:54:05.464322389 +0000 UTC m=+37.258432536" Sep 4 00:54:05.470532 systemd-networkd[1825]: cali5434d7b0000: Link UP Sep 4 00:54:05.470765 systemd-networkd[1825]: cali5434d7b0000: Gained carrier Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.318 [INFO][5527] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0 csi-node-driver- calico-system df69562f-87c3-42fe-a794-4eb1c96d7d52 671 0 2025-09-04 00:53:43 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4372.1.0-n-fd36784ab7 csi-node-driver-89q69 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali5434d7b0000 [] [] }} ContainerID="98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" Namespace="calico-system" Pod="csi-node-driver-89q69" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-" Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.318 [INFO][5527] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" Namespace="calico-system" Pod="csi-node-driver-89q69" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0" Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.331 [INFO][5565] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" HandleID="k8s-pod-network.98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" Workload="ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0" Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.332 [INFO][5565] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" HandleID="k8s-pod-network.98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" Workload="ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e78a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-fd36784ab7", 
"pod":"csi-node-driver-89q69", "timestamp":"2025-09-04 00:54:05.331985319 +0000 UTC"}, Hostname:"ci-4372.1.0-n-fd36784ab7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.332 [INFO][5565] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.351 [INFO][5565] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.351 [INFO][5565] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-fd36784ab7' Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.439 [INFO][5565] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.448 [INFO][5565] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.454 [INFO][5565] ipam/ipam.go 511: Trying affinity for 192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.456 [INFO][5565] ipam/ipam.go 158: Attempting to load block cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.458 [INFO][5565] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.458 [INFO][5565] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.119.192/26 handle="k8s-pod-network.98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.460 [INFO][5565] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.462 [INFO][5565] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.119.192/26 handle="k8s-pod-network.98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.467 [INFO][5565] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.119.197/26] block=192.168.119.192/26 handle="k8s-pod-network.98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.467 [INFO][5565] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.119.197/26] handle="k8s-pod-network.98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.467 [INFO][5565] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 00:54:05.480182 containerd[1910]: 2025-09-04 00:54:05.467 [INFO][5565] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.197/26] IPv6=[] ContainerID="98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" HandleID="k8s-pod-network.98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" Workload="ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0" Sep 4 00:54:05.480802 containerd[1910]: 2025-09-04 00:54:05.469 [INFO][5527] cni-plugin/k8s.go 418: Populated endpoint ContainerID="98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" Namespace="calico-system" Pod="csi-node-driver-89q69" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"df69562f-87c3-42fe-a794-4eb1c96d7d52", ResourceVersion:"671", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"", Pod:"csi-node-driver-89q69", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.119.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5434d7b0000", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:05.480802 containerd[1910]: 2025-09-04 00:54:05.469 [INFO][5527] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.197/32] ContainerID="98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" Namespace="calico-system" Pod="csi-node-driver-89q69" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0" Sep 4 00:54:05.480802 containerd[1910]: 2025-09-04 00:54:05.469 [INFO][5527] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5434d7b0000 ContainerID="98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" Namespace="calico-system" Pod="csi-node-driver-89q69" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0" Sep 4 00:54:05.480802 containerd[1910]: 2025-09-04 00:54:05.470 [INFO][5527] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" Namespace="calico-system" Pod="csi-node-driver-89q69" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0" Sep 4 00:54:05.480802 containerd[1910]: 2025-09-04 00:54:05.471 [INFO][5527] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" Namespace="calico-system" Pod="csi-node-driver-89q69" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"df69562f-87c3-42fe-a794-4eb1c96d7d52", ResourceVersion:"671", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f", Pod:"csi-node-driver-89q69", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.119.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5434d7b0000", MAC:"02:86:54:73:d7:30", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:05.480802 containerd[1910]: 2025-09-04 00:54:05.478 [INFO][5527] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" Namespace="calico-system" Pod="csi-node-driver-89q69" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-csi--node--driver--89q69-eth0" Sep 4 00:54:05.481032 containerd[1910]: time="2025-09-04T00:54:05.480871254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-x6zjt,Uid:a488579f-53d3-4d0a-bc28-666d1db626ef,Namespace:calico-system,Attempt:0,} returns sandbox id \"e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242\"" Sep 4 00:54:05.481734 containerd[1910]: time="2025-09-04T00:54:05.481720913Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 4 00:54:05.487729 containerd[1910]: time="2025-09-04T00:54:05.487702666Z" level=info msg="connecting to shim 98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f" address="unix:///run/containerd/s/51c23b062b65ae91d6cba595466224d9e02aa57b26508d343b29da194400d421" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:54:05.508380 systemd-networkd[1825]: cali43134a46593: Gained IPv6LL Sep 4 00:54:05.510389 systemd[1]: Started cri-containerd-98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f.scope - libcontainer container 98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f. 
Sep 4 00:54:05.522095 containerd[1910]: time="2025-09-04T00:54:05.522076299Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-89q69,Uid:df69562f-87c3-42fe-a794-4eb1c96d7d52,Namespace:calico-system,Attempt:0,} returns sandbox id \"98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f\"" Sep 4 00:54:06.277591 systemd-networkd[1825]: cali95b1cfcad4b: Gained IPv6LL Sep 4 00:54:06.298050 containerd[1910]: time="2025-09-04T00:54:06.297928398Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6476566667-n6xq7,Uid:6b06bc34-c2ce-453b-b38a-15c95898d6db,Namespace:calico-apiserver,Attempt:0,}" Sep 4 00:54:06.355897 systemd-networkd[1825]: cali3ceb955780b: Link UP Sep 4 00:54:06.356135 systemd-networkd[1825]: cali3ceb955780b: Gained carrier Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.316 [INFO][5713] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0 calico-apiserver-6476566667- calico-apiserver 6b06bc34-c2ce-453b-b38a-15c95898d6db 793 0 2025-09-04 00:53:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6476566667 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-n-fd36784ab7 calico-apiserver-6476566667-n6xq7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3ceb955780b [] [] }} ContainerID="a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-n6xq7" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-" Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.316 [INFO][5713] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-n6xq7" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0" Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.330 [INFO][5738] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" HandleID="k8s-pod-network.a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" Workload="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0" Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.330 [INFO][5738] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" HandleID="k8s-pod-network.a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" Workload="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f750), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-n-fd36784ab7", "pod":"calico-apiserver-6476566667-n6xq7", "timestamp":"2025-09-04 00:54:06.330071107 +0000 UTC"}, Hostname:"ci-4372.1.0-n-fd36784ab7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 00:54:06.364653 containerd[1910]: 
2025-09-04 00:54:06.330 [INFO][5738] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.330 [INFO][5738] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.330 [INFO][5738] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-fd36784ab7' Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.335 [INFO][5738] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.338 [INFO][5738] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.341 [INFO][5738] ipam/ipam.go 511: Trying affinity for 192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.343 [INFO][5738] ipam/ipam.go 158: Attempting to load block cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.344 [INFO][5738] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.344 [INFO][5738] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.119.192/26 handle="k8s-pod-network.a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.346 [INFO][5738] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1 Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.349 [INFO][5738] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.119.192/26 handle="k8s-pod-network.a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.353 [INFO][5738] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.119.198/26] block=192.168.119.192/26 handle="k8s-pod-network.a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.353 [INFO][5738] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.119.198/26] handle="k8s-pod-network.a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.353 [INFO][5738] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 00:54:06.364653 containerd[1910]: 2025-09-04 00:54:06.353 [INFO][5738] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.198/26] IPv6=[] ContainerID="a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" HandleID="k8s-pod-network.a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" Workload="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0" Sep 4 00:54:06.365584 containerd[1910]: 2025-09-04 00:54:06.354 [INFO][5713] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-n6xq7" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0", GenerateName:"calico-apiserver-6476566667-", Namespace:"calico-apiserver", SelfLink:"", UID:"6b06bc34-c2ce-453b-b38a-15c95898d6db", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6476566667", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"", Pod:"calico-apiserver-6476566667-n6xq7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.119.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3ceb955780b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:06.365584 containerd[1910]: 2025-09-04 00:54:06.354 [INFO][5713] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.198/32] ContainerID="a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-n6xq7" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0" Sep 4 00:54:06.365584 containerd[1910]: 2025-09-04 00:54:06.354 [INFO][5713] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3ceb955780b ContainerID="a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-n6xq7" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0" Sep 4 00:54:06.365584 containerd[1910]: 2025-09-04 00:54:06.356 [INFO][5713] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-n6xq7" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0" Sep 4 00:54:06.365584 containerd[1910]: 2025-09-04 00:54:06.356 
[INFO][5713] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-n6xq7" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0", GenerateName:"calico-apiserver-6476566667-", Namespace:"calico-apiserver", SelfLink:"", UID:"6b06bc34-c2ce-453b-b38a-15c95898d6db", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6476566667", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1", Pod:"calico-apiserver-6476566667-n6xq7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.119.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3ceb955780b", MAC:"66:16:4d:81:a5:d4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:06.365584 containerd[1910]: 2025-09-04 00:54:06.362 [INFO][5713] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-n6xq7" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--n6xq7-eth0" Sep 4 00:54:06.373649 containerd[1910]: time="2025-09-04T00:54:06.373620891Z" level=info msg="connecting to shim a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1" address="unix:///run/containerd/s/ea6358db11e330410683c35ff69fd653820971662c55119700314d170069e242" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:54:06.402369 systemd[1]: Started cri-containerd-a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1.scope - libcontainer container a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1. 
Sep 4 00:54:06.433616 containerd[1910]: time="2025-09-04T00:54:06.433591589Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6476566667-n6xq7,Uid:6b06bc34-c2ce-453b-b38a-15c95898d6db,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1\"" Sep 4 00:54:06.532419 systemd-networkd[1825]: cali5434d7b0000: Gained IPv6LL Sep 4 00:54:07.044411 systemd-networkd[1825]: cali62fcd025f6f: Gained IPv6LL Sep 4 00:54:07.297987 containerd[1910]: time="2025-09-04T00:54:07.297886903Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6476566667-zmqjl,Uid:31036ac4-9f26-4932-a118-c6aacd0fb4f5,Namespace:calico-apiserver,Attempt:0,}" Sep 4 00:54:07.297987 containerd[1910]: time="2025-09-04T00:54:07.297886904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57898dccd4-4fmwz,Uid:15b7b5a8-04c0-4edb-9d50-0a666c7b0278,Namespace:calico-system,Attempt:0,}" Sep 4 00:54:07.352528 systemd-networkd[1825]: cali85ca83fd888: Link UP Sep 4 00:54:07.352692 systemd-networkd[1825]: cali85ca83fd888: Gained carrier Sep 4 00:54:07.364971 systemd-networkd[1825]: cali3ceb955780b: Gained IPv6LL Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.316 [INFO][5812] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0 calico-apiserver-6476566667- calico-apiserver 31036ac4-9f26-4932-a118-c6aacd0fb4f5 790 0 2025-09-04 00:53:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6476566667 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-n-fd36784ab7 calico-apiserver-6476566667-zmqjl eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali85ca83fd888 [] [] }} ContainerID="23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-zmqjl" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-" Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.316 [INFO][5812] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-zmqjl" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0" Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.328 [INFO][5859] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" HandleID="k8s-pod-network.23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" Workload="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0" Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.328 [INFO][5859] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" HandleID="k8s-pod-network.23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" Workload="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fec0), 
Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-n-fd36784ab7", "pod":"calico-apiserver-6476566667-zmqjl", "timestamp":"2025-09-04 00:54:07.328513485 +0000 UTC"}, Hostname:"ci-4372.1.0-n-fd36784ab7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.328 [INFO][5859] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.328 [INFO][5859] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.328 [INFO][5859] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-fd36784ab7' Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.333 [INFO][5859] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.335 [INFO][5859] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.338 [INFO][5859] ipam/ipam.go 511: Trying affinity for 192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.339 [INFO][5859] ipam/ipam.go 158: Attempting to load block cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.340 [INFO][5859] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.340 [INFO][5859] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.119.192/26 handle="k8s-pod-network.23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.340 [INFO][5859] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.347 [INFO][5859] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.119.192/26 handle="k8s-pod-network.23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.350 [INFO][5859] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.119.199/26] block=192.168.119.192/26 handle="k8s-pod-network.23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.350 [INFO][5859] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.119.199/26] handle="k8s-pod-network.23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.350 [INFO][5859] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
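The IPAM trace above ("Trying affinity for 192.168.119.192/26", then "Successfully claimed IPs: [192.168.119.199/26]") is plain block arithmetic: this node holds an affinity for one /26 and every pod address it hands out is carved from that block. A quick check of that arithmetic with Python's standard ipaddress module, using only values copied from the log:

#!/usr/bin/env python3
"""Confirm the pod IPs above sit inside the node's affine /26 block."""
import ipaddress

block = ipaddress.ip_network("192.168.119.192/26")   # node affinity block from the log
assigned = ["192.168.119.198", "192.168.119.199", "192.168.119.200"]

print(f"{block}: {block.num_addresses} addresses, "
      f"{block.network_address} .. {block.broadcast_address}")
for a in assigned:
    ip = ipaddress.ip_address(a)
    # Every address Calico claims for this host falls inside the same block.
    print(f"  {ip} in {block}: {ip in block}")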
Sep 4 00:54:07.367386 containerd[1910]: 2025-09-04 00:54:07.350 [INFO][5859] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.199/26] IPv6=[] ContainerID="23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" HandleID="k8s-pod-network.23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" Workload="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0" Sep 4 00:54:07.367891 containerd[1910]: 2025-09-04 00:54:07.351 [INFO][5812] cni-plugin/k8s.go 418: Populated endpoint ContainerID="23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-zmqjl" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0", GenerateName:"calico-apiserver-6476566667-", Namespace:"calico-apiserver", SelfLink:"", UID:"31036ac4-9f26-4932-a118-c6aacd0fb4f5", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6476566667", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"", Pod:"calico-apiserver-6476566667-zmqjl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.119.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali85ca83fd888", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:07.367891 containerd[1910]: 2025-09-04 00:54:07.351 [INFO][5812] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.199/32] ContainerID="23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-zmqjl" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0" Sep 4 00:54:07.367891 containerd[1910]: 2025-09-04 00:54:07.351 [INFO][5812] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali85ca83fd888 ContainerID="23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-zmqjl" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0" Sep 4 00:54:07.367891 containerd[1910]: 2025-09-04 00:54:07.352 [INFO][5812] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-zmqjl" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0" Sep 4 00:54:07.367891 containerd[1910]: 2025-09-04 00:54:07.353 
[INFO][5812] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-zmqjl" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0", GenerateName:"calico-apiserver-6476566667-", Namespace:"calico-apiserver", SelfLink:"", UID:"31036ac4-9f26-4932-a118-c6aacd0fb4f5", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6476566667", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c", Pod:"calico-apiserver-6476566667-zmqjl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.119.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali85ca83fd888", MAC:"7a:cd:db:0b:c4:b8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:07.367891 containerd[1910]: 2025-09-04 00:54:07.358 [INFO][5812] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" Namespace="calico-apiserver" Pod="calico-apiserver-6476566667-zmqjl" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--apiserver--6476566667--zmqjl-eth0" Sep 4 00:54:07.374795 containerd[1910]: time="2025-09-04T00:54:07.374771356Z" level=info msg="connecting to shim 23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c" address="unix:///run/containerd/s/a2b2ee3324e2b8aa4789cedda6a2e6f8e49129c2505c9be298ebccbdc48bb48f" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:54:07.393237 systemd[1]: Started cri-containerd-23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c.scope - libcontainer container 23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c. 
Sep 4 00:54:07.419653 containerd[1910]: time="2025-09-04T00:54:07.419607891Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6476566667-zmqjl,Uid:31036ac4-9f26-4932-a118-c6aacd0fb4f5,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c\"" Sep 4 00:54:07.448700 systemd-networkd[1825]: cali2f372358ffc: Link UP Sep 4 00:54:07.448885 systemd-networkd[1825]: cali2f372358ffc: Gained carrier Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.316 [INFO][5819] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0 calico-kube-controllers-57898dccd4- calico-system 15b7b5a8-04c0-4edb-9d50-0a666c7b0278 792 0 2025-09-04 00:53:43 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:57898dccd4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4372.1.0-n-fd36784ab7 calico-kube-controllers-57898dccd4-4fmwz eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali2f372358ffc [] [] }} ContainerID="b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" Namespace="calico-system" Pod="calico-kube-controllers-57898dccd4-4fmwz" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-" Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.316 [INFO][5819] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" Namespace="calico-system" Pod="calico-kube-controllers-57898dccd4-4fmwz" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0" Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.328 [INFO][5861] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" HandleID="k8s-pod-network.b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" Workload="ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0" Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.328 [INFO][5861] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" HandleID="k8s-pod-network.b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" Workload="ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000139730), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-fd36784ab7", "pod":"calico-kube-controllers-57898dccd4-4fmwz", "timestamp":"2025-09-04 00:54:07.328513071 +0000 UTC"}, Hostname:"ci-4372.1.0-n-fd36784ab7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.328 [INFO][5861] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.350 [INFO][5861] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.350 [INFO][5861] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-fd36784ab7' Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.433 [INFO][5861] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.435 [INFO][5861] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.438 [INFO][5861] ipam/ipam.go 511: Trying affinity for 192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.439 [INFO][5861] ipam/ipam.go 158: Attempting to load block cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.440 [INFO][5861] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.119.192/26 host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.440 [INFO][5861] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.119.192/26 handle="k8s-pod-network.b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.441 [INFO][5861] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.443 [INFO][5861] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.119.192/26 handle="k8s-pod-network.b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.446 [INFO][5861] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.119.200/26] block=192.168.119.192/26 handle="k8s-pod-network.b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.446 [INFO][5861] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.119.200/26] handle="k8s-pod-network.b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" host="ci-4372.1.0-n-fd36784ab7" Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.446 [INFO][5861] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 00:54:07.454628 containerd[1910]: 2025-09-04 00:54:07.446 [INFO][5861] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.200/26] IPv6=[] ContainerID="b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" HandleID="k8s-pod-network.b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" Workload="ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0" Sep 4 00:54:07.455027 containerd[1910]: 2025-09-04 00:54:07.447 [INFO][5819] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" Namespace="calico-system" Pod="calico-kube-controllers-57898dccd4-4fmwz" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0", GenerateName:"calico-kube-controllers-57898dccd4-", Namespace:"calico-system", SelfLink:"", UID:"15b7b5a8-04c0-4edb-9d50-0a666c7b0278", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"57898dccd4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"", Pod:"calico-kube-controllers-57898dccd4-4fmwz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.119.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2f372358ffc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:07.455027 containerd[1910]: 2025-09-04 00:54:07.447 [INFO][5819] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.200/32] ContainerID="b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" Namespace="calico-system" Pod="calico-kube-controllers-57898dccd4-4fmwz" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0" Sep 4 00:54:07.455027 containerd[1910]: 2025-09-04 00:54:07.447 [INFO][5819] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2f372358ffc ContainerID="b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" Namespace="calico-system" Pod="calico-kube-controllers-57898dccd4-4fmwz" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0" Sep 4 00:54:07.455027 containerd[1910]: 2025-09-04 00:54:07.448 [INFO][5819] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" Namespace="calico-system" Pod="calico-kube-controllers-57898dccd4-4fmwz" 
WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0" Sep 4 00:54:07.455027 containerd[1910]: 2025-09-04 00:54:07.449 [INFO][5819] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" Namespace="calico-system" Pod="calico-kube-controllers-57898dccd4-4fmwz" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0", GenerateName:"calico-kube-controllers-57898dccd4-", Namespace:"calico-system", SelfLink:"", UID:"15b7b5a8-04c0-4edb-9d50-0a666c7b0278", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 0, 53, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"57898dccd4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-fd36784ab7", ContainerID:"b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb", Pod:"calico-kube-controllers-57898dccd4-4fmwz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.119.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2f372358ffc", MAC:"b6:2f:99:6e:cf:a4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 00:54:07.455027 containerd[1910]: 2025-09-04 00:54:07.453 [INFO][5819] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" Namespace="calico-system" Pod="calico-kube-controllers-57898dccd4-4fmwz" WorkloadEndpoint="ci--4372.1.0--n--fd36784ab7-k8s-calico--kube--controllers--57898dccd4--4fmwz-eth0" Sep 4 00:54:07.462620 containerd[1910]: time="2025-09-04T00:54:07.462595089Z" level=info msg="connecting to shim b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb" address="unix:///run/containerd/s/7ea0de701b85ae300cf55a58ff1c1c42e6844069f74a93a7994fd49d4c7b9ab0" namespace=k8s.io protocol=ttrpc version=3 Sep 4 00:54:07.482396 systemd[1]: Started cri-containerd-b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb.scope - libcontainer container b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb. 
Sep 4 00:54:07.569938 containerd[1910]: time="2025-09-04T00:54:07.569666097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57898dccd4-4fmwz,Uid:15b7b5a8-04c0-4edb-9d50-0a666c7b0278,Namespace:calico-system,Attempt:0,} returns sandbox id \"b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb\"" Sep 4 00:54:08.008953 containerd[1910]: time="2025-09-04T00:54:08.008899914Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:08.009137 containerd[1910]: time="2025-09-04T00:54:08.009121363Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 4 00:54:08.009414 containerd[1910]: time="2025-09-04T00:54:08.009399979Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:08.010348 containerd[1910]: time="2025-09-04T00:54:08.010333697Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:08.010804 containerd[1910]: time="2025-09-04T00:54:08.010758796Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 2.529021126s" Sep 4 00:54:08.010804 containerd[1910]: time="2025-09-04T00:54:08.010776424Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 4 00:54:08.011244 containerd[1910]: time="2025-09-04T00:54:08.011209987Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 4 00:54:08.011784 containerd[1910]: time="2025-09-04T00:54:08.011769341Z" level=info msg="CreateContainer within sandbox \"e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 4 00:54:08.014343 containerd[1910]: time="2025-09-04T00:54:08.014302026Z" level=info msg="Container 2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:54:08.016913 containerd[1910]: time="2025-09-04T00:54:08.016899261Z" level=info msg="CreateContainer within sandbox \"e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\"" Sep 4 00:54:08.017089 containerd[1910]: time="2025-09-04T00:54:08.017077211Z" level=info msg="StartContainer for \"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\"" Sep 4 00:54:08.017585 containerd[1910]: time="2025-09-04T00:54:08.017571574Z" level=info msg="connecting to shim 2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c" address="unix:///run/containerd/s/dd806b8637d0e3a3a14694e0969dc2935240c953e10d3757eaaeab49e2e3621d" protocol=ttrpc version=3 Sep 4 00:54:08.040425 systemd[1]: Started 
cri-containerd-2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c.scope - libcontainer container 2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c. Sep 4 00:54:08.071250 containerd[1910]: time="2025-09-04T00:54:08.071225206Z" level=info msg="StartContainer for \"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" returns successfully" Sep 4 00:54:08.458483 kubelet[3260]: I0904 00:54:08.458428 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-x6zjt" podStartSLOduration=23.928847926 podStartE2EDuration="26.45841468s" podCreationTimestamp="2025-09-04 00:53:42 +0000 UTC" firstStartedPulling="2025-09-04 00:54:05.481594181 +0000 UTC m=+37.275704321" lastFinishedPulling="2025-09-04 00:54:08.011160939 +0000 UTC m=+39.805271075" observedRunningTime="2025-09-04 00:54:08.458269136 +0000 UTC m=+40.252379286" watchObservedRunningTime="2025-09-04 00:54:08.45841468 +0000 UTC m=+40.252524824" Sep 4 00:54:08.772433 systemd-networkd[1825]: cali2f372358ffc: Gained IPv6LL Sep 4 00:54:09.221502 systemd-networkd[1825]: cali85ca83fd888: Gained IPv6LL Sep 4 00:54:09.503510 containerd[1910]: time="2025-09-04T00:54:09.503446662Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"5de918c2974f93c273f6865207fa5c248e598b618055b0ba956c0fdc585762ae\" pid:6079 exit_status:1 exited_at:{seconds:1756947249 nanos:503233244}" Sep 4 00:54:09.512393 containerd[1910]: time="2025-09-04T00:54:09.512350938Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:09.512624 containerd[1910]: time="2025-09-04T00:54:09.512587678Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 4 00:54:09.513018 containerd[1910]: time="2025-09-04T00:54:09.512984712Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:09.513970 containerd[1910]: time="2025-09-04T00:54:09.513937005Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:09.514388 containerd[1910]: time="2025-09-04T00:54:09.514344909Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.503117484s" Sep 4 00:54:09.514388 containerd[1910]: time="2025-09-04T00:54:09.514381465Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 4 00:54:09.514859 containerd[1910]: time="2025-09-04T00:54:09.514847718Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 4 00:54:09.515334 containerd[1910]: time="2025-09-04T00:54:09.515321683Z" level=info msg="CreateContainer within sandbox \"98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 4 00:54:09.519435 
containerd[1910]: time="2025-09-04T00:54:09.519389209Z" level=info msg="Container c154c351bf5c3559c6b19d21a2df96750d3b6bc7268071d6a230e55e5b7bc389: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:54:09.522732 containerd[1910]: time="2025-09-04T00:54:09.522695673Z" level=info msg="CreateContainer within sandbox \"98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"c154c351bf5c3559c6b19d21a2df96750d3b6bc7268071d6a230e55e5b7bc389\"" Sep 4 00:54:09.523011 containerd[1910]: time="2025-09-04T00:54:09.522997608Z" level=info msg="StartContainer for \"c154c351bf5c3559c6b19d21a2df96750d3b6bc7268071d6a230e55e5b7bc389\"" Sep 4 00:54:09.523885 containerd[1910]: time="2025-09-04T00:54:09.523845415Z" level=info msg="connecting to shim c154c351bf5c3559c6b19d21a2df96750d3b6bc7268071d6a230e55e5b7bc389" address="unix:///run/containerd/s/51c23b062b65ae91d6cba595466224d9e02aa57b26508d343b29da194400d421" protocol=ttrpc version=3 Sep 4 00:54:09.541376 systemd[1]: Started cri-containerd-c154c351bf5c3559c6b19d21a2df96750d3b6bc7268071d6a230e55e5b7bc389.scope - libcontainer container c154c351bf5c3559c6b19d21a2df96750d3b6bc7268071d6a230e55e5b7bc389. Sep 4 00:54:09.562188 containerd[1910]: time="2025-09-04T00:54:09.562158952Z" level=info msg="StartContainer for \"c154c351bf5c3559c6b19d21a2df96750d3b6bc7268071d6a230e55e5b7bc389\" returns successfully" Sep 4 00:54:10.532023 containerd[1910]: time="2025-09-04T00:54:10.531966694Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"ef99265d4cd5b0051378c3fa7c32f56a61eeab52fc606fd4c3a6e30f20e8801e\" pid:6148 exit_status:1 exited_at:{seconds:1756947250 nanos:531782939}" Sep 4 00:54:12.011572 containerd[1910]: time="2025-09-04T00:54:12.011520160Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:12.011781 containerd[1910]: time="2025-09-04T00:54:12.011753755Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 4 00:54:12.012163 containerd[1910]: time="2025-09-04T00:54:12.012105488Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:12.012973 containerd[1910]: time="2025-09-04T00:54:12.012932540Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:12.013384 containerd[1910]: time="2025-09-04T00:54:12.013344385Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.498482382s" Sep 4 00:54:12.013384 containerd[1910]: time="2025-09-04T00:54:12.013359194Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 4 00:54:12.013810 containerd[1910]: time="2025-09-04T00:54:12.013771964Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 4 00:54:12.014260 containerd[1910]: time="2025-09-04T00:54:12.014219718Z" level=info msg="CreateContainer within sandbox \"a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 00:54:12.016631 containerd[1910]: time="2025-09-04T00:54:12.016619562Z" level=info msg="Container 0e64006b6239388908b5108a8c45bb65543839f65de56d302a79b2a401d35a1f: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:54:12.019168 containerd[1910]: time="2025-09-04T00:54:12.019107807Z" level=info msg="CreateContainer within sandbox \"a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"0e64006b6239388908b5108a8c45bb65543839f65de56d302a79b2a401d35a1f\"" Sep 4 00:54:12.019355 containerd[1910]: time="2025-09-04T00:54:12.019300717Z" level=info msg="StartContainer for \"0e64006b6239388908b5108a8c45bb65543839f65de56d302a79b2a401d35a1f\"" Sep 4 00:54:12.019823 containerd[1910]: time="2025-09-04T00:54:12.019782785Z" level=info msg="connecting to shim 0e64006b6239388908b5108a8c45bb65543839f65de56d302a79b2a401d35a1f" address="unix:///run/containerd/s/ea6358db11e330410683c35ff69fd653820971662c55119700314d170069e242" protocol=ttrpc version=3 Sep 4 00:54:12.042369 systemd[1]: Started cri-containerd-0e64006b6239388908b5108a8c45bb65543839f65de56d302a79b2a401d35a1f.scope - libcontainer container 0e64006b6239388908b5108a8c45bb65543839f65de56d302a79b2a401d35a1f. Sep 4 00:54:12.080531 containerd[1910]: time="2025-09-04T00:54:12.080506335Z" level=info msg="StartContainer for \"0e64006b6239388908b5108a8c45bb65543839f65de56d302a79b2a401d35a1f\" returns successfully" Sep 4 00:54:12.464096 kubelet[3260]: I0904 00:54:12.464059 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6476566667-n6xq7" podStartSLOduration=26.884450357 podStartE2EDuration="32.464047211s" podCreationTimestamp="2025-09-04 00:53:40 +0000 UTC" firstStartedPulling="2025-09-04 00:54:06.434131578 +0000 UTC m=+38.228241714" lastFinishedPulling="2025-09-04 00:54:12.013728433 +0000 UTC m=+43.807838568" observedRunningTime="2025-09-04 00:54:12.463723 +0000 UTC m=+44.257833141" watchObservedRunningTime="2025-09-04 00:54:12.464047211 +0000 UTC m=+44.258157346" Sep 4 00:54:12.483314 containerd[1910]: time="2025-09-04T00:54:12.483292640Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:12.483476 containerd[1910]: time="2025-09-04T00:54:12.483461912Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 4 00:54:12.484530 containerd[1910]: time="2025-09-04T00:54:12.484498839Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 470.713491ms" Sep 4 00:54:12.484530 containerd[1910]: time="2025-09-04T00:54:12.484514175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 4 00:54:12.485020 
containerd[1910]: time="2025-09-04T00:54:12.485008298Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 4 00:54:12.485675 containerd[1910]: time="2025-09-04T00:54:12.485639886Z" level=info msg="CreateContainer within sandbox \"23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 00:54:12.488497 containerd[1910]: time="2025-09-04T00:54:12.488456998Z" level=info msg="Container f922a85994497fa190e19e3151946b32c0dec9b42f1d73efb56568c8ea013ba2: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:54:12.491333 containerd[1910]: time="2025-09-04T00:54:12.491298375Z" level=info msg="CreateContainer within sandbox \"23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"f922a85994497fa190e19e3151946b32c0dec9b42f1d73efb56568c8ea013ba2\"" Sep 4 00:54:12.491629 containerd[1910]: time="2025-09-04T00:54:12.491573508Z" level=info msg="StartContainer for \"f922a85994497fa190e19e3151946b32c0dec9b42f1d73efb56568c8ea013ba2\"" Sep 4 00:54:12.492181 containerd[1910]: time="2025-09-04T00:54:12.492164222Z" level=info msg="connecting to shim f922a85994497fa190e19e3151946b32c0dec9b42f1d73efb56568c8ea013ba2" address="unix:///run/containerd/s/a2b2ee3324e2b8aa4789cedda6a2e6f8e49129c2505c9be298ebccbdc48bb48f" protocol=ttrpc version=3 Sep 4 00:54:12.519304 systemd[1]: Started cri-containerd-f922a85994497fa190e19e3151946b32c0dec9b42f1d73efb56568c8ea013ba2.scope - libcontainer container f922a85994497fa190e19e3151946b32c0dec9b42f1d73efb56568c8ea013ba2. Sep 4 00:54:12.547701 containerd[1910]: time="2025-09-04T00:54:12.547675090Z" level=info msg="StartContainer for \"f922a85994497fa190e19e3151946b32c0dec9b42f1d73efb56568c8ea013ba2\" returns successfully" Sep 4 00:54:13.459226 kubelet[3260]: I0904 00:54:13.459211 3260 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 00:54:13.466363 kubelet[3260]: I0904 00:54:13.466331 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6476566667-zmqjl" podStartSLOduration=28.401526934 podStartE2EDuration="33.466319585s" podCreationTimestamp="2025-09-04 00:53:40 +0000 UTC" firstStartedPulling="2025-09-04 00:54:07.420153503 +0000 UTC m=+39.214263640" lastFinishedPulling="2025-09-04 00:54:12.484946153 +0000 UTC m=+44.279056291" observedRunningTime="2025-09-04 00:54:13.465874614 +0000 UTC m=+45.259984752" watchObservedRunningTime="2025-09-04 00:54:13.466319585 +0000 UTC m=+45.260429721" Sep 4 00:54:14.461093 kubelet[3260]: I0904 00:54:14.461058 3260 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 00:54:15.085902 containerd[1910]: time="2025-09-04T00:54:15.085874261Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:15.086181 containerd[1910]: time="2025-09-04T00:54:15.086067411Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 4 00:54:15.086448 containerd[1910]: time="2025-09-04T00:54:15.086434771Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:15.087248 containerd[1910]: time="2025-09-04T00:54:15.087236095Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:15.087608 containerd[1910]: time="2025-09-04T00:54:15.087596148Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 2.602573824s" Sep 4 00:54:15.087632 containerd[1910]: time="2025-09-04T00:54:15.087611729Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 4 00:54:15.088046 containerd[1910]: time="2025-09-04T00:54:15.088033451Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 4 00:54:15.091228 containerd[1910]: time="2025-09-04T00:54:15.091210640Z" level=info msg="CreateContainer within sandbox \"b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 4 00:54:15.093962 containerd[1910]: time="2025-09-04T00:54:15.093946132Z" level=info msg="Container 0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80: CDI devices from CRI Config.CDIDevices: []" Sep 4 00:54:15.096579 containerd[1910]: time="2025-09-04T00:54:15.096565486Z" level=info msg="CreateContainer within sandbox \"b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\"" Sep 4 00:54:15.096822 containerd[1910]: time="2025-09-04T00:54:15.096812799Z" level=info msg="StartContainer for \"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\"" Sep 4 00:54:15.097546 containerd[1910]: time="2025-09-04T00:54:15.097532795Z" level=info msg="connecting to shim 0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80" address="unix:///run/containerd/s/7ea0de701b85ae300cf55a58ff1c1c42e6844069f74a93a7994fd49d4c7b9ab0" protocol=ttrpc version=3 Sep 4 00:54:15.120310 systemd[1]: Started cri-containerd-0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80.scope - libcontainer container 0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80. 
Sep 4 00:54:15.149705 containerd[1910]: time="2025-09-04T00:54:15.149653417Z" level=info msg="StartContainer for \"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" returns successfully" Sep 4 00:54:15.488354 kubelet[3260]: I0904 00:54:15.488290 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-57898dccd4-4fmwz" podStartSLOduration=24.970527414 podStartE2EDuration="32.488277362s" podCreationTimestamp="2025-09-04 00:53:43 +0000 UTC" firstStartedPulling="2025-09-04 00:54:07.57023145 +0000 UTC m=+39.364341586" lastFinishedPulling="2025-09-04 00:54:15.087981397 +0000 UTC m=+46.882091534" observedRunningTime="2025-09-04 00:54:15.488182094 +0000 UTC m=+47.282292231" watchObservedRunningTime="2025-09-04 00:54:15.488277362 +0000 UTC m=+47.282387497" Sep 4 00:54:15.531183 containerd[1910]: time="2025-09-04T00:54:15.531154622Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"386f9ba02f0e63b212faca43d99fde3d2dc01b7bb18fffa1f6e012c40b697595\" pid:6350 exited_at:{seconds:1756947255 nanos:530970618}" Sep 4 00:54:16.772290 kubelet[3260]: I0904 00:54:16.772240 3260 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 00:54:16.871549 containerd[1910]: time="2025-09-04T00:54:16.871519488Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:16.871776 containerd[1910]: time="2025-09-04T00:54:16.871752242Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 4 00:54:16.872058 containerd[1910]: time="2025-09-04T00:54:16.872046465Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:16.872846 containerd[1910]: time="2025-09-04T00:54:16.872833240Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 00:54:16.873243 containerd[1910]: time="2025-09-04T00:54:16.873231764Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.785181892s" Sep 4 00:54:16.873267 containerd[1910]: time="2025-09-04T00:54:16.873248452Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 4 00:54:16.874326 containerd[1910]: time="2025-09-04T00:54:16.874285756Z" level=info msg="CreateContainer within sandbox \"98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 4 00:54:16.877104 containerd[1910]: time="2025-09-04T00:54:16.877061709Z" level=info msg="Container 975a3209810d6a145665467996674921d2f4092bba1b54ebd119a1d9907b8cf8: CDI devices from CRI Config.CDIDevices: []" Sep 4 
00:54:16.882184 containerd[1910]: time="2025-09-04T00:54:16.882157405Z" level=info msg="CreateContainer within sandbox \"98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"975a3209810d6a145665467996674921d2f4092bba1b54ebd119a1d9907b8cf8\"" Sep 4 00:54:16.882428 containerd[1910]: time="2025-09-04T00:54:16.882386118Z" level=info msg="StartContainer for \"975a3209810d6a145665467996674921d2f4092bba1b54ebd119a1d9907b8cf8\"" Sep 4 00:54:16.883149 containerd[1910]: time="2025-09-04T00:54:16.883104095Z" level=info msg="connecting to shim 975a3209810d6a145665467996674921d2f4092bba1b54ebd119a1d9907b8cf8" address="unix:///run/containerd/s/51c23b062b65ae91d6cba595466224d9e02aa57b26508d343b29da194400d421" protocol=ttrpc version=3 Sep 4 00:54:16.899413 systemd[1]: Started cri-containerd-975a3209810d6a145665467996674921d2f4092bba1b54ebd119a1d9907b8cf8.scope - libcontainer container 975a3209810d6a145665467996674921d2f4092bba1b54ebd119a1d9907b8cf8. Sep 4 00:54:16.917704 containerd[1910]: time="2025-09-04T00:54:16.917683282Z" level=info msg="StartContainer for \"975a3209810d6a145665467996674921d2f4092bba1b54ebd119a1d9907b8cf8\" returns successfully" Sep 4 00:54:17.351810 kubelet[3260]: I0904 00:54:17.351755 3260 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 4 00:54:17.352091 kubelet[3260]: I0904 00:54:17.351845 3260 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 4 00:54:17.505591 kubelet[3260]: I0904 00:54:17.505398 3260 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-89q69" podStartSLOduration=23.154301149 podStartE2EDuration="34.505359434s" podCreationTimestamp="2025-09-04 00:53:43 +0000 UTC" firstStartedPulling="2025-09-04 00:54:05.522586705 +0000 UTC m=+37.316696842" lastFinishedPulling="2025-09-04 00:54:16.873644989 +0000 UTC m=+48.667755127" observedRunningTime="2025-09-04 00:54:17.504144399 +0000 UTC m=+49.298254629" watchObservedRunningTime="2025-09-04 00:54:17.505359434 +0000 UTC m=+49.299469625" Sep 4 00:54:17.964488 containerd[1910]: time="2025-09-04T00:54:17.964463048Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"db563cb692b785dcd08e5e5a49c998b0d970d5bb5500ecd92838a4645296a594\" pid:6419 exited_at:{seconds:1756947257 nanos:964215846}" Sep 4 00:54:31.156675 containerd[1910]: time="2025-09-04T00:54:31.156616830Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"e865dea8d6e7c0dc26bd2d3ea61a796dd3cccff64dc7f700fc12cc51f5d9e89d\" pid:6470 exited_at:{seconds:1756947271 nanos:156450241}" Sep 4 00:54:34.980167 containerd[1910]: time="2025-09-04T00:54:34.980136546Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"9ada98be64e74df3d861026a1bdd6229f121b9d45c2f1cf80f0471c5c1fce52b\" pid:6496 exited_at:{seconds:1756947274 nanos:979993196}" Sep 4 00:54:35.967330 containerd[1910]: time="2025-09-04T00:54:35.967297327Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" 
id:\"103e30c35e8d6c542b48af9b072632796ac64dd33496bdb7b694eeea95449bf7\" pid:6518 exited_at:{seconds:1756947275 nanos:967078656}" Sep 4 00:54:40.633570 systemd[1]: Started sshd@10-147.28.180.77:22-43.156.45.213:56844.service - OpenSSH per-connection server daemon (43.156.45.213:56844). Sep 4 00:54:41.812321 sshd[6550]: Received disconnect from 43.156.45.213 port 56844:11: Bye Bye [preauth] Sep 4 00:54:41.812321 sshd[6550]: Disconnected from authenticating user adm 43.156.45.213 port 56844 [preauth] Sep 4 00:54:41.815834 systemd[1]: sshd@10-147.28.180.77:22-43.156.45.213:56844.service: Deactivated successfully. Sep 4 00:54:48.011140 containerd[1910]: time="2025-09-04T00:54:48.011107112Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"2a003dd707ca0794c36bcba6e44e58ae3114235969f7c3b8f665b06fac8e7d93\" pid:6568 exited_at:{seconds:1756947288 nanos:10925292}" Sep 4 00:54:51.106684 containerd[1910]: time="2025-09-04T00:54:51.106629739Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"d237a454c28e2ea9cbef9aa7bc9dfc37f5e26e23478ac7da6a9fe7209e080c6e\" pid:6603 exited_at:{seconds:1756947291 nanos:106476302}" Sep 4 00:55:01.183106 containerd[1910]: time="2025-09-04T00:55:01.183075719Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"1c69ccbbb5e36b85a4aee066705ac2faf3dd9e3978c6ad4a323a18afd6bd8085\" pid:6637 exited_at:{seconds:1756947301 nanos:182934969}" Sep 4 00:55:02.638930 systemd[1]: sshd@3-147.28.180.77:22-183.23.62.16:30582.service: Deactivated successfully. Sep 4 00:55:05.959767 containerd[1910]: time="2025-09-04T00:55:05.959741496Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"e62bdf4c1552fe7b7c7f2029b2044ba6b06a4c9d111b35afe7d8dbb686bbef22\" pid:6663 exited_at:{seconds:1756947305 nanos:959485977}" Sep 4 00:55:11.926950 systemd[1]: Started sshd@11-147.28.180.77:22-183.23.62.16:41625.service - OpenSSH per-connection server daemon (183.23.62.16:41625). 
Sep 4 00:55:18.010797 containerd[1910]: time="2025-09-04T00:55:18.010764541Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"aaed5dd295f2007d861b4991eb34dfcf4c3e3fde5fad269f1b6cd848093c4f3c\" pid:6698 exited_at:{seconds:1756947318 nanos:10486023}" Sep 4 00:55:31.133156 containerd[1910]: time="2025-09-04T00:55:31.133126892Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"fdd06ffd358a366dce9ba1b6a04dc43e0e10eb977c6c4d8d194424ebabaa97ce\" pid:6746 exited_at:{seconds:1756947331 nanos:132976180}" Sep 4 00:55:34.986747 containerd[1910]: time="2025-09-04T00:55:34.986715687Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"e196def39889407bd9513f5b409ac29bfcaea859ae5932d8bdabbb64202efa1b\" pid:6796 exited_at:{seconds:1756947334 nanos:986584607}" Sep 4 00:55:36.011808 containerd[1910]: time="2025-09-04T00:55:36.011782433Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"4125251a8c8749ce9d009c5fe603feacd84a654e6b8f6090022ee71037653f75\" pid:6817 exited_at:{seconds:1756947336 nanos:11579770}" Sep 4 00:55:48.009830 containerd[1910]: time="2025-09-04T00:55:48.009796548Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"0e70d3c0ee17349250f4af08b55bdcddc3a8e5f58d848b8df717818136aa8700\" pid:6850 exited_at:{seconds:1756947348 nanos:9552066}" Sep 4 00:55:51.050927 containerd[1910]: time="2025-09-04T00:55:51.050902007Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"53fae129411ce268332a900d4317f9b790a7fb3e02620267fc5e8207882cddb2\" pid:6885 exited_at:{seconds:1756947351 nanos:50741697}" Sep 4 00:55:55.987541 update_engine[1903]: I20250904 00:55:55.987318 1903 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Sep 4 00:55:55.987541 update_engine[1903]: I20250904 00:55:55.987413 1903 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Sep 4 00:55:55.988737 update_engine[1903]: I20250904 00:55:55.987768 1903 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Sep 4 00:55:55.988858 update_engine[1903]: I20250904 00:55:55.988749 1903 omaha_request_params.cc:62] Current group set to beta Sep 4 00:55:55.989069 update_engine[1903]: I20250904 00:55:55.988977 1903 update_attempter.cc:499] Already updated boot flags. Skipping. Sep 4 00:55:55.989069 update_engine[1903]: I20250904 00:55:55.989012 1903 update_attempter.cc:643] Scheduling an action processor start. 
Sep 4 00:55:55.989069 update_engine[1903]: I20250904 00:55:55.989048 1903 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 4 00:55:55.989454 update_engine[1903]: I20250904 00:55:55.989177 1903 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 4 00:55:55.989454 update_engine[1903]: I20250904 00:55:55.989338 1903 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 4 00:55:55.989454 update_engine[1903]: I20250904 00:55:55.989365 1903 omaha_request_action.cc:272] Request: Sep 4 00:55:55.989454 update_engine[1903]: Sep 4 00:55:55.989454 update_engine[1903]: Sep 4 00:55:55.989454 update_engine[1903]: Sep 4 00:55:55.989454 update_engine[1903]: Sep 4 00:55:55.989454 update_engine[1903]: Sep 4 00:55:55.989454 update_engine[1903]: Sep 4 00:55:55.989454 update_engine[1903]: Sep 4 00:55:55.989454 update_engine[1903]: Sep 4 00:55:55.989454 update_engine[1903]: I20250904 00:55:55.989380 1903 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 4 00:55:55.990429 locksmithd[1986]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 4 00:55:55.993066 update_engine[1903]: I20250904 00:55:55.992976 1903 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 4 00:55:55.994000 update_engine[1903]: I20250904 00:55:55.993832 1903 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 4 00:55:55.994438 update_engine[1903]: E20250904 00:55:55.994341 1903 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 4 00:55:55.994622 update_engine[1903]: I20250904 00:55:55.994516 1903 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 4 00:55:57.507794 systemd[1]: Started sshd@12-147.28.180.77:22-159.203.2.69:59250.service - OpenSSH per-connection server daemon (159.203.2.69:59250). Sep 4 00:55:57.916466 sshd[6908]: Invalid user chaichang from 159.203.2.69 port 59250 Sep 4 00:55:57.986645 sshd[6908]: Received disconnect from 159.203.2.69 port 59250:11: Bye Bye [preauth] Sep 4 00:55:57.986645 sshd[6908]: Disconnected from invalid user chaichang 159.203.2.69 port 59250 [preauth] Sep 4 00:55:57.991452 systemd[1]: sshd@12-147.28.180.77:22-159.203.2.69:59250.service: Deactivated successfully. Sep 4 00:56:01.126592 containerd[1910]: time="2025-09-04T00:56:01.126569062Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"26457214846376d73c174db22551bae49a25727dfd4a8573a8c300171119172f\" pid:6924 exited_at:{seconds:1756947361 nanos:126444174}" Sep 4 00:56:04.434969 systemd[1]: Started sshd@13-147.28.180.77:22-103.10.45.57:55568.service - OpenSSH per-connection server daemon (103.10.45.57:55568). Sep 4 00:56:05.511600 sshd[6935]: Invalid user tomcat from 103.10.45.57 port 55568 Sep 4 00:56:05.718263 sshd[6935]: Received disconnect from 103.10.45.57 port 55568:11: Bye Bye [preauth] Sep 4 00:56:05.718263 sshd[6935]: Disconnected from invalid user tomcat 103.10.45.57 port 55568 [preauth] Sep 4 00:56:05.723207 systemd[1]: sshd@13-147.28.180.77:22-103.10.45.57:55568.service: Deactivated successfully. 
Sep 4 00:56:05.928988 update_engine[1903]: I20250904 00:56:05.928927 1903 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 4 00:56:05.929243 update_engine[1903]: I20250904 00:56:05.929039 1903 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 4 00:56:05.929243 update_engine[1903]: I20250904 00:56:05.929232 1903 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 4 00:56:05.929581 update_engine[1903]: E20250904 00:56:05.929556 1903 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 4 00:56:05.929612 update_engine[1903]: I20250904 00:56:05.929589 1903 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 4 00:56:05.954952 containerd[1910]: time="2025-09-04T00:56:05.954926967Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"a4200881dfb78fb7b0276aaea6abc8b245b7b31c8513bdf430f8690cda7f6389\" pid:6954 exited_at:{seconds:1756947365 nanos:954687809}" Sep 4 00:56:12.831167 systemd[1]: Started sshd@14-147.28.180.77:22-183.23.62.16:45972.service - OpenSSH per-connection server daemon (183.23.62.16:45972). Sep 4 00:56:15.924507 update_engine[1903]: I20250904 00:56:15.924381 1903 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 4 00:56:15.925335 update_engine[1903]: I20250904 00:56:15.924932 1903 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 4 00:56:15.925686 update_engine[1903]: I20250904 00:56:15.925616 1903 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 4 00:56:15.926307 update_engine[1903]: E20250904 00:56:15.926219 1903 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 4 00:56:15.926449 update_engine[1903]: I20250904 00:56:15.926391 1903 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Sep 4 00:56:17.957898 containerd[1910]: time="2025-09-04T00:56:17.957871552Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"8eb099f2eda3b8a7aabbf2901f8977d6e59015009512be81aa3c2e157ded8f45\" pid:6990 exited_at:{seconds:1756947377 nanos:957673463}" Sep 4 00:56:25.924371 update_engine[1903]: I20250904 00:56:25.924237 1903 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 4 00:56:25.925307 update_engine[1903]: I20250904 00:56:25.924775 1903 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 4 00:56:25.925561 update_engine[1903]: I20250904 00:56:25.925465 1903 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 4 00:56:25.925853 update_engine[1903]: E20250904 00:56:25.925761 1903 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 4 00:56:25.926044 update_engine[1903]: I20250904 00:56:25.925858 1903 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 4 00:56:25.926044 update_engine[1903]: I20250904 00:56:25.925882 1903 omaha_request_action.cc:617] Omaha request response: Sep 4 00:56:25.926329 update_engine[1903]: E20250904 00:56:25.926054 1903 omaha_request_action.cc:636] Omaha request network transfer failed. Sep 4 00:56:25.926329 update_engine[1903]: I20250904 00:56:25.926098 1903 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. 
Sep 4 00:56:25.926329 update_engine[1903]: I20250904 00:56:25.926137 1903 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 4 00:56:25.926329 update_engine[1903]: I20250904 00:56:25.926157 1903 update_attempter.cc:306] Processing Done. Sep 4 00:56:25.926329 update_engine[1903]: E20250904 00:56:25.926187 1903 update_attempter.cc:619] Update failed. Sep 4 00:56:25.926329 update_engine[1903]: I20250904 00:56:25.926203 1903 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 4 00:56:25.926329 update_engine[1903]: I20250904 00:56:25.926216 1903 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 4 00:56:25.926329 update_engine[1903]: I20250904 00:56:25.926231 1903 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Sep 4 00:56:25.927004 update_engine[1903]: I20250904 00:56:25.926386 1903 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 4 00:56:25.927004 update_engine[1903]: I20250904 00:56:25.926448 1903 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 4 00:56:25.927004 update_engine[1903]: I20250904 00:56:25.926465 1903 omaha_request_action.cc:272] Request: Sep 4 00:56:25.927004 update_engine[1903]: Sep 4 00:56:25.927004 update_engine[1903]: Sep 4 00:56:25.927004 update_engine[1903]: Sep 4 00:56:25.927004 update_engine[1903]: Sep 4 00:56:25.927004 update_engine[1903]: Sep 4 00:56:25.927004 update_engine[1903]: Sep 4 00:56:25.927004 update_engine[1903]: I20250904 00:56:25.926481 1903 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 4 00:56:25.927004 update_engine[1903]: I20250904 00:56:25.926870 1903 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 4 00:56:25.927888 update_engine[1903]: I20250904 00:56:25.927413 1903 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 4 00:56:25.927888 update_engine[1903]: E20250904 00:56:25.927780 1903 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 4 00:56:25.928072 locksmithd[1986]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 4 00:56:25.928725 update_engine[1903]: I20250904 00:56:25.927901 1903 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 4 00:56:25.928725 update_engine[1903]: I20250904 00:56:25.927927 1903 omaha_request_action.cc:617] Omaha request response: Sep 4 00:56:25.928725 update_engine[1903]: I20250904 00:56:25.927944 1903 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 4 00:56:25.928725 update_engine[1903]: I20250904 00:56:25.927958 1903 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 4 00:56:25.928725 update_engine[1903]: I20250904 00:56:25.927971 1903 update_attempter.cc:306] Processing Done. Sep 4 00:56:25.928725 update_engine[1903]: I20250904 00:56:25.927987 1903 update_attempter.cc:310] Error event sent. 
Sep 4 00:56:25.928725 update_engine[1903]: I20250904 00:56:25.928009 1903 update_check_scheduler.cc:74] Next update check in 45m57s Sep 4 00:56:25.929196 locksmithd[1986]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 4 00:56:28.518397 systemd[1]: Started sshd@15-147.28.180.77:22-43.156.45.213:35292.service - OpenSSH per-connection server daemon (43.156.45.213:35292). Sep 4 00:56:29.533585 sshd[7017]: Invalid user postgres from 43.156.45.213 port 35292 Sep 4 00:56:29.720373 sshd[7017]: Received disconnect from 43.156.45.213 port 35292:11: Bye Bye [preauth] Sep 4 00:56:29.720373 sshd[7017]: Disconnected from invalid user postgres 43.156.45.213 port 35292 [preauth] Sep 4 00:56:29.725594 systemd[1]: sshd@15-147.28.180.77:22-43.156.45.213:35292.service: Deactivated successfully. Sep 4 00:56:31.133228 containerd[1910]: time="2025-09-04T00:56:31.133200584Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"6305ecac4f9f465bb22337a32ac5f92bc47d4c70b460a7dc26d081d59857af35\" pid:7033 exited_at:{seconds:1756947391 nanos:133069445}" Sep 4 00:56:34.941001 containerd[1910]: time="2025-09-04T00:56:34.940979545Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"f25955273479f7e641d3307e283f833459b0729300170821be7e83d3eb5cfd62\" pid:7057 exited_at:{seconds:1756947394 nanos:940871782}" Sep 4 00:56:35.957412 containerd[1910]: time="2025-09-04T00:56:35.957383903Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"f56dfcfcc0ea98dde8988d80366b3c3391ac296ca00ca47fc86a1cb42b6cbf8e\" pid:7078 exited_at:{seconds:1756947395 nanos:957193890}" Sep 4 00:56:47.997688 containerd[1910]: time="2025-09-04T00:56:47.997623284Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"352babe02e1bdeb844d52099bf7e35faa3bd58c29c98fcb67e8126ae3d3e02b6\" pid:7118 exited_at:{seconds:1756947407 nanos:997359344}" Sep 4 00:56:51.057947 containerd[1910]: time="2025-09-04T00:56:51.057922500Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"71fff50b82db81181fc0edf24b3c0bbdd4be1a7b9ffe301c364698256e9f62f0\" pid:7153 exited_at:{seconds:1756947411 nanos:57720786}" Sep 4 00:57:01.129277 containerd[1910]: time="2025-09-04T00:57:01.129224206Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"a0289860cc7875dcdcb70f65d296b91aabcba31b609a4826bc39f42cae1fa9c3\" pid:7188 exited_at:{seconds:1756947421 nanos:128982695}" Sep 4 00:57:05.958275 containerd[1910]: time="2025-09-04T00:57:05.958225576Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"b0ff839bfd11a28de3ea787500ef9ffa4dea3d9ab5702a3d987254b63884d5e9\" pid:7211 exited_at:{seconds:1756947425 nanos:958046245}" Sep 4 00:57:07.406852 systemd[1]: Started sshd@16-147.28.180.77:22-183.23.62.16:64919.service - OpenSSH per-connection server daemon (183.23.62.16:64919). 
Sep 4 00:57:08.315346 sshd[7234]: Invalid user vnc from 183.23.62.16 port 64919 Sep 4 00:57:08.481788 sshd[7234]: Received disconnect from 183.23.62.16 port 64919:11: Bye Bye [preauth] Sep 4 00:57:08.481788 sshd[7234]: Disconnected from invalid user vnc 183.23.62.16 port 64919 [preauth] Sep 4 00:57:08.486540 systemd[1]: sshd@16-147.28.180.77:22-183.23.62.16:64919.service: Deactivated successfully. Sep 4 00:57:12.915840 systemd[1]: sshd@11-147.28.180.77:22-183.23.62.16:41625.service: Deactivated successfully. Sep 4 00:57:17.962333 containerd[1910]: time="2025-09-04T00:57:17.962307233Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"e88d27d96de794c350bf78718fa72f5384cff2ac175528e58b52255e00d90295\" pid:7275 exited_at:{seconds:1756947437 nanos:962046224}" Sep 4 00:57:31.128661 containerd[1910]: time="2025-09-04T00:57:31.128611635Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"782a7ae9b77a63cbcfa0e54eff160adbe5b8dee04f05ded6febf55946812effb\" pid:7314 exited_at:{seconds:1756947451 nanos:128498590}" Sep 4 00:57:34.959986 containerd[1910]: time="2025-09-04T00:57:34.959952295Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"c1c4b484054dee58b8de47d56acd6289607ee49eb115568187230cf1d4acb44d\" pid:7337 exited_at:{seconds:1756947454 nanos:959773681}" Sep 4 00:57:35.955904 containerd[1910]: time="2025-09-04T00:57:35.955874533Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"cf9c0bba05a90f72c1bf7063975275132fed5099e27023ca38ee9cce4a503e24\" pid:7359 exited_at:{seconds:1756947455 nanos:955696481}" Sep 4 00:57:47.974550 containerd[1910]: time="2025-09-04T00:57:47.974521669Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"9371943524f59ec51363359171c1e75d4367551a738d3a24e4f0978716a789eb\" pid:7394 exited_at:{seconds:1756947467 nanos:974249904}" Sep 4 00:57:51.057479 containerd[1910]: time="2025-09-04T00:57:51.057416075Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"6378504f1e7c8e54ed203d880cb16b69cee46a993c4c877979eef0312dfe0390\" pid:7428 exited_at:{seconds:1756947471 nanos:57225721}" Sep 4 00:58:01.130688 containerd[1910]: time="2025-09-04T00:58:01.130658594Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"b5e1c84bcaecdebf1bf122d83dcb9db732b8a884880d7e5971ae4a05d649f122\" pid:7463 exited_at:{seconds:1756947481 nanos:130401805}" Sep 4 00:58:05.613949 systemd[1]: Started sshd@17-147.28.180.77:22-183.23.62.16:57430.service - OpenSSH per-connection server daemon (183.23.62.16:57430). 
Sep 4 00:58:05.959755 containerd[1910]: time="2025-09-04T00:58:05.959729720Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"a600f890615b460d98d5ab4b52c453f09d21083b83b24a68260679c2399df98c\" pid:7488 exited_at:{seconds:1756947485 nanos:959545241}" Sep 4 00:58:07.296777 sshd[7476]: Invalid user dudu from 183.23.62.16 port 57430 Sep 4 00:58:07.457539 sshd[7476]: Received disconnect from 183.23.62.16 port 57430:11: Bye Bye [preauth] Sep 4 00:58:07.457539 sshd[7476]: Disconnected from invalid user dudu 183.23.62.16 port 57430 [preauth] Sep 4 00:58:07.462282 systemd[1]: sshd@17-147.28.180.77:22-183.23.62.16:57430.service: Deactivated successfully. Sep 4 00:58:13.762375 systemd[1]: sshd@14-147.28.180.77:22-183.23.62.16:45972.service: Deactivated successfully. Sep 4 00:58:14.246984 systemd[1]: Started sshd@18-147.28.180.77:22-43.156.45.213:42416.service - OpenSSH per-connection server daemon (43.156.45.213:42416). Sep 4 00:58:15.469291 sshd[7517]: Received disconnect from 43.156.45.213 port 42416:11: Bye Bye [preauth] Sep 4 00:58:15.469291 sshd[7517]: Disconnected from authenticating user root 43.156.45.213 port 42416 [preauth] Sep 4 00:58:15.472995 systemd[1]: sshd@18-147.28.180.77:22-43.156.45.213:42416.service: Deactivated successfully. Sep 4 00:58:17.955149 containerd[1910]: time="2025-09-04T00:58:17.955123567Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"42e30fbad26008d46dfb8534bfd592098496eefc3cde19731aa869bc8194fb30\" pid:7532 exited_at:{seconds:1756947497 nanos:954926878}" Sep 4 00:58:24.473536 containerd[1910]: time="2025-09-04T00:58:24.473291902Z" level=warning msg="container event discarded" container=0855dc70bec7cac3b7aa3cb83db74377db5c7611db0411b78cfceaaa063aa6a7 type=CONTAINER_CREATED_EVENT Sep 4 00:58:24.473536 containerd[1910]: time="2025-09-04T00:58:24.473466954Z" level=warning msg="container event discarded" container=0855dc70bec7cac3b7aa3cb83db74377db5c7611db0411b78cfceaaa063aa6a7 type=CONTAINER_STARTED_EVENT Sep 4 00:58:24.484977 containerd[1910]: time="2025-09-04T00:58:24.484826939Z" level=warning msg="container event discarded" container=4b6e7b0cb45eb338a95475a1f4cb5770069c1c8b41bc2e5f75d65c80281e3743 type=CONTAINER_CREATED_EVENT Sep 4 00:58:24.484977 containerd[1910]: time="2025-09-04T00:58:24.484931784Z" level=warning msg="container event discarded" container=4b6e7b0cb45eb338a95475a1f4cb5770069c1c8b41bc2e5f75d65c80281e3743 type=CONTAINER_STARTED_EVENT Sep 4 00:58:24.484977 containerd[1910]: time="2025-09-04T00:58:24.484964522Z" level=warning msg="container event discarded" container=765f08fd4ef39f9f559ad7ac57022a983bc3d502e06fe8ace192733335bfa5e8 type=CONTAINER_CREATED_EVENT Sep 4 00:58:24.485390 containerd[1910]: time="2025-09-04T00:58:24.484988188Z" level=warning msg="container event discarded" container=765f08fd4ef39f9f559ad7ac57022a983bc3d502e06fe8ace192733335bfa5e8 type=CONTAINER_STARTED_EVENT Sep 4 00:58:24.485390 containerd[1910]: time="2025-09-04T00:58:24.485009428Z" level=warning msg="container event discarded" container=acbff79358a2fa2d55066b22804c9f9d7434e34c9482a28840443aa13dcd3002 type=CONTAINER_CREATED_EVENT Sep 4 00:58:24.485390 containerd[1910]: time="2025-09-04T00:58:24.485031521Z" level=warning msg="container event discarded" container=7f93fb03a407a049cd92e9fae9cdb1eb9ee291c1b641a7b01f99d35e7fe083f7 type=CONTAINER_CREATED_EVENT Sep 4 00:58:24.485390 containerd[1910]: 
time="2025-09-04T00:58:24.485052696Z" level=warning msg="container event discarded" container=fde7f4c58a43de1fb6ad21e348b8b99a387e8cf2e3239f05652dbde24d8bebee type=CONTAINER_CREATED_EVENT Sep 4 00:58:24.548524 containerd[1910]: time="2025-09-04T00:58:24.548358678Z" level=warning msg="container event discarded" container=acbff79358a2fa2d55066b22804c9f9d7434e34c9482a28840443aa13dcd3002 type=CONTAINER_STARTED_EVENT Sep 4 00:58:24.548524 containerd[1910]: time="2025-09-04T00:58:24.548464783Z" level=warning msg="container event discarded" container=7f93fb03a407a049cd92e9fae9cdb1eb9ee291c1b641a7b01f99d35e7fe083f7 type=CONTAINER_STARTED_EVENT Sep 4 00:58:24.548524 containerd[1910]: time="2025-09-04T00:58:24.548494101Z" level=warning msg="container event discarded" container=fde7f4c58a43de1fb6ad21e348b8b99a387e8cf2e3239f05652dbde24d8bebee type=CONTAINER_STARTED_EVENT Sep 4 00:58:31.180926 containerd[1910]: time="2025-09-04T00:58:31.180890270Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"e5f92477dac0eb14a5d69f15e269fa34b4d83d840ef9ed71cfc80b56aaac9206\" pid:7570 exited_at:{seconds:1756947511 nanos:180676585}" Sep 4 00:58:34.089537 containerd[1910]: time="2025-09-04T00:58:34.089357210Z" level=warning msg="container event discarded" container=eaab69eef48a8c1283dc5df5c30dbd8f3aedc61c10d392b04d591d99eaabb96e type=CONTAINER_CREATED_EVENT Sep 4 00:58:34.089537 containerd[1910]: time="2025-09-04T00:58:34.089480143Z" level=warning msg="container event discarded" container=eaab69eef48a8c1283dc5df5c30dbd8f3aedc61c10d392b04d591d99eaabb96e type=CONTAINER_STARTED_EVENT Sep 4 00:58:34.451091 containerd[1910]: time="2025-09-04T00:58:34.450934717Z" level=warning msg="container event discarded" container=f93b5ac21c42ac796808befb499e4da395bcd55765e672c12a31128b2a2eddd7 type=CONTAINER_CREATED_EVENT Sep 4 00:58:34.499437 containerd[1910]: time="2025-09-04T00:58:34.499338173Z" level=warning msg="container event discarded" container=f93b5ac21c42ac796808befb499e4da395bcd55765e672c12a31128b2a2eddd7 type=CONTAINER_STARTED_EVENT Sep 4 00:58:34.499437 containerd[1910]: time="2025-09-04T00:58:34.499427565Z" level=warning msg="container event discarded" container=77705c20a9d0f1e11f8f58126bd7b1b72ccd517cad42947808caccadefb21ea9 type=CONTAINER_CREATED_EVENT Sep 4 00:58:34.499816 containerd[1910]: time="2025-09-04T00:58:34.499459009Z" level=warning msg="container event discarded" container=77705c20a9d0f1e11f8f58126bd7b1b72ccd517cad42947808caccadefb21ea9 type=CONTAINER_STARTED_EVENT Sep 4 00:58:34.949348 containerd[1910]: time="2025-09-04T00:58:34.949315400Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"a44514c3fc83b81090b82859739c9bf5746632066c01b7d18c8c6d14e99cfefc\" pid:7593 exited_at:{seconds:1756947514 nanos:949143239}" Sep 4 00:58:35.957059 containerd[1910]: time="2025-09-04T00:58:35.957033878Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"e57f2b7334bc34210e2d2b593f36d53745e026f88b4534489dbca9b3430e6812\" pid:7616 exited_at:{seconds:1756947515 nanos:956838214}" Sep 4 00:58:36.101424 containerd[1910]: time="2025-09-04T00:58:36.101252101Z" level=warning msg="container event discarded" container=ca5eaa748ca638e8e3b99cd795418b474c45e910187cfe78ea4745c09500c5b3 type=CONTAINER_CREATED_EVENT Sep 4 00:58:36.139053 containerd[1910]: 
time="2025-09-04T00:58:36.138892971Z" level=warning msg="container event discarded" container=ca5eaa748ca638e8e3b99cd795418b474c45e910187cfe78ea4745c09500c5b3 type=CONTAINER_STARTED_EVENT Sep 4 00:58:43.041562 containerd[1910]: time="2025-09-04T00:58:43.041400660Z" level=warning msg="container event discarded" container=81cc13c06073977d99b6a85fd839300264efb6c652b0a9dcdb6080c12e8fa697 type=CONTAINER_CREATED_EVENT Sep 4 00:58:43.041562 containerd[1910]: time="2025-09-04T00:58:43.041516259Z" level=warning msg="container event discarded" container=81cc13c06073977d99b6a85fd839300264efb6c652b0a9dcdb6080c12e8fa697 type=CONTAINER_STARTED_EVENT Sep 4 00:58:43.353505 containerd[1910]: time="2025-09-04T00:58:43.353272632Z" level=warning msg="container event discarded" container=1d048d7d9d95cc09650145adbc34bd2daf998b076912fcbb48dc360ca3840b15 type=CONTAINER_CREATED_EVENT Sep 4 00:58:43.353505 containerd[1910]: time="2025-09-04T00:58:43.353361276Z" level=warning msg="container event discarded" container=1d048d7d9d95cc09650145adbc34bd2daf998b076912fcbb48dc360ca3840b15 type=CONTAINER_STARTED_EVENT Sep 4 00:58:45.270026 containerd[1910]: time="2025-09-04T00:58:45.269889996Z" level=warning msg="container event discarded" container=217c720ad862cdff0b964984c0373e9ce2bd6caa4606d5668fae24c8608d9e9b type=CONTAINER_CREATED_EVENT Sep 4 00:58:45.320524 containerd[1910]: time="2025-09-04T00:58:45.320368069Z" level=warning msg="container event discarded" container=217c720ad862cdff0b964984c0373e9ce2bd6caa4606d5668fae24c8608d9e9b type=CONTAINER_STARTED_EVENT Sep 4 00:58:46.834998 containerd[1910]: time="2025-09-04T00:58:46.834871901Z" level=warning msg="container event discarded" container=4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc type=CONTAINER_CREATED_EVENT Sep 4 00:58:46.868454 containerd[1910]: time="2025-09-04T00:58:46.868304877Z" level=warning msg="container event discarded" container=4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc type=CONTAINER_STARTED_EVENT Sep 4 00:58:47.812510 containerd[1910]: time="2025-09-04T00:58:47.812398615Z" level=warning msg="container event discarded" container=4b7bc343479756ed4ad791a53340cc055c966dc684aeb39d3da771b0912437cc type=CONTAINER_STOPPED_EVENT Sep 4 00:58:47.969592 containerd[1910]: time="2025-09-04T00:58:47.969540334Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"b23d50a3c99fade204b2418f98b8090a67d953f2561307d1a303fa206a292eb9\" pid:7669 exited_at:{seconds:1756947527 nanos:969321292}" Sep 4 00:58:51.056216 containerd[1910]: time="2025-09-04T00:58:51.056188995Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"a7bd583d14c68274703e567edc7dbf6bc3f31407f429cf4f2c570103e0d493cf\" pid:7711 exited_at:{seconds:1756947531 nanos:56001094}" Sep 4 00:58:51.438556 containerd[1910]: time="2025-09-04T00:58:51.438401513Z" level=warning msg="container event discarded" container=dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f type=CONTAINER_CREATED_EVENT Sep 4 00:58:51.510193 containerd[1910]: time="2025-09-04T00:58:51.510039581Z" level=warning msg="container event discarded" container=dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f type=CONTAINER_STARTED_EVENT Sep 4 00:58:52.534475 containerd[1910]: time="2025-09-04T00:58:52.534343892Z" level=warning msg="container event discarded" 
container=dd3967bcbf8d94d6d055293135df6a7170dacbb2163108cf8c9384dd3ef33e8f type=CONTAINER_STOPPED_EVENT Sep 4 00:58:56.701435 systemd[1]: Started sshd@19-147.28.180.77:22-103.10.45.57:40128.service - OpenSSH per-connection server daemon (103.10.45.57:40128). Sep 4 00:58:57.847072 sshd[7733]: Invalid user larissa from 103.10.45.57 port 40128 Sep 4 00:58:58.062342 sshd[7733]: Received disconnect from 103.10.45.57 port 40128:11: Bye Bye [preauth] Sep 4 00:58:58.062342 sshd[7733]: Disconnected from invalid user larissa 103.10.45.57 port 40128 [preauth] Sep 4 00:58:58.065767 systemd[1]: sshd@19-147.28.180.77:22-103.10.45.57:40128.service: Deactivated successfully. Sep 4 00:58:58.350106 containerd[1910]: time="2025-09-04T00:58:58.349974190Z" level=warning msg="container event discarded" container=4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad type=CONTAINER_CREATED_EVENT Sep 4 00:58:58.396518 containerd[1910]: time="2025-09-04T00:58:58.396415501Z" level=warning msg="container event discarded" container=4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad type=CONTAINER_STARTED_EVENT Sep 4 00:58:59.879473 containerd[1910]: time="2025-09-04T00:58:59.879358393Z" level=warning msg="container event discarded" container=1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635 type=CONTAINER_CREATED_EVENT Sep 4 00:58:59.879473 containerd[1910]: time="2025-09-04T00:58:59.879457369Z" level=warning msg="container event discarded" container=1477b22787c77dd488ce991cfedfbc073cc24b9ed90df816611845c1a7e4e635 type=CONTAINER_STARTED_EVENT Sep 4 00:59:01.166592 containerd[1910]: time="2025-09-04T00:59:01.166557458Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"4fb05a103fcfb0162015af3a8561186a922b0ade162f8e30955da16a54e78b65\" pid:7748 exited_at:{seconds:1756947541 nanos:166396622}" Sep 4 00:59:01.402984 containerd[1910]: time="2025-09-04T00:59:01.402877511Z" level=warning msg="container event discarded" container=d4347902762c6dd0b0ca62c2c884d35d352eaf74ce2ba798b0ab7b59d3c02ce4 type=CONTAINER_CREATED_EVENT Sep 4 00:59:01.460629 containerd[1910]: time="2025-09-04T00:59:01.460319686Z" level=warning msg="container event discarded" container=d4347902762c6dd0b0ca62c2c884d35d352eaf74ce2ba798b0ab7b59d3c02ce4 type=CONTAINER_STARTED_EVENT Sep 4 00:59:03.762474 containerd[1910]: time="2025-09-04T00:59:03.762346675Z" level=warning msg="container event discarded" container=a9c015e949fe233661c3c7eb254b1803b2c29dd2aa8bc52bb7102db3d897dc8f type=CONTAINER_CREATED_EVENT Sep 4 00:59:03.882086 containerd[1910]: time="2025-09-04T00:59:03.881940388Z" level=warning msg="container event discarded" container=a9c015e949fe233661c3c7eb254b1803b2c29dd2aa8bc52bb7102db3d897dc8f type=CONTAINER_STARTED_EVENT Sep 4 00:59:04.425558 containerd[1910]: time="2025-09-04T00:59:04.425366173Z" level=warning msg="container event discarded" container=77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a type=CONTAINER_CREATED_EVENT Sep 4 00:59:04.425558 containerd[1910]: time="2025-09-04T00:59:04.425496339Z" level=warning msg="container event discarded" container=77aab746d0f6d2267512dd98362cb0deb3d404f7a1ec6479cc9555cdd2f72f8a type=CONTAINER_STARTED_EVENT Sep 4 00:59:04.425558 containerd[1910]: time="2025-09-04T00:59:04.425523759Z" level=warning msg="container event discarded" container=7e6d1c4ee3e37c27f3010f05a67820e9f09bc69012ee6907436902d5c8386c22 type=CONTAINER_CREATED_EVENT Sep 4 00:59:04.463019 
containerd[1910]: time="2025-09-04T00:59:04.462847933Z" level=warning msg="container event discarded" container=7e6d1c4ee3e37c27f3010f05a67820e9f09bc69012ee6907436902d5c8386c22 type=CONTAINER_STARTED_EVENT Sep 4 00:59:04.527545 containerd[1910]: time="2025-09-04T00:59:04.527365008Z" level=warning msg="container event discarded" container=db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88 type=CONTAINER_CREATED_EVENT Sep 4 00:59:04.527545 containerd[1910]: time="2025-09-04T00:59:04.527468553Z" level=warning msg="container event discarded" container=db7b023381536b6699bfc363a312cbc1e32623e76f8a416929887b14dd282f88 type=CONTAINER_STARTED_EVENT Sep 4 00:59:04.527545 containerd[1910]: time="2025-09-04T00:59:04.527500577Z" level=warning msg="container event discarded" container=f4b492210bfebd9a3d0e0335a595feda1b3b341e9933cff3bdb68e9772ef93dc type=CONTAINER_CREATED_EVENT Sep 4 00:59:04.573929 containerd[1910]: time="2025-09-04T00:59:04.573764996Z" level=warning msg="container event discarded" container=f4b492210bfebd9a3d0e0335a595feda1b3b341e9933cff3bdb68e9772ef93dc type=CONTAINER_STARTED_EVENT Sep 4 00:59:05.491717 containerd[1910]: time="2025-09-04T00:59:05.491569412Z" level=warning msg="container event discarded" container=e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242 type=CONTAINER_CREATED_EVENT Sep 4 00:59:05.491717 containerd[1910]: time="2025-09-04T00:59:05.491661779Z" level=warning msg="container event discarded" container=e02ed465f3f05e5460cef43b703047fca6262eb5c30da9a6551b66f3c093b242 type=CONTAINER_STARTED_EVENT Sep 4 00:59:05.533207 containerd[1910]: time="2025-09-04T00:59:05.533038391Z" level=warning msg="container event discarded" container=98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f type=CONTAINER_CREATED_EVENT Sep 4 00:59:05.533207 containerd[1910]: time="2025-09-04T00:59:05.533169498Z" level=warning msg="container event discarded" container=98ebf753033136f9ff38e9e4285dfaa39ccaa3e0aa191d4bec84de7a24288c9f type=CONTAINER_STARTED_EVENT Sep 4 00:59:05.954600 containerd[1910]: time="2025-09-04T00:59:05.954575837Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"a68ca312ee9aedc3545116d38fce16f4b976845ab7ea42254e26760941ae04e8\" pid:7771 exited_at:{seconds:1756947545 nanos:954373773}" Sep 4 00:59:06.046363 systemd[1]: Started sshd@20-147.28.180.77:22-183.23.62.16:26849.service - OpenSSH per-connection server daemon (183.23.62.16:26849). 
Sep 4 00:59:06.444395 containerd[1910]: time="2025-09-04T00:59:06.444259165Z" level=warning msg="container event discarded" container=a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1 type=CONTAINER_CREATED_EVENT Sep 4 00:59:06.444395 containerd[1910]: time="2025-09-04T00:59:06.444339532Z" level=warning msg="container event discarded" container=a1ccedd0c487b67e202a071316c2a462d21859da70685d9e47a932f8f7474dd1 type=CONTAINER_STARTED_EVENT Sep 4 00:59:07.430458 containerd[1910]: time="2025-09-04T00:59:07.430312458Z" level=warning msg="container event discarded" container=23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c type=CONTAINER_CREATED_EVENT Sep 4 00:59:07.430458 containerd[1910]: time="2025-09-04T00:59:07.430395668Z" level=warning msg="container event discarded" container=23577f4bb9985effd1e7c7a38f0972b07b909e010179c1c997b5eee60562637c type=CONTAINER_STARTED_EVENT Sep 4 00:59:07.579932 containerd[1910]: time="2025-09-04T00:59:07.579782110Z" level=warning msg="container event discarded" container=b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb type=CONTAINER_CREATED_EVENT Sep 4 00:59:07.579932 containerd[1910]: time="2025-09-04T00:59:07.579882732Z" level=warning msg="container event discarded" container=b138cc9887b84557389d783bb656e16e0b5681127ccba10f93273ad5a182c4cb type=CONTAINER_STARTED_EVENT Sep 4 00:59:08.027054 containerd[1910]: time="2025-09-04T00:59:08.026911713Z" level=warning msg="container event discarded" container=2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c type=CONTAINER_CREATED_EVENT Sep 4 00:59:08.081497 containerd[1910]: time="2025-09-04T00:59:08.081342510Z" level=warning msg="container event discarded" container=2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c type=CONTAINER_STARTED_EVENT Sep 4 00:59:09.533339 containerd[1910]: time="2025-09-04T00:59:09.533196777Z" level=warning msg="container event discarded" container=c154c351bf5c3559c6b19d21a2df96750d3b6bc7268071d6a230e55e5b7bc389 type=CONTAINER_CREATED_EVENT Sep 4 00:59:09.571578 containerd[1910]: time="2025-09-04T00:59:09.571523404Z" level=warning msg="container event discarded" container=c154c351bf5c3559c6b19d21a2df96750d3b6bc7268071d6a230e55e5b7bc389 type=CONTAINER_STARTED_EVENT Sep 4 00:59:12.029408 containerd[1910]: time="2025-09-04T00:59:12.029259615Z" level=warning msg="container event discarded" container=0e64006b6239388908b5108a8c45bb65543839f65de56d302a79b2a401d35a1f type=CONTAINER_CREATED_EVENT Sep 4 00:59:12.090942 containerd[1910]: time="2025-09-04T00:59:12.090801056Z" level=warning msg="container event discarded" container=0e64006b6239388908b5108a8c45bb65543839f65de56d302a79b2a401d35a1f type=CONTAINER_STARTED_EVENT Sep 4 00:59:12.501375 containerd[1910]: time="2025-09-04T00:59:12.501230455Z" level=warning msg="container event discarded" container=f922a85994497fa190e19e3151946b32c0dec9b42f1d73efb56568c8ea013ba2 type=CONTAINER_CREATED_EVENT Sep 4 00:59:12.557718 containerd[1910]: time="2025-09-04T00:59:12.557653891Z" level=warning msg="container event discarded" container=f922a85994497fa190e19e3151946b32c0dec9b42f1d73efb56568c8ea013ba2 type=CONTAINER_STARTED_EVENT Sep 4 00:59:15.106601 containerd[1910]: time="2025-09-04T00:59:15.106438611Z" level=warning msg="container event discarded" container=0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80 type=CONTAINER_CREATED_EVENT Sep 4 00:59:15.160033 containerd[1910]: time="2025-09-04T00:59:15.159892284Z" level=warning msg="container event discarded" 
container=0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80 type=CONTAINER_STARTED_EVENT Sep 4 00:59:16.892283 containerd[1910]: time="2025-09-04T00:59:16.892102361Z" level=warning msg="container event discarded" container=975a3209810d6a145665467996674921d2f4092bba1b54ebd119a1d9907b8cf8 type=CONTAINER_CREATED_EVENT Sep 4 00:59:16.927740 containerd[1910]: time="2025-09-04T00:59:16.927574335Z" level=warning msg="container event discarded" container=975a3209810d6a145665467996674921d2f4092bba1b54ebd119a1d9907b8cf8 type=CONTAINER_STARTED_EVENT Sep 4 00:59:17.954336 containerd[1910]: time="2025-09-04T00:59:17.954313806Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"9d2eb095863c2da2ec374927d6cc76db341755be8b09235b6cd27ed80cd4a530\" pid:7805 exited_at:{seconds:1756947557 nanos:954123420}" Sep 4 00:59:31.135523 containerd[1910]: time="2025-09-04T00:59:31.135488505Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"b0942f1c661e0bfa86d8bfe111db43c61d641535bb138a60ce9fc79a3f464e29\" pid:7849 exited_at:{seconds:1756947571 nanos:135325853}" Sep 4 00:59:34.990417 containerd[1910]: time="2025-09-04T00:59:34.990388426Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"c376aec7414ba0bcd3a514139de3a2a33f80d9effc2de02cf0eea91fb2b9a8f1\" pid:7873 exited_at:{seconds:1756947574 nanos:990224549}" Sep 4 00:59:36.011796 containerd[1910]: time="2025-09-04T00:59:36.011768888Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"c5d5127305b13510b20022ba6d88e2145e25712cf6228f5b74710288f76949a3\" pid:7894 exited_at:{seconds:1756947576 nanos:11573279}" Sep 4 00:59:48.010109 containerd[1910]: time="2025-09-04T00:59:48.010052811Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"c9eed0bb4787a306b218776c1b49aca11cc12db7440d2f443a89482e2c11e827\" pid:7927 exited_at:{seconds:1756947588 nanos:9819512}" Sep 4 00:59:51.054734 containerd[1910]: time="2025-09-04T00:59:51.054681016Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"e90f7bdd1836aa4e56f81c12e0c8893bf9ec8782965a28c1662f6e4a7464b65c\" pid:7962 exited_at:{seconds:1756947591 nanos:54491433}" Sep 4 01:00:01.195051 containerd[1910]: time="2025-09-04T01:00:01.195022900Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"ac784397846d1199f09bbea2a264d2ca89fdcce5da8c0e1d46b92f53e9323ede\" pid:7997 exited_at:{seconds:1756947601 nanos:194885714}" Sep 4 01:00:05.950624 containerd[1910]: time="2025-09-04T01:00:05.950561655Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"fd46808f704a19dbfa5736f221bf0f651b889ef7dfd475995283d865a36257a9\" pid:8022 exited_at:{seconds:1756947605 nanos:950305561}" Sep 4 01:00:07.183483 systemd[1]: Started sshd@21-147.28.180.77:22-183.23.62.16:60444.service - OpenSSH per-connection server daemon (183.23.62.16:60444). 
Sep 4 01:00:16.197939 systemd[1]: Started sshd@22-147.28.180.77:22-103.10.45.57:41104.service - OpenSSH per-connection server daemon (103.10.45.57:41104). Sep 4 01:00:17.357779 sshd[8046]: Invalid user scadaadmin from 103.10.45.57 port 41104 Sep 4 01:00:17.578768 sshd[8046]: Received disconnect from 103.10.45.57 port 41104:11: Bye Bye [preauth] Sep 4 01:00:17.578768 sshd[8046]: Disconnected from invalid user scadaadmin 103.10.45.57 port 41104 [preauth] Sep 4 01:00:17.583599 systemd[1]: sshd@22-147.28.180.77:22-103.10.45.57:41104.service: Deactivated successfully. Sep 4 01:00:17.949861 containerd[1910]: time="2025-09-04T01:00:17.949823188Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"9b679cb85b8c8247df7fead59f94d43c5268a00787c21c5bf6b30d85f872cfd1\" pid:8062 exited_at:{seconds:1756947617 nanos:949657248}" Sep 4 01:00:18.150352 systemd[1]: Started sshd@23-147.28.180.77:22-147.75.109.163:59606.service - OpenSSH per-connection server daemon (147.75.109.163:59606). Sep 4 01:00:18.240950 sshd[8091]: Accepted publickey for core from 147.75.109.163 port 59606 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 01:00:18.244299 sshd-session[8091]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 01:00:18.258104 systemd-logind[1901]: New session 12 of user core. Sep 4 01:00:18.277589 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 4 01:00:18.420946 sshd[8093]: Connection closed by 147.75.109.163 port 59606 Sep 4 01:00:18.421137 sshd-session[8091]: pam_unix(sshd:session): session closed for user core Sep 4 01:00:18.423070 systemd[1]: sshd@23-147.28.180.77:22-147.75.109.163:59606.service: Deactivated successfully. Sep 4 01:00:18.424178 systemd[1]: session-12.scope: Deactivated successfully. Sep 4 01:00:18.425021 systemd-logind[1901]: Session 12 logged out. Waiting for processes to exit. Sep 4 01:00:18.425842 systemd-logind[1901]: Removed session 12. Sep 4 01:00:23.448904 systemd[1]: Started sshd@24-147.28.180.77:22-147.75.109.163:33226.service - OpenSSH per-connection server daemon (147.75.109.163:33226). Sep 4 01:00:23.514654 sshd[8143]: Accepted publickey for core from 147.75.109.163 port 33226 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 01:00:23.515857 sshd-session[8143]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 01:00:23.520630 systemd-logind[1901]: New session 13 of user core. Sep 4 01:00:23.530291 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 4 01:00:23.614083 sshd[8145]: Connection closed by 147.75.109.163 port 33226 Sep 4 01:00:23.614260 sshd-session[8143]: pam_unix(sshd:session): session closed for user core Sep 4 01:00:23.616336 systemd[1]: sshd@24-147.28.180.77:22-147.75.109.163:33226.service: Deactivated successfully. Sep 4 01:00:23.617232 systemd[1]: session-13.scope: Deactivated successfully. Sep 4 01:00:23.617651 systemd-logind[1901]: Session 13 logged out. Waiting for processes to exit. Sep 4 01:00:23.618211 systemd-logind[1901]: Removed session 13. Sep 4 01:00:28.642093 systemd[1]: Started sshd@25-147.28.180.77:22-147.75.109.163:33230.service - OpenSSH per-connection server daemon (147.75.109.163:33230). 
Sep 4 01:00:28.726506 sshd[8180]: Accepted publickey for core from 147.75.109.163 port 33230 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 01:00:28.727409 sshd-session[8180]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 01:00:28.731160 systemd-logind[1901]: New session 14 of user core. Sep 4 01:00:28.743702 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 4 01:00:28.839093 sshd[8182]: Connection closed by 147.75.109.163 port 33230 Sep 4 01:00:28.839274 sshd-session[8180]: pam_unix(sshd:session): session closed for user core Sep 4 01:00:28.841271 systemd[1]: sshd@25-147.28.180.77:22-147.75.109.163:33230.service: Deactivated successfully. Sep 4 01:00:28.842523 systemd[1]: session-14.scope: Deactivated successfully. Sep 4 01:00:28.843456 systemd-logind[1901]: Session 14 logged out. Waiting for processes to exit. Sep 4 01:00:28.844189 systemd-logind[1901]: Removed session 14. Sep 4 01:00:31.127649 containerd[1910]: time="2025-09-04T01:00:31.127626535Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"4656d3e50be84a77aa6a36b0ffbab7831a99c99f2e47265e5243a30697ddbe83\" pid:8218 exited_at:{seconds:1756947631 nanos:127521815}" Sep 4 01:00:33.862311 systemd[1]: Started sshd@26-147.28.180.77:22-147.75.109.163:58010.service - OpenSSH per-connection server daemon (147.75.109.163:58010). Sep 4 01:00:33.912621 sshd[8229]: Accepted publickey for core from 147.75.109.163 port 58010 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 01:00:33.913400 sshd-session[8229]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 01:00:33.916750 systemd-logind[1901]: New session 15 of user core. Sep 4 01:00:33.928595 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 4 01:00:34.022121 sshd[8231]: Connection closed by 147.75.109.163 port 58010 Sep 4 01:00:34.022323 sshd-session[8229]: pam_unix(sshd:session): session closed for user core Sep 4 01:00:34.024148 systemd[1]: sshd@26-147.28.180.77:22-147.75.109.163:58010.service: Deactivated successfully. Sep 4 01:00:34.025193 systemd[1]: session-15.scope: Deactivated successfully. Sep 4 01:00:34.025943 systemd-logind[1901]: Session 15 logged out. Waiting for processes to exit. Sep 4 01:00:34.026798 systemd-logind[1901]: Removed session 15. Sep 4 01:00:34.945899 containerd[1910]: time="2025-09-04T01:00:34.945873164Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"ea7fd6495ccba2fa9c764ee934bfcdbff1bbdb31517665235e1fd97ce74623d5\" pid:8269 exited_at:{seconds:1756947634 nanos:945750316}" Sep 4 01:00:35.968886 containerd[1910]: time="2025-09-04T01:00:35.968859288Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"9a3adaac52c7aa5654198c03e02979b0c863071b00ca376176e4e6a0c23fa04c\" pid:8290 exited_at:{seconds:1756947635 nanos:968680133}" Sep 4 01:00:39.048321 systemd[1]: Started sshd@27-147.28.180.77:22-147.75.109.163:58014.service - OpenSSH per-connection server daemon (147.75.109.163:58014). 
Sep 4 01:00:39.092036 sshd[8314]: Accepted publickey for core from 147.75.109.163 port 58014 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 01:00:39.092765 sshd-session[8314]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 01:00:39.095893 systemd-logind[1901]: New session 16 of user core. Sep 4 01:00:39.106266 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 4 01:00:39.196312 sshd[8316]: Connection closed by 147.75.109.163 port 58014 Sep 4 01:00:39.196492 sshd-session[8314]: pam_unix(sshd:session): session closed for user core Sep 4 01:00:39.209280 systemd[1]: sshd@27-147.28.180.77:22-147.75.109.163:58014.service: Deactivated successfully. Sep 4 01:00:39.210105 systemd[1]: session-16.scope: Deactivated successfully. Sep 4 01:00:39.210590 systemd-logind[1901]: Session 16 logged out. Waiting for processes to exit. Sep 4 01:00:39.211840 systemd[1]: Started sshd@28-147.28.180.77:22-147.75.109.163:58018.service - OpenSSH per-connection server daemon (147.75.109.163:58018). Sep 4 01:00:39.212217 systemd-logind[1901]: Removed session 16. Sep 4 01:00:39.244136 sshd[8341]: Accepted publickey for core from 147.75.109.163 port 58018 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 01:00:39.244810 sshd-session[8341]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 01:00:39.247869 systemd-logind[1901]: New session 17 of user core. Sep 4 01:00:39.256316 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 4 01:00:39.355335 sshd[8343]: Connection closed by 147.75.109.163 port 58018 Sep 4 01:00:39.355457 sshd-session[8341]: pam_unix(sshd:session): session closed for user core Sep 4 01:00:39.366493 systemd[1]: sshd@28-147.28.180.77:22-147.75.109.163:58018.service: Deactivated successfully. Sep 4 01:00:39.367471 systemd[1]: session-17.scope: Deactivated successfully. Sep 4 01:00:39.367858 systemd-logind[1901]: Session 17 logged out. Waiting for processes to exit. Sep 4 01:00:39.369239 systemd[1]: Started sshd@29-147.28.180.77:22-147.75.109.163:58022.service - OpenSSH per-connection server daemon (147.75.109.163:58022). Sep 4 01:00:39.369761 systemd-logind[1901]: Removed session 17. Sep 4 01:00:39.400453 sshd[8366]: Accepted publickey for core from 147.75.109.163 port 58022 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 01:00:39.401062 sshd-session[8366]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 01:00:39.403719 systemd-logind[1901]: New session 18 of user core. Sep 4 01:00:39.422378 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 4 01:00:39.510175 sshd[8368]: Connection closed by 147.75.109.163 port 58022 Sep 4 01:00:39.510374 sshd-session[8366]: pam_unix(sshd:session): session closed for user core Sep 4 01:00:39.512449 systemd[1]: sshd@29-147.28.180.77:22-147.75.109.163:58022.service: Deactivated successfully. Sep 4 01:00:39.513441 systemd[1]: session-18.scope: Deactivated successfully. Sep 4 01:00:39.513904 systemd-logind[1901]: Session 18 logged out. Waiting for processes to exit. Sep 4 01:00:39.514672 systemd-logind[1901]: Removed session 18. Sep 4 01:00:44.536245 systemd[1]: Started sshd@30-147.28.180.77:22-147.75.109.163:52306.service - OpenSSH per-connection server daemon (147.75.109.163:52306). 
Sep 4 01:00:44.580779 sshd[8393]: Accepted publickey for core from 147.75.109.163 port 52306 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 01:00:44.581555 sshd-session[8393]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 01:00:44.584847 systemd-logind[1901]: New session 19 of user core. Sep 4 01:00:44.598360 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 4 01:00:44.687489 sshd[8395]: Connection closed by 147.75.109.163 port 52306 Sep 4 01:00:44.687682 sshd-session[8393]: pam_unix(sshd:session): session closed for user core Sep 4 01:00:44.689692 systemd[1]: sshd@30-147.28.180.77:22-147.75.109.163:52306.service: Deactivated successfully. Sep 4 01:00:44.690790 systemd[1]: session-19.scope: Deactivated successfully. Sep 4 01:00:44.691644 systemd-logind[1901]: Session 19 logged out. Waiting for processes to exit. Sep 4 01:00:44.692297 systemd-logind[1901]: Removed session 19. Sep 4 01:00:47.972769 containerd[1910]: time="2025-09-04T01:00:47.972737227Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"b896d4e8877ca39a79d335deda4d57159031c979e334d4f4bf56f5071eb26d60\" pid:8436 exit_status:1 exited_at:{seconds:1756947647 nanos:972520476}" Sep 4 01:00:49.704106 systemd[1]: Started sshd@31-147.28.180.77:22-147.75.109.163:52318.service - OpenSSH per-connection server daemon (147.75.109.163:52318). Sep 4 01:00:49.720383 systemd[1]: Started sshd@32-147.28.180.77:22-159.203.2.69:45268.service - OpenSSH per-connection server daemon (159.203.2.69:45268). Sep 4 01:00:49.739804 sshd[8460]: Accepted publickey for core from 147.75.109.163 port 52318 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM Sep 4 01:00:49.740727 sshd-session[8460]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 01:00:49.744042 systemd-logind[1901]: New session 20 of user core. Sep 4 01:00:49.744925 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 4 01:00:49.834900 sshd[8465]: Connection closed by 147.75.109.163 port 52318 Sep 4 01:00:49.835069 sshd-session[8460]: pam_unix(sshd:session): session closed for user core Sep 4 01:00:49.836774 systemd[1]: sshd@31-147.28.180.77:22-147.75.109.163:52318.service: Deactivated successfully. Sep 4 01:00:49.837757 systemd[1]: session-20.scope: Deactivated successfully. Sep 4 01:00:49.838472 systemd-logind[1901]: Session 20 logged out. Waiting for processes to exit. Sep 4 01:00:49.839159 systemd-logind[1901]: Removed session 20. Sep 4 01:00:50.090463 sshd[8463]: Invalid user www-data from 159.203.2.69 port 45268 Sep 4 01:00:50.148373 sshd[8463]: Received disconnect from 159.203.2.69 port 45268:11: Bye Bye [preauth] Sep 4 01:00:50.148373 sshd[8463]: Disconnected from invalid user www-data 159.203.2.69 port 45268 [preauth] Sep 4 01:00:50.150296 systemd[1]: sshd@32-147.28.180.77:22-159.203.2.69:45268.service: Deactivated successfully. Sep 4 01:00:51.069420 containerd[1910]: time="2025-09-04T01:00:51.069390685Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"0c34691177e32ce7ef4aa6364f6eda101e272139180b382e6533a8d368457ffd\" pid:8502 exited_at:{seconds:1756947651 nanos:69201052}" Sep 4 01:00:54.853537 systemd[1]: Started sshd@33-147.28.180.77:22-147.75.109.163:49370.service - OpenSSH per-connection server daemon (147.75.109.163:49370). 
Sep 4 01:00:54.885056 sshd[8526]: Accepted publickey for core from 147.75.109.163 port 49370 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:00:54.888441 sshd-session[8526]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:00:54.901496 systemd-logind[1901]: New session 21 of user core.
Sep 4 01:00:54.915559 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 4 01:00:55.012755 sshd[8528]: Connection closed by 147.75.109.163 port 49370
Sep 4 01:00:55.012942 sshd-session[8526]: pam_unix(sshd:session): session closed for user core
Sep 4 01:00:55.014661 systemd[1]: sshd@33-147.28.180.77:22-147.75.109.163:49370.service: Deactivated successfully.
Sep 4 01:00:55.015637 systemd[1]: session-21.scope: Deactivated successfully.
Sep 4 01:00:55.016315 systemd-logind[1901]: Session 21 logged out. Waiting for processes to exit.
Sep 4 01:00:55.016971 systemd-logind[1901]: Removed session 21.
Sep 4 01:01:00.039274 systemd[1]: Started sshd@34-147.28.180.77:22-147.75.109.163:50262.service - OpenSSH per-connection server daemon (147.75.109.163:50262).
Sep 4 01:01:00.093514 sshd[8553]: Accepted publickey for core from 147.75.109.163 port 50262 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:01:00.094395 sshd-session[8553]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:01:00.097831 systemd-logind[1901]: New session 22 of user core.
Sep 4 01:01:00.104385 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 4 01:01:00.195059 sshd[8555]: Connection closed by 147.75.109.163 port 50262
Sep 4 01:01:00.195239 sshd-session[8553]: pam_unix(sshd:session): session closed for user core
Sep 4 01:01:00.212514 systemd[1]: sshd@34-147.28.180.77:22-147.75.109.163:50262.service: Deactivated successfully.
Sep 4 01:01:00.216956 systemd[1]: session-22.scope: Deactivated successfully.
Sep 4 01:01:00.219339 systemd-logind[1901]: Session 22 logged out. Waiting for processes to exit.
Sep 4 01:01:00.223700 systemd-logind[1901]: Removed session 22.
Sep 4 01:01:00.227014 systemd[1]: Started sshd@35-147.28.180.77:22-147.75.109.163:50272.service - OpenSSH per-connection server daemon (147.75.109.163:50272).
Sep 4 01:01:00.306856 sshd[8580]: Accepted publickey for core from 147.75.109.163 port 50272 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:01:00.307704 sshd-session[8580]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:01:00.311190 systemd-logind[1901]: New session 23 of user core.
Sep 4 01:01:00.323521 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 4 01:01:00.635999 sshd[8582]: Connection closed by 147.75.109.163 port 50272
Sep 4 01:01:00.636752 sshd-session[8580]: pam_unix(sshd:session): session closed for user core
Sep 4 01:01:00.666080 systemd[1]: sshd@35-147.28.180.77:22-147.75.109.163:50272.service: Deactivated successfully.
Sep 4 01:01:00.670268 systemd[1]: session-23.scope: Deactivated successfully.
Sep 4 01:01:00.672580 systemd-logind[1901]: Session 23 logged out. Waiting for processes to exit.
Sep 4 01:01:00.678762 systemd[1]: Started sshd@36-147.28.180.77:22-147.75.109.163:50276.service - OpenSSH per-connection server daemon (147.75.109.163:50276).
Sep 4 01:01:00.680831 systemd-logind[1901]: Removed session 23.
Sep 4 01:01:00.766369 sshd[8605]: Accepted publickey for core from 147.75.109.163 port 50276 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:01:00.767544 sshd-session[8605]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:01:00.772206 systemd-logind[1901]: New session 24 of user core.
Sep 4 01:01:00.789573 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 4 01:01:01.128903 containerd[1910]: time="2025-09-04T01:01:01.128856436Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ce9bda5befb6ad7d1c4a3d38dc2cc6085848919fc07a7b603fbc5a2a3ac7a80\" id:\"a6f32d85338becb71a0809821837d4fea6ab2e507033843daa7a988e38c5c542\" pid:8638 exited_at:{seconds:1756947661 nanos:128747784}"
Sep 4 01:01:01.995964 sshd[8608]: Connection closed by 147.75.109.163 port 50276
Sep 4 01:01:01.996121 sshd-session[8605]: pam_unix(sshd:session): session closed for user core
Sep 4 01:01:02.004275 systemd[1]: sshd@36-147.28.180.77:22-147.75.109.163:50276.service: Deactivated successfully.
Sep 4 01:01:02.005181 systemd[1]: session-24.scope: Deactivated successfully.
Sep 4 01:01:02.005284 systemd[1]: session-24.scope: Consumed 614ms CPU time, 78.7M memory peak.
Sep 4 01:01:02.005575 systemd-logind[1901]: Session 24 logged out. Waiting for processes to exit.
Sep 4 01:01:02.006637 systemd[1]: Started sshd@37-147.28.180.77:22-147.75.109.163:50292.service - OpenSSH per-connection server daemon (147.75.109.163:50292).
Sep 4 01:01:02.007119 systemd-logind[1901]: Removed session 24.
Sep 4 01:01:02.037396 sshd[8658]: Accepted publickey for core from 147.75.109.163 port 50292 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:01:02.038076 sshd-session[8658]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:01:02.040810 systemd-logind[1901]: New session 25 of user core.
Sep 4 01:01:02.058375 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 4 01:01:02.273286 sshd[8663]: Connection closed by 147.75.109.163 port 50292
Sep 4 01:01:02.273503 sshd-session[8658]: pam_unix(sshd:session): session closed for user core
Sep 4 01:01:02.293408 systemd[1]: sshd@37-147.28.180.77:22-147.75.109.163:50292.service: Deactivated successfully.
Sep 4 01:01:02.295186 systemd[1]: session-25.scope: Deactivated successfully.
Sep 4 01:01:02.296191 systemd-logind[1901]: Session 25 logged out. Waiting for processes to exit.
Sep 4 01:01:02.298927 systemd[1]: Started sshd@38-147.28.180.77:22-147.75.109.163:50298.service - OpenSSH per-connection server daemon (147.75.109.163:50298).
Sep 4 01:01:02.299838 systemd-logind[1901]: Removed session 25.
Sep 4 01:01:02.378408 sshd[8686]: Accepted publickey for core from 147.75.109.163 port 50298 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:01:02.379890 sshd-session[8686]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:01:02.385747 systemd-logind[1901]: New session 26 of user core.
Sep 4 01:01:02.396274 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 4 01:01:02.542109 sshd[8690]: Connection closed by 147.75.109.163 port 50298
Sep 4 01:01:02.542286 sshd-session[8686]: pam_unix(sshd:session): session closed for user core
Sep 4 01:01:02.544309 systemd[1]: sshd@38-147.28.180.77:22-147.75.109.163:50298.service: Deactivated successfully.
Sep 4 01:01:02.545422 systemd[1]: session-26.scope: Deactivated successfully.
Sep 4 01:01:02.546198 systemd-logind[1901]: Session 26 logged out. Waiting for processes to exit.
Sep 4 01:01:02.547000 systemd-logind[1901]: Removed session 26.
Sep 4 01:01:05.994971 containerd[1910]: time="2025-09-04T01:01:05.994946386Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbb20fd0dbe93b7d8d81f058cd9ef9eb66e2d261b9ef0044de2db45e88a325c\" id:\"bbcc37891509365ae44bfd136e24cc5da4df66b07e1fe6269734021860741c2f\" pid:8732 exited_at:{seconds:1756947665 nanos:994766049}"
Sep 4 01:01:07.572077 systemd[1]: Started sshd@39-147.28.180.77:22-147.75.109.163:50314.service - OpenSSH per-connection server daemon (147.75.109.163:50314).
Sep 4 01:01:07.629360 sshd[8754]: Accepted publickey for core from 147.75.109.163 port 50314 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:01:07.632725 sshd-session[8754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:01:07.645428 systemd-logind[1901]: New session 27 of user core.
Sep 4 01:01:07.665525 systemd[1]: Started session-27.scope - Session 27 of User core.
Sep 4 01:01:07.815734 sshd[8756]: Connection closed by 147.75.109.163 port 50314
Sep 4 01:01:07.815953 sshd-session[8754]: pam_unix(sshd:session): session closed for user core
Sep 4 01:01:07.818626 systemd[1]: sshd@39-147.28.180.77:22-147.75.109.163:50314.service: Deactivated successfully.
Sep 4 01:01:07.819797 systemd[1]: session-27.scope: Deactivated successfully.
Sep 4 01:01:07.820345 systemd-logind[1901]: Session 27 logged out. Waiting for processes to exit.
Sep 4 01:01:07.821077 systemd-logind[1901]: Removed session 27.
Sep 4 01:01:07.825201 systemd[1]: sshd@20-147.28.180.77:22-183.23.62.16:26849.service: Deactivated successfully.
Sep 4 01:01:08.304366 systemd[1]: Started sshd@40-147.28.180.77:22-183.23.62.16:60501.service - OpenSSH per-connection server daemon (183.23.62.16:60501).
Sep 4 01:01:12.842434 systemd[1]: Started sshd@41-147.28.180.77:22-147.75.109.163:59288.service - OpenSSH per-connection server daemon (147.75.109.163:59288).
Sep 4 01:01:12.918116 sshd[8785]: Accepted publickey for core from 147.75.109.163 port 59288 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:01:12.918758 sshd-session[8785]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:01:12.921234 systemd-logind[1901]: New session 28 of user core.
Sep 4 01:01:12.932418 systemd[1]: Started session-28.scope - Session 28 of User core.
Sep 4 01:01:13.019978 sshd[8787]: Connection closed by 147.75.109.163 port 59288
Sep 4 01:01:13.020144 sshd-session[8785]: pam_unix(sshd:session): session closed for user core
Sep 4 01:01:13.021752 systemd[1]: sshd@41-147.28.180.77:22-147.75.109.163:59288.service: Deactivated successfully.
Sep 4 01:01:13.022761 systemd[1]: session-28.scope: Deactivated successfully.
Sep 4 01:01:13.023728 systemd-logind[1901]: Session 28 logged out. Waiting for processes to exit.
Sep 4 01:01:13.024310 systemd-logind[1901]: Removed session 28.
Sep 4 01:01:17.989092 containerd[1910]: time="2025-09-04T01:01:17.989036366Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4254f4e464050ec109ca730b5649f7665df7aed39e330d8bbdb2cf6fa4e644ad\" id:\"04d8c1f542693d4aebb70fc225c08cedddd9d7904ac9b5640673655a5ef64c5c\" pid:8823 exited_at:{seconds:1756947677 nanos:988801383}"
Sep 4 01:01:18.041638 systemd[1]: Started sshd@42-147.28.180.77:22-147.75.109.163:59300.service - OpenSSH per-connection server daemon (147.75.109.163:59300).
Sep 4 01:01:18.137296 sshd[8848]: Accepted publickey for core from 147.75.109.163 port 59300 ssh2: RSA SHA256:YmcAm0Fk+vEYfpMhgN4+dwanIw0d08NPls5GDM5QrOM
Sep 4 01:01:18.140324 sshd-session[8848]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 01:01:18.152249 systemd-logind[1901]: New session 29 of user core.
Sep 4 01:01:18.161571 systemd[1]: Started session-29.scope - Session 29 of User core.
Sep 4 01:01:18.275782 sshd[8850]: Connection closed by 147.75.109.163 port 59300
Sep 4 01:01:18.275897 sshd-session[8848]: pam_unix(sshd:session): session closed for user core
Sep 4 01:01:18.277546 systemd[1]: sshd@42-147.28.180.77:22-147.75.109.163:59300.service: Deactivated successfully.
Sep 4 01:01:18.278499 systemd[1]: session-29.scope: Deactivated successfully.
Sep 4 01:01:18.279123 systemd-logind[1901]: Session 29 logged out. Waiting for processes to exit.
Sep 4 01:01:18.279895 systemd-logind[1901]: Removed session 29.