Sep 12 22:28:43.770957 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Sep 12 22:28:43.770977 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Sep 12 20:38:46 -00 2025 Sep 12 22:28:43.770987 kernel: KASLR enabled Sep 12 22:28:43.770992 kernel: efi: EFI v2.7 by EDK II Sep 12 22:28:43.770998 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb228018 ACPI 2.0=0xdb9b8018 RNG=0xdb9b8a18 MEMRESERVE=0xdb21fd18 Sep 12 22:28:43.771003 kernel: random: crng init done Sep 12 22:28:43.771010 kernel: Kernel is locked down from EFI Secure Boot; see man kernel_lockdown.7 Sep 12 22:28:43.771016 kernel: secureboot: Secure boot enabled Sep 12 22:28:43.771022 kernel: ACPI: Early table checksum verification disabled Sep 12 22:28:43.771029 kernel: ACPI: RSDP 0x00000000DB9B8018 000024 (v02 BOCHS ) Sep 12 22:28:43.771035 kernel: ACPI: XSDT 0x00000000DB9B8F18 000064 (v01 BOCHS BXPC 00000001 01000013) Sep 12 22:28:43.771041 kernel: ACPI: FACP 0x00000000DB9B8B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 22:28:43.771046 kernel: ACPI: DSDT 0x00000000DB904018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 22:28:43.771052 kernel: ACPI: APIC 0x00000000DB9B8C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 22:28:43.771059 kernel: ACPI: PPTT 0x00000000DB9B8098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 22:28:43.771067 kernel: ACPI: GTDT 0x00000000DB9B8818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 22:28:43.771073 kernel: ACPI: MCFG 0x00000000DB9B8A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 22:28:43.771079 kernel: ACPI: SPCR 0x00000000DB9B8918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 22:28:43.771093 kernel: ACPI: DBG2 0x00000000DB9B8998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 22:28:43.771099 kernel: ACPI: IORT 0x00000000DB9B8198 000080 (v03 BOCHS BXPC 00000001 
BXPC 00000001) Sep 12 22:28:43.771105 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600 Sep 12 22:28:43.771111 kernel: ACPI: Use ACPI SPCR as default console: No Sep 12 22:28:43.771117 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff] Sep 12 22:28:43.771123 kernel: NODE_DATA(0) allocated [mem 0xdc737a00-0xdc73efff] Sep 12 22:28:43.771129 kernel: Zone ranges: Sep 12 22:28:43.771137 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff] Sep 12 22:28:43.771143 kernel: DMA32 empty Sep 12 22:28:43.771149 kernel: Normal empty Sep 12 22:28:43.771155 kernel: Device empty Sep 12 22:28:43.771161 kernel: Movable zone start for each node Sep 12 22:28:43.771167 kernel: Early memory node ranges Sep 12 22:28:43.771173 kernel: node 0: [mem 0x0000000040000000-0x00000000dbb4ffff] Sep 12 22:28:43.771179 kernel: node 0: [mem 0x00000000dbb50000-0x00000000dbe7ffff] Sep 12 22:28:43.771185 kernel: node 0: [mem 0x00000000dbe80000-0x00000000dbe9ffff] Sep 12 22:28:43.771191 kernel: node 0: [mem 0x00000000dbea0000-0x00000000dbedffff] Sep 12 22:28:43.771197 kernel: node 0: [mem 0x00000000dbee0000-0x00000000dbf1ffff] Sep 12 22:28:43.771203 kernel: node 0: [mem 0x00000000dbf20000-0x00000000dbf6ffff] Sep 12 22:28:43.771211 kernel: node 0: [mem 0x00000000dbf70000-0x00000000dcbfffff] Sep 12 22:28:43.771216 kernel: node 0: [mem 0x00000000dcc00000-0x00000000dcfdffff] Sep 12 22:28:43.771223 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff] Sep 12 22:28:43.771232 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff] Sep 12 22:28:43.771238 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges Sep 12 22:28:43.771245 kernel: cma: Reserved 16 MiB at 0x00000000d7a00000 on node -1 Sep 12 22:28:43.771251 kernel: psci: probing for conduit method from ACPI. Sep 12 22:28:43.771259 kernel: psci: PSCIv1.1 detected in firmware. 
Sep 12 22:28:43.771266 kernel: psci: Using standard PSCI v0.2 function IDs Sep 12 22:28:43.771272 kernel: psci: Trusted OS migration not required Sep 12 22:28:43.771279 kernel: psci: SMC Calling Convention v1.1 Sep 12 22:28:43.771285 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Sep 12 22:28:43.771292 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Sep 12 22:28:43.771298 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Sep 12 22:28:43.771305 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 Sep 12 22:28:43.771311 kernel: Detected PIPT I-cache on CPU0 Sep 12 22:28:43.771319 kernel: CPU features: detected: GIC system register CPU interface Sep 12 22:28:43.771325 kernel: CPU features: detected: Spectre-v4 Sep 12 22:28:43.771331 kernel: CPU features: detected: Spectre-BHB Sep 12 22:28:43.771338 kernel: CPU features: kernel page table isolation forced ON by KASLR Sep 12 22:28:43.771344 kernel: CPU features: detected: Kernel page table isolation (KPTI) Sep 12 22:28:43.771351 kernel: CPU features: detected: ARM erratum 1418040 Sep 12 22:28:43.771357 kernel: CPU features: detected: SSBS not fully self-synchronizing Sep 12 22:28:43.771364 kernel: alternatives: applying boot alternatives Sep 12 22:28:43.771371 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=319fa5fb212e5dd8bf766d2f9f0bbb61d6aa6c81f2813f4b5b49defba0af2b2f Sep 12 22:28:43.771378 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Sep 12 22:28:43.771384 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Sep 12 22:28:43.771392 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 12 22:28:43.771399 kernel: Fallback order for Node 0: 0 Sep 12 22:28:43.771405 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072 Sep 12 22:28:43.771411 kernel: Policy zone: DMA Sep 12 22:28:43.771418 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 12 22:28:43.771424 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB Sep 12 22:28:43.771430 kernel: software IO TLB: area num 4. Sep 12 22:28:43.771437 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB Sep 12 22:28:43.771443 kernel: software IO TLB: mapped [mem 0x00000000db504000-0x00000000db904000] (4MB) Sep 12 22:28:43.771450 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Sep 12 22:28:43.771456 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 12 22:28:43.771463 kernel: rcu: RCU event tracing is enabled. Sep 12 22:28:43.771471 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Sep 12 22:28:43.771477 kernel: Trampoline variant of Tasks RCU enabled. Sep 12 22:28:43.771484 kernel: Tracing variant of Tasks RCU enabled. Sep 12 22:28:43.771490 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 12 22:28:43.771497 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Sep 12 22:28:43.771503 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 12 22:28:43.771510 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. 
Sep 12 22:28:43.771516 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Sep 12 22:28:43.771523 kernel: GICv3: 256 SPIs implemented Sep 12 22:28:43.771529 kernel: GICv3: 0 Extended SPIs implemented Sep 12 22:28:43.771536 kernel: Root IRQ handler: gic_handle_irq Sep 12 22:28:43.771543 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Sep 12 22:28:43.771550 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Sep 12 22:28:43.771556 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Sep 12 22:28:43.771562 kernel: ITS [mem 0x08080000-0x0809ffff] Sep 12 22:28:43.771569 kernel: ITS@0x0000000008080000: allocated 8192 Devices @40110000 (indirect, esz 8, psz 64K, shr 1) Sep 12 22:28:43.771575 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @40120000 (flat, esz 8, psz 64K, shr 1) Sep 12 22:28:43.771582 kernel: GICv3: using LPI property table @0x0000000040130000 Sep 12 22:28:43.771588 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040140000 Sep 12 22:28:43.771594 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 12 22:28:43.771601 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 12 22:28:43.771607 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Sep 12 22:28:43.771614 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Sep 12 22:28:43.771621 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Sep 12 22:28:43.771628 kernel: arm-pv: using stolen time PV Sep 12 22:28:43.771634 kernel: Console: colour dummy device 80x25 Sep 12 22:28:43.771641 kernel: ACPI: Core revision 20240827 Sep 12 22:28:43.771648 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 
50.00 BogoMIPS (lpj=25000) Sep 12 22:28:43.771654 kernel: pid_max: default: 32768 minimum: 301 Sep 12 22:28:43.771661 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 12 22:28:43.771667 kernel: landlock: Up and running. Sep 12 22:28:43.771734 kernel: SELinux: Initializing. Sep 12 22:28:43.771743 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 12 22:28:43.771750 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 12 22:28:43.771757 kernel: rcu: Hierarchical SRCU implementation. Sep 12 22:28:43.771764 kernel: rcu: Max phase no-delay instances is 400. Sep 12 22:28:43.771770 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Sep 12 22:28:43.771777 kernel: Remapping and enabling EFI services. Sep 12 22:28:43.771783 kernel: smp: Bringing up secondary CPUs ... Sep 12 22:28:43.771790 kernel: Detected PIPT I-cache on CPU1 Sep 12 22:28:43.771796 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Sep 12 22:28:43.771805 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040150000 Sep 12 22:28:43.771816 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 12 22:28:43.771823 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Sep 12 22:28:43.771832 kernel: Detected PIPT I-cache on CPU2 Sep 12 22:28:43.771839 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Sep 12 22:28:43.771846 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040160000 Sep 12 22:28:43.771853 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 12 22:28:43.771860 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Sep 12 22:28:43.771867 kernel: Detected PIPT I-cache on CPU3 Sep 12 22:28:43.771875 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Sep 12 22:28:43.771882 kernel: GICv3: CPU3: using allocated LPI pending table 
@0x0000000040170000 Sep 12 22:28:43.771889 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 12 22:28:43.771896 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Sep 12 22:28:43.771903 kernel: smp: Brought up 1 node, 4 CPUs Sep 12 22:28:43.771910 kernel: SMP: Total of 4 processors activated. Sep 12 22:28:43.771917 kernel: CPU: All CPU(s) started at EL1 Sep 12 22:28:43.771924 kernel: CPU features: detected: 32-bit EL0 Support Sep 12 22:28:43.771931 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Sep 12 22:28:43.771939 kernel: CPU features: detected: Common not Private translations Sep 12 22:28:43.771946 kernel: CPU features: detected: CRC32 instructions Sep 12 22:28:43.771953 kernel: CPU features: detected: Enhanced Virtualization Traps Sep 12 22:28:43.771960 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Sep 12 22:28:43.771967 kernel: CPU features: detected: LSE atomic instructions Sep 12 22:28:43.771974 kernel: CPU features: detected: Privileged Access Never Sep 12 22:28:43.771981 kernel: CPU features: detected: RAS Extension Support Sep 12 22:28:43.771988 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Sep 12 22:28:43.771995 kernel: alternatives: applying system-wide alternatives Sep 12 22:28:43.772005 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3 Sep 12 22:28:43.772012 kernel: Memory: 2422372K/2572288K available (11136K kernel code, 2440K rwdata, 9068K rodata, 38976K init, 1038K bss, 127580K reserved, 16384K cma-reserved) Sep 12 22:28:43.772019 kernel: devtmpfs: initialized Sep 12 22:28:43.772026 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 12 22:28:43.772033 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Sep 12 22:28:43.772040 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Sep 12 22:28:43.772047 kernel: 0 pages in 
range for non-PLT usage Sep 12 22:28:43.772054 kernel: 508560 pages in range for PLT usage Sep 12 22:28:43.772061 kernel: pinctrl core: initialized pinctrl subsystem Sep 12 22:28:43.772069 kernel: SMBIOS 3.0.0 present. Sep 12 22:28:43.772076 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022 Sep 12 22:28:43.772087 kernel: DMI: Memory slots populated: 1/1 Sep 12 22:28:43.772096 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 12 22:28:43.772103 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Sep 12 22:28:43.772110 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Sep 12 22:28:43.772117 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Sep 12 22:28:43.772124 kernel: audit: initializing netlink subsys (disabled) Sep 12 22:28:43.772131 kernel: audit: type=2000 audit(0.024:1): state=initialized audit_enabled=0 res=1 Sep 12 22:28:43.772140 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 12 22:28:43.772147 kernel: cpuidle: using governor menu Sep 12 22:28:43.772154 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Sep 12 22:28:43.772161 kernel: ASID allocator initialised with 32768 entries Sep 12 22:28:43.772167 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 12 22:28:43.772174 kernel: Serial: AMBA PL011 UART driver Sep 12 22:28:43.772181 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 12 22:28:43.772188 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Sep 12 22:28:43.772195 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Sep 12 22:28:43.772204 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Sep 12 22:28:43.772211 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 12 22:28:43.772218 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Sep 12 22:28:43.772224 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Sep 12 22:28:43.772231 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Sep 12 22:28:43.772238 kernel: ACPI: Added _OSI(Module Device) Sep 12 22:28:43.772245 kernel: ACPI: Added _OSI(Processor Device) Sep 12 22:28:43.772252 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 12 22:28:43.772258 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 12 22:28:43.772266 kernel: ACPI: Interpreter enabled Sep 12 22:28:43.772273 kernel: ACPI: Using GIC for interrupt routing Sep 12 22:28:43.772280 kernel: ACPI: MCFG table detected, 1 entries Sep 12 22:28:43.772287 kernel: ACPI: CPU0 has been hot-added Sep 12 22:28:43.772294 kernel: ACPI: CPU1 has been hot-added Sep 12 22:28:43.772301 kernel: ACPI: CPU2 has been hot-added Sep 12 22:28:43.772307 kernel: ACPI: CPU3 has been hot-added Sep 12 22:28:43.772314 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Sep 12 22:28:43.772321 kernel: printk: legacy console [ttyAMA0] enabled Sep 12 22:28:43.772330 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 12 22:28:43.772461 kernel: acpi PNP0A08:00: _OSC: 
OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 12 22:28:43.772525 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Sep 12 22:28:43.772582 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Sep 12 22:28:43.772639 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Sep 12 22:28:43.772711 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Sep 12 22:28:43.772721 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Sep 12 22:28:43.772731 kernel: PCI host bridge to bus 0000:00 Sep 12 22:28:43.772797 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Sep 12 22:28:43.772852 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Sep 12 22:28:43.772904 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Sep 12 22:28:43.772958 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 12 22:28:43.773036 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Sep 12 22:28:43.773116 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint Sep 12 22:28:43.773181 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f] Sep 12 22:28:43.773240 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff] Sep 12 22:28:43.773299 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Sep 12 22:28:43.773358 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Sep 12 22:28:43.773419 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned Sep 12 22:28:43.773477 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned Sep 12 22:28:43.773531 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Sep 12 22:28:43.773583 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Sep 12 22:28:43.773634 kernel: pci_bus 0000:00: 
resource 6 [mem 0x8000000000-0xffffffffff window] Sep 12 22:28:43.773643 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Sep 12 22:28:43.773650 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Sep 12 22:28:43.773657 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Sep 12 22:28:43.773664 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Sep 12 22:28:43.773685 kernel: iommu: Default domain type: Translated Sep 12 22:28:43.773699 kernel: iommu: DMA domain TLB invalidation policy: strict mode Sep 12 22:28:43.773708 kernel: efivars: Registered efivars operations Sep 12 22:28:43.773715 kernel: vgaarb: loaded Sep 12 22:28:43.773722 kernel: clocksource: Switched to clocksource arch_sys_counter Sep 12 22:28:43.773729 kernel: VFS: Disk quotas dquot_6.6.0 Sep 12 22:28:43.773736 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 12 22:28:43.773742 kernel: pnp: PnP ACPI init Sep 12 22:28:43.773819 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Sep 12 22:28:43.773829 kernel: pnp: PnP ACPI: found 1 devices Sep 12 22:28:43.773838 kernel: NET: Registered PF_INET protocol family Sep 12 22:28:43.773845 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 12 22:28:43.773852 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 12 22:28:43.773860 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 12 22:28:43.773867 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 12 22:28:43.773874 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 12 22:28:43.773880 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 12 22:28:43.773887 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 12 22:28:43.773894 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 12 
22:28:43.773903 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 12 22:28:43.773910 kernel: PCI: CLS 0 bytes, default 64 Sep 12 22:28:43.773917 kernel: kvm [1]: HYP mode not available Sep 12 22:28:43.773924 kernel: Initialise system trusted keyrings Sep 12 22:28:43.773931 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 12 22:28:43.773938 kernel: Key type asymmetric registered Sep 12 22:28:43.773945 kernel: Asymmetric key parser 'x509' registered Sep 12 22:28:43.773952 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Sep 12 22:28:43.773959 kernel: io scheduler mq-deadline registered Sep 12 22:28:43.773967 kernel: io scheduler kyber registered Sep 12 22:28:43.773974 kernel: io scheduler bfq registered Sep 12 22:28:43.773981 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Sep 12 22:28:43.773988 kernel: ACPI: button: Power Button [PWRB] Sep 12 22:28:43.773996 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Sep 12 22:28:43.774057 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007) Sep 12 22:28:43.774066 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 12 22:28:43.774073 kernel: thunder_xcv, ver 1.0 Sep 12 22:28:43.774080 kernel: thunder_bgx, ver 1.0 Sep 12 22:28:43.774095 kernel: nicpf, ver 1.0 Sep 12 22:28:43.774102 kernel: nicvf, ver 1.0 Sep 12 22:28:43.774172 kernel: rtc-efi rtc-efi.0: registered as rtc0 Sep 12 22:28:43.774231 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-12T22:28:43 UTC (1757716123) Sep 12 22:28:43.774241 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 12 22:28:43.774248 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Sep 12 22:28:43.774256 kernel: watchdog: NMI not fully supported Sep 12 22:28:43.774263 kernel: watchdog: Hard watchdog permanently disabled Sep 12 22:28:43.774272 kernel: NET: Registered PF_INET6 protocol family Sep 12 22:28:43.774279 
kernel: Segment Routing with IPv6 Sep 12 22:28:43.774286 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 22:28:43.774293 kernel: NET: Registered PF_PACKET protocol family Sep 12 22:28:43.774304 kernel: Key type dns_resolver registered Sep 12 22:28:43.774312 kernel: registered taskstats version 1 Sep 12 22:28:43.774318 kernel: Loading compiled-in X.509 certificates Sep 12 22:28:43.774326 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: 2d7730e6d35b3fbd1c590cd72a2500b2380c020e' Sep 12 22:28:43.774333 kernel: Demotion targets for Node 0: null Sep 12 22:28:43.774341 kernel: Key type .fscrypt registered Sep 12 22:28:43.774348 kernel: Key type fscrypt-provisioning registered Sep 12 22:28:43.774355 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 12 22:28:43.774362 kernel: ima: Allocated hash algorithm: sha1 Sep 12 22:28:43.774369 kernel: ima: No architecture policies found Sep 12 22:28:43.774376 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Sep 12 22:28:43.774383 kernel: clk: Disabling unused clocks Sep 12 22:28:43.774389 kernel: PM: genpd: Disabling unused power domains Sep 12 22:28:43.774396 kernel: Warning: unable to open an initial console. Sep 12 22:28:43.774405 kernel: Freeing unused kernel memory: 38976K Sep 12 22:28:43.774412 kernel: Run /init as init process Sep 12 22:28:43.774419 kernel: with arguments: Sep 12 22:28:43.774425 kernel: /init Sep 12 22:28:43.774432 kernel: with environment: Sep 12 22:28:43.774439 kernel: HOME=/ Sep 12 22:28:43.774446 kernel: TERM=linux Sep 12 22:28:43.774452 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 22:28:43.774460 systemd[1]: Successfully made /usr/ read-only. 
Sep 12 22:28:43.774471 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 22:28:43.774479 systemd[1]: Detected virtualization kvm. Sep 12 22:28:43.774487 systemd[1]: Detected architecture arm64. Sep 12 22:28:43.774494 systemd[1]: Running in initrd. Sep 12 22:28:43.774501 systemd[1]: No hostname configured, using default hostname. Sep 12 22:28:43.774509 systemd[1]: Hostname set to . Sep 12 22:28:43.774516 systemd[1]: Initializing machine ID from VM UUID. Sep 12 22:28:43.774525 systemd[1]: Queued start job for default target initrd.target. Sep 12 22:28:43.774532 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:28:43.774540 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:28:43.774548 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 22:28:43.774555 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 22:28:43.774563 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 22:28:43.774571 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 22:28:43.774581 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 22:28:43.774589 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 22:28:43.774596 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Sep 12 22:28:43.774604 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:28:43.774611 systemd[1]: Reached target paths.target - Path Units. Sep 12 22:28:43.774618 systemd[1]: Reached target slices.target - Slice Units. Sep 12 22:28:43.774626 systemd[1]: Reached target swap.target - Swaps. Sep 12 22:28:43.774633 systemd[1]: Reached target timers.target - Timer Units. Sep 12 22:28:43.774642 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 22:28:43.774650 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 22:28:43.774657 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 22:28:43.774665 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 12 22:28:43.774683 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 22:28:43.774691 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 22:28:43.774699 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:28:43.774706 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 22:28:43.774714 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 22:28:43.774723 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 22:28:43.774731 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 12 22:28:43.774739 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 22:28:43.774747 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 22:28:43.774754 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 22:28:43.774762 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Sep 12 22:28:43.774770 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:28:43.774777 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 22:28:43.774787 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:28:43.774795 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 22:28:43.774802 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 22:28:43.774826 systemd-journald[243]: Collecting audit messages is disabled. Sep 12 22:28:43.774847 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:28:43.774856 systemd-journald[243]: Journal started Sep 12 22:28:43.774875 systemd-journald[243]: Runtime Journal (/run/log/journal/be729ef74a6a4c7c9cd18e23a7957fe6) is 6M, max 48.5M, 42.4M free. Sep 12 22:28:43.763473 systemd-modules-load[245]: Inserted module 'overlay' Sep 12 22:28:43.779697 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 12 22:28:43.779723 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 22:28:43.781361 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 22:28:43.784172 kernel: Bridge firewalling registered Sep 12 22:28:43.781597 systemd-modules-load[245]: Inserted module 'br_netfilter' Sep 12 22:28:43.783048 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 22:28:43.788194 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 22:28:43.790318 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 22:28:43.792317 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Sep 12 22:28:43.804290 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 22:28:43.811344 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:28:43.811916 systemd-tmpfiles[266]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 22:28:43.812851 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:28:43.817292 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:28:43.820231 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 22:28:43.838812 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 22:28:43.840915 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 12 22:28:43.855091 dracut-cmdline[289]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=319fa5fb212e5dd8bf766d2f9f0bbb61d6aa6c81f2813f4b5b49defba0af2b2f Sep 12 22:28:43.869409 systemd-resolved[280]: Positive Trust Anchors: Sep 12 22:28:43.869427 systemd-resolved[280]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 22:28:43.869458 systemd-resolved[280]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 22:28:43.874218 systemd-resolved[280]: Defaulting to hostname 'linux'. Sep 12 22:28:43.875120 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 22:28:43.879732 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:28:43.926701 kernel: SCSI subsystem initialized Sep 12 22:28:43.931690 kernel: Loading iSCSI transport class v2.0-870. Sep 12 22:28:43.938703 kernel: iscsi: registered transport (tcp) Sep 12 22:28:43.951776 kernel: iscsi: registered transport (qla4xxx) Sep 12 22:28:43.951792 kernel: QLogic iSCSI HBA Driver Sep 12 22:28:43.969031 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 22:28:43.987126 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:28:43.989915 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 22:28:44.032766 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 22:28:44.035057 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Sep 12 22:28:44.088711 kernel: raid6: neonx8 gen() 15767 MB/s
Sep 12 22:28:44.105702 kernel: raid6: neonx4 gen() 15773 MB/s
Sep 12 22:28:44.122700 kernel: raid6: neonx2 gen() 13240 MB/s
Sep 12 22:28:44.139699 kernel: raid6: neonx1 gen() 10426 MB/s
Sep 12 22:28:44.156699 kernel: raid6: int64x8 gen() 6886 MB/s
Sep 12 22:28:44.173698 kernel: raid6: int64x4 gen() 7349 MB/s
Sep 12 22:28:44.190700 kernel: raid6: int64x2 gen() 6104 MB/s
Sep 12 22:28:44.207829 kernel: raid6: int64x1 gen() 5047 MB/s
Sep 12 22:28:44.207843 kernel: raid6: using algorithm neonx4 gen() 15773 MB/s
Sep 12 22:28:44.225776 kernel: raid6: .... xor() 12314 MB/s, rmw enabled
Sep 12 22:28:44.225792 kernel: raid6: using neon recovery algorithm
Sep 12 22:28:44.230698 kernel: xor: measuring software checksum speed
Sep 12 22:28:44.232084 kernel: 8regs : 18544 MB/sec
Sep 12 22:28:44.232100 kernel: 32regs : 21636 MB/sec
Sep 12 22:28:44.232719 kernel: arm64_neon : 26331 MB/sec
Sep 12 22:28:44.232731 kernel: xor: using function: arm64_neon (26331 MB/sec)
Sep 12 22:28:44.285705 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 12 22:28:44.291875 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 22:28:44.294393 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 22:28:44.322551 systemd-udevd[496]: Using default interface naming scheme 'v255'.
Sep 12 22:28:44.327234 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 22:28:44.329255 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 12 22:28:44.360563 dracut-pre-trigger[503]: rd.md=0: removing MD RAID activation
Sep 12 22:28:44.382709 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 22:28:44.384922 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 22:28:44.435527 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 22:28:44.438122 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 12 22:28:44.486595 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues
Sep 12 22:28:44.488761 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Sep 12 22:28:44.496827 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 22:28:44.496951 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 22:28:44.500472 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 22:28:44.508532 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 12 22:28:44.508559 kernel: GPT:9289727 != 19775487
Sep 12 22:28:44.508569 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 12 22:28:44.508577 kernel: GPT:9289727 != 19775487
Sep 12 22:28:44.508585 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 12 22:28:44.508594 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 12 22:28:44.502540 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 22:28:44.534023 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 22:28:44.547417 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Sep 12 22:28:44.548927 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 12 22:28:44.557464 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Sep 12 22:28:44.565600 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 12 22:28:44.571896 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Sep 12 22:28:44.573120 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Sep 12 22:28:44.575415 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 22:28:44.578407 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 22:28:44.580562 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 22:28:44.583257 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 12 22:28:44.585134 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 12 22:28:44.604052 disk-uuid[591]: Primary Header is updated.
Sep 12 22:28:44.604052 disk-uuid[591]: Secondary Entries is updated.
Sep 12 22:28:44.604052 disk-uuid[591]: Secondary Header is updated.
Sep 12 22:28:44.607368 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 12 22:28:44.610234 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 22:28:45.615607 disk-uuid[594]: The operation has completed successfully.
Sep 12 22:28:45.616770 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 12 22:28:45.639219 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 12 22:28:45.639319 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 12 22:28:45.665634 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 12 22:28:45.687317 sh[610]: Success
Sep 12 22:28:45.700475 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 12 22:28:45.700518 kernel: device-mapper: uevent: version 1.0.3
Sep 12 22:28:45.700531 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 12 22:28:45.706687 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Sep 12 22:28:45.728739 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 12 22:28:45.730901 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 12 22:28:45.743709 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 12 22:28:45.749203 kernel: BTRFS: device fsid 254e43f1-b609-42b8-bcc5-437252095415 devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (622)
Sep 12 22:28:45.749235 kernel: BTRFS info (device dm-0): first mount of filesystem 254e43f1-b609-42b8-bcc5-437252095415
Sep 12 22:28:45.749245 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 12 22:28:45.754039 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 12 22:28:45.754065 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 12 22:28:45.754984 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 12 22:28:45.756166 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 12 22:28:45.757653 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 12 22:28:45.758300 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 12 22:28:45.760036 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 12 22:28:45.786685 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (653)
Sep 12 22:28:45.788875 kernel: BTRFS info (device vda6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0
Sep 12 22:28:45.788912 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 22:28:45.791352 kernel: BTRFS info (device vda6): turning on async discard
Sep 12 22:28:45.791393 kernel: BTRFS info (device vda6): enabling free space tree
Sep 12 22:28:45.796694 kernel: BTRFS info (device vda6): last unmount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0
Sep 12 22:28:45.796823 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 12 22:28:45.798492 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 12 22:28:45.866652 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 22:28:45.869584 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 22:28:45.900172 ignition[700]: Ignition 2.22.0
Sep 12 22:28:45.900187 ignition[700]: Stage: fetch-offline
Sep 12 22:28:45.900214 ignition[700]: no configs at "/usr/lib/ignition/base.d"
Sep 12 22:28:45.900222 ignition[700]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 12 22:28:45.900309 ignition[700]: parsed url from cmdline: ""
Sep 12 22:28:45.900312 ignition[700]: no config URL provided
Sep 12 22:28:45.900316 ignition[700]: reading system config file "/usr/lib/ignition/user.ign"
Sep 12 22:28:45.900322 ignition[700]: no config at "/usr/lib/ignition/user.ign"
Sep 12 22:28:45.900340 ignition[700]: op(1): [started] loading QEMU firmware config module
Sep 12 22:28:45.900349 ignition[700]: op(1): executing: "modprobe" "qemu_fw_cfg"
Sep 12 22:28:45.906070 ignition[700]: op(1): [finished] loading QEMU firmware config module
Sep 12 22:28:45.910910 systemd-networkd[808]: lo: Link UP
Sep 12 22:28:45.910922 systemd-networkd[808]: lo: Gained carrier
Sep 12 22:28:45.911585 systemd-networkd[808]: Enumeration completed
Sep 12 22:28:45.911827 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 22:28:45.911962 systemd-networkd[808]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:28:45.911966 systemd-networkd[808]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 22:28:45.912356 systemd-networkd[808]: eth0: Link UP
Sep 12 22:28:45.912650 systemd-networkd[808]: eth0: Gained carrier
Sep 12 22:28:45.912660 systemd-networkd[808]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:28:45.914306 systemd[1]: Reached target network.target - Network.
Sep 12 22:28:45.935707 systemd-networkd[808]: eth0: DHCPv4 address 10.0.0.148/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 12 22:28:45.959091 ignition[700]: parsing config with SHA512: 5b8dbc19ce0318fc9138d1f75ecf40181c6068c7aa6a5096605032be67fae5a777db153954bed408075e892fd8e2c4293f53a025d28fbfa28be22424a4524e04
Sep 12 22:28:45.963088 unknown[700]: fetched base config from "system"
Sep 12 22:28:45.963102 unknown[700]: fetched user config from "qemu"
Sep 12 22:28:45.963485 ignition[700]: fetch-offline: fetch-offline passed
Sep 12 22:28:45.963789 ignition[700]: Ignition finished successfully
Sep 12 22:28:45.965544 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 22:28:45.967470 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Sep 12 22:28:45.968182 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 12 22:28:46.002336 ignition[818]: Ignition 2.22.0
Sep 12 22:28:46.002352 ignition[818]: Stage: kargs
Sep 12 22:28:46.002484 ignition[818]: no configs at "/usr/lib/ignition/base.d"
Sep 12 22:28:46.002493 ignition[818]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 12 22:28:46.003235 ignition[818]: kargs: kargs passed
Sep 12 22:28:46.006186 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 12 22:28:46.003278 ignition[818]: Ignition finished successfully
Sep 12 22:28:46.008111 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 12 22:28:46.036970 ignition[826]: Ignition 2.22.0
Sep 12 22:28:46.036990 ignition[826]: Stage: disks
Sep 12 22:28:46.037141 ignition[826]: no configs at "/usr/lib/ignition/base.d"
Sep 12 22:28:46.037150 ignition[826]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 12 22:28:46.037886 ignition[826]: disks: disks passed
Sep 12 22:28:46.041015 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 12 22:28:46.037927 ignition[826]: Ignition finished successfully
Sep 12 22:28:46.042358 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 12 22:28:46.044065 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 12 22:28:46.045846 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 22:28:46.047602 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 22:28:46.049654 systemd[1]: Reached target basic.target - Basic System.
Sep 12 22:28:46.052221 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 12 22:28:46.074448 systemd-fsck[837]: ROOT: clean, 15/553520 files, 52789/553472 blocks
Sep 12 22:28:46.078470 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 12 22:28:46.081407 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 12 22:28:46.149697 kernel: EXT4-fs (vda9): mounted filesystem a7b592ec-3c41-4dc2-88a7-056c1f18b418 r/w with ordered data mode. Quota mode: none.
Sep 12 22:28:46.150565 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 12 22:28:46.151872 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 12 22:28:46.154205 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 22:28:46.155765 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 12 22:28:46.156702 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 12 22:28:46.156739 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 12 22:28:46.156761 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 22:28:46.171125 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 12 22:28:46.174254 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 12 22:28:46.178781 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (845)
Sep 12 22:28:46.178800 kernel: BTRFS info (device vda6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0
Sep 12 22:28:46.178815 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 22:28:46.182454 kernel: BTRFS info (device vda6): turning on async discard
Sep 12 22:28:46.182487 kernel: BTRFS info (device vda6): enabling free space tree
Sep 12 22:28:46.183797 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 22:28:46.207567 initrd-setup-root[869]: cut: /sysroot/etc/passwd: No such file or directory
Sep 12 22:28:46.210628 initrd-setup-root[876]: cut: /sysroot/etc/group: No such file or directory
Sep 12 22:28:46.213479 initrd-setup-root[883]: cut: /sysroot/etc/shadow: No such file or directory
Sep 12 22:28:46.217351 initrd-setup-root[890]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 12 22:28:46.278695 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 12 22:28:46.280599 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 12 22:28:46.282132 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 12 22:28:46.298713 kernel: BTRFS info (device vda6): last unmount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0
Sep 12 22:28:46.306899 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 12 22:28:46.318272 ignition[959]: INFO : Ignition 2.22.0
Sep 12 22:28:46.318272 ignition[959]: INFO : Stage: mount
Sep 12 22:28:46.319794 ignition[959]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 22:28:46.319794 ignition[959]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 12 22:28:46.319794 ignition[959]: INFO : mount: mount passed
Sep 12 22:28:46.319794 ignition[959]: INFO : Ignition finished successfully
Sep 12 22:28:46.322264 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 12 22:28:46.324163 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 12 22:28:46.755916 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 12 22:28:46.757386 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 22:28:46.776143 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (971)
Sep 12 22:28:46.776170 kernel: BTRFS info (device vda6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0
Sep 12 22:28:46.777113 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 22:28:46.779686 kernel: BTRFS info (device vda6): turning on async discard
Sep 12 22:28:46.779700 kernel: BTRFS info (device vda6): enabling free space tree
Sep 12 22:28:46.780909 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 22:28:46.810461 ignition[988]: INFO : Ignition 2.22.0
Sep 12 22:28:46.810461 ignition[988]: INFO : Stage: files
Sep 12 22:28:46.812044 ignition[988]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 22:28:46.812044 ignition[988]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 12 22:28:46.812044 ignition[988]: DEBUG : files: compiled without relabeling support, skipping
Sep 12 22:28:46.815796 ignition[988]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 12 22:28:46.815796 ignition[988]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 12 22:28:46.815796 ignition[988]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 12 22:28:46.815796 ignition[988]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 12 22:28:46.821970 ignition[988]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 12 22:28:46.817148 unknown[988]: wrote ssh authorized keys file for user: core
Sep 12 22:28:46.824398 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Sep 12 22:28:46.824398 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Sep 12 22:28:46.882588 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 12 22:28:47.241272 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Sep 12 22:28:47.241272 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 12 22:28:47.245521 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 12 22:28:47.245521 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 22:28:47.245521 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 22:28:47.245521 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 22:28:47.245521 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 22:28:47.245521 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 22:28:47.245521 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 22:28:47.245521 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 22:28:47.245521 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 22:28:47.245521 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 12 22:28:47.245521 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 12 22:28:47.264585 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 12 22:28:47.264585 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1
Sep 12 22:28:47.559852 systemd-networkd[808]: eth0: Gained IPv6LL
Sep 12 22:28:47.701484 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 12 22:28:48.024629 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 12 22:28:48.024629 ignition[988]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 12 22:28:48.028819 ignition[988]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 22:28:48.028819 ignition[988]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 22:28:48.028819 ignition[988]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 12 22:28:48.028819 ignition[988]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 12 22:28:48.028819 ignition[988]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 12 22:28:48.028819 ignition[988]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 12 22:28:48.028819 ignition[988]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 12 22:28:48.028819 ignition[988]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Sep 12 22:28:48.042651 ignition[988]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 12 22:28:48.044086 ignition[988]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 12 22:28:48.044086 ignition[988]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 12 22:28:48.044086 ignition[988]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Sep 12 22:28:48.044086 ignition[988]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Sep 12 22:28:48.044086 ignition[988]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 22:28:48.044086 ignition[988]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 22:28:48.044086 ignition[988]: INFO : files: files passed
Sep 12 22:28:48.044086 ignition[988]: INFO : Ignition finished successfully
Sep 12 22:28:48.045506 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 12 22:28:48.047602 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 12 22:28:48.049782 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 12 22:28:48.067555 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 12 22:28:48.068625 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 12 22:28:48.070734 initrd-setup-root-after-ignition[1016]: grep: /sysroot/oem/oem-release: No such file or directory
Sep 12 22:28:48.071994 initrd-setup-root-after-ignition[1019]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 22:28:48.071994 initrd-setup-root-after-ignition[1019]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 22:28:48.075546 initrd-setup-root-after-ignition[1023]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 22:28:48.075360 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 22:28:48.076873 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 12 22:28:48.080403 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 12 22:28:48.128235 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 12 22:28:48.128347 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 12 22:28:48.130539 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 12 22:28:48.132465 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 12 22:28:48.134303 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 12 22:28:48.134973 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 12 22:28:48.159254 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 22:28:48.161449 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 12 22:28:48.184194 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 12 22:28:48.185497 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 22:28:48.187819 systemd[1]: Stopped target timers.target - Timer Units.
Sep 12 22:28:48.189698 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 12 22:28:48.189812 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 22:28:48.192438 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 12 22:28:48.194464 systemd[1]: Stopped target basic.target - Basic System.
Sep 12 22:28:48.196132 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 12 22:28:48.197847 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 22:28:48.199920 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 12 22:28:48.201840 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 12 22:28:48.203769 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 12 22:28:48.205760 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 22:28:48.207781 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 12 22:28:48.209765 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 12 22:28:48.211550 systemd[1]: Stopped target swap.target - Swaps.
Sep 12 22:28:48.213065 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 12 22:28:48.213192 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 22:28:48.215489 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 12 22:28:48.216736 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 22:28:48.218729 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 12 22:28:48.218833 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 22:28:48.220895 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 12 22:28:48.220996 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 12 22:28:48.223590 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 12 22:28:48.223712 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 22:28:48.225976 systemd[1]: Stopped target paths.target - Path Units.
Sep 12 22:28:48.227462 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 12 22:28:48.227568 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 22:28:48.229517 systemd[1]: Stopped target slices.target - Slice Units.
Sep 12 22:28:48.231271 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 12 22:28:48.233008 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 12 22:28:48.233102 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 22:28:48.234594 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 12 22:28:48.234682 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 22:28:48.236487 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 12 22:28:48.236592 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 22:28:48.239050 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 12 22:28:48.239160 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 12 22:28:48.241397 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 12 22:28:48.242304 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 12 22:28:48.242447 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 22:28:48.244834 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 12 22:28:48.245647 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 12 22:28:48.245800 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 22:28:48.247626 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 12 22:28:48.247745 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 22:28:48.252997 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 12 22:28:48.254791 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 12 22:28:48.262653 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 12 22:28:48.270260 ignition[1044]: INFO : Ignition 2.22.0
Sep 12 22:28:48.270260 ignition[1044]: INFO : Stage: umount
Sep 12 22:28:48.271915 ignition[1044]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 22:28:48.271915 ignition[1044]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 12 22:28:48.271915 ignition[1044]: INFO : umount: umount passed
Sep 12 22:28:48.271915 ignition[1044]: INFO : Ignition finished successfully
Sep 12 22:28:48.273333 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 12 22:28:48.273462 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 12 22:28:48.276042 systemd[1]: Stopped target network.target - Network.
Sep 12 22:28:48.277382 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 12 22:28:48.277445 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 12 22:28:48.279125 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 12 22:28:48.279173 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 12 22:28:48.280782 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 12 22:28:48.280830 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 12 22:28:48.282529 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 12 22:28:48.282571 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 12 22:28:48.284329 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 12 22:28:48.286049 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 12 22:28:48.290178 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 12 22:28:48.290286 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 12 22:28:48.293669 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 12 22:28:48.293916 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 12 22:28:48.294014 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 22:28:48.297386 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 22:28:48.297897 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 22:28:48.299813 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 22:28:48.299849 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 22:28:48.302572 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 22:28:48.303668 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 22:28:48.303759 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 22:28:48.305907 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 22:28:48.305955 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:28:48.308818 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 22:28:48.308886 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 22:28:48.310759 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 22:28:48.310803 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:28:48.314017 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:28:48.320917 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 22:28:48.320977 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 22:28:48.327647 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 22:28:48.333828 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 22:28:48.335017 systemd[1]: systemd-udevd.service: Deactivated successfully. 
Sep 12 22:28:48.335135 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:28:48.337197 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 22:28:48.337264 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 22:28:48.338697 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 22:28:48.338728 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:28:48.340887 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 22:28:48.340937 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 22:28:48.343662 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 22:28:48.343733 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 22:28:48.346525 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 22:28:48.346575 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 22:28:48.349519 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 22:28:48.349569 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 22:28:48.352168 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 22:28:48.353240 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 22:28:48.353301 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:28:48.356479 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 22:28:48.356522 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:28:48.359561 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:28:48.359606 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 12 22:28:48.363708 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 12 22:28:48.363759 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 12 22:28:48.363792 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 22:28:48.364016 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 22:28:48.374804 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 22:28:48.379594 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 22:28:48.379701 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 22:28:48.382023 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 22:28:48.384420 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 22:28:48.414506 systemd[1]: Switching root. Sep 12 22:28:48.453148 systemd-journald[243]: Journal stopped Sep 12 22:28:49.197273 systemd-journald[243]: Received SIGTERM from PID 1 (systemd). 
Sep 12 22:28:49.197324 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 22:28:49.197337 kernel: SELinux: policy capability open_perms=1 Sep 12 22:28:49.197347 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 22:28:49.197356 kernel: SELinux: policy capability always_check_network=0 Sep 12 22:28:49.197371 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 22:28:49.197382 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 22:28:49.197391 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 22:28:49.197400 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 22:28:49.197413 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 22:28:49.197422 kernel: audit: type=1403 audit(1757716128.622:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 22:28:49.197436 systemd[1]: Successfully loaded SELinux policy in 60.746ms. Sep 12 22:28:49.197456 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.189ms. Sep 12 22:28:49.197468 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 22:28:49.197479 systemd[1]: Detected virtualization kvm. Sep 12 22:28:49.197489 systemd[1]: Detected architecture arm64. Sep 12 22:28:49.197499 systemd[1]: Detected first boot. Sep 12 22:28:49.197510 systemd[1]: Initializing machine ID from VM UUID. Sep 12 22:28:49.197521 zram_generator::config[1090]: No configuration found. Sep 12 22:28:49.197533 kernel: NET: Registered PF_VSOCK protocol family Sep 12 22:28:49.197547 systemd[1]: Populated /etc with preset unit settings. Sep 12 22:28:49.197558 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. 
Sep 12 22:28:49.197568 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 22:28:49.197578 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 22:28:49.197589 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 22:28:49.197599 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 22:28:49.197610 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 22:28:49.197621 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 22:28:49.197631 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 22:28:49.197643 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 22:28:49.197653 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 22:28:49.197664 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 22:28:49.197698 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 22:28:49.197709 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:28:49.197720 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:28:49.197730 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 12 22:28:49.197741 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 22:28:49.197754 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 22:28:49.197764 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 22:28:49.197774 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... 
Sep 12 22:28:49.197784 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:28:49.197795 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:28:49.197805 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 22:28:49.197816 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 22:28:49.197826 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 22:28:49.197837 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 22:28:49.197848 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:28:49.197858 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 22:28:49.197869 systemd[1]: Reached target slices.target - Slice Units. Sep 12 22:28:49.197879 systemd[1]: Reached target swap.target - Swaps. Sep 12 22:28:49.197889 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 22:28:49.197899 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 22:28:49.197910 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 22:28:49.197920 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 22:28:49.197932 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 22:28:49.197942 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:28:49.197952 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 22:28:49.197963 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 22:28:49.197973 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 22:28:49.197983 systemd[1]: Mounting media.mount - External Media Directory... 
Sep 12 22:28:49.197993 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 22:28:49.198004 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 22:28:49.198014 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 22:28:49.198025 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 22:28:49.198036 systemd[1]: Reached target machines.target - Containers. Sep 12 22:28:49.198048 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 12 22:28:49.198058 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 22:28:49.198075 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 22:28:49.198087 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 22:28:49.198097 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 22:28:49.198107 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 22:28:49.198117 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 22:28:49.198129 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 22:28:49.198139 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 22:28:49.198150 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 22:28:49.198160 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 22:28:49.198170 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 22:28:49.198180 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. 
Sep 12 22:28:49.198191 kernel: fuse: init (API version 7.41) Sep 12 22:28:49.198200 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 22:28:49.198211 kernel: loop: module loaded Sep 12 22:28:49.198222 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:28:49.198232 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 22:28:49.198242 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 22:28:49.198252 kernel: ACPI: bus type drm_connector registered Sep 12 22:28:49.198262 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 22:28:49.198273 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 22:28:49.198283 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 12 22:28:49.198294 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 22:28:49.198305 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 22:28:49.198316 systemd[1]: Stopped verity-setup.service. Sep 12 22:28:49.198326 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 22:28:49.198336 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 22:28:49.198346 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 22:28:49.198377 systemd-journald[1165]: Collecting audit messages is disabled. Sep 12 22:28:49.198400 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 22:28:49.198411 systemd-journald[1165]: Journal started Sep 12 22:28:49.198431 systemd-journald[1165]: Runtime Journal (/run/log/journal/be729ef74a6a4c7c9cd18e23a7957fe6) is 6M, max 48.5M, 42.4M free. 
Sep 12 22:28:48.976727 systemd[1]: Queued start job for default target multi-user.target. Sep 12 22:28:48.996659 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 12 22:28:48.997077 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 22:28:49.200386 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 22:28:49.201003 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 22:28:49.202253 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 22:28:49.204703 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 22:28:49.206116 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:28:49.207620 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 22:28:49.207801 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 22:28:49.209208 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 22:28:49.209358 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 22:28:49.210789 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 22:28:49.210935 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 22:28:49.212231 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 22:28:49.212371 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 22:28:49.213874 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 22:28:49.214037 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 12 22:28:49.215507 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 22:28:49.215665 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 22:28:49.217047 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Sep 12 22:28:49.218644 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:28:49.220313 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 22:28:49.221921 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 12 22:28:49.233162 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:28:49.236175 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 22:28:49.238421 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 22:28:49.240494 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 12 22:28:49.241843 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 22:28:49.241880 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 22:28:49.243745 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 12 22:28:49.251422 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 22:28:49.252654 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 22:28:49.253649 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 22:28:49.255527 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 12 22:28:49.256886 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 22:28:49.258813 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
Sep 12 22:28:49.260057 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 22:28:49.265493 systemd-journald[1165]: Time spent on flushing to /var/log/journal/be729ef74a6a4c7c9cd18e23a7957fe6 is 21.263ms for 883 entries. Sep 12 22:28:49.265493 systemd-journald[1165]: System Journal (/var/log/journal/be729ef74a6a4c7c9cd18e23a7957fe6) is 8M, max 195.6M, 187.6M free. Sep 12 22:28:49.291393 systemd-journald[1165]: Received client request to flush runtime journal. Sep 12 22:28:49.291443 kernel: loop0: detected capacity change from 0 to 203944 Sep 12 22:28:49.262816 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 22:28:49.267773 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 22:28:49.271874 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 22:28:49.275307 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 22:28:49.276657 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 12 22:28:49.282817 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 22:28:49.284508 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 22:28:49.292373 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 22:28:49.300725 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 22:28:49.301812 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 22:28:49.304336 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:28:49.312923 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 22:28:49.313534 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. 
Sep 12 22:28:49.315699 kernel: loop1: detected capacity change from 0 to 119368 Sep 12 22:28:49.319733 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 22:28:49.322350 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 22:28:49.354697 kernel: loop2: detected capacity change from 0 to 100632 Sep 12 22:28:49.356744 systemd-tmpfiles[1223]: ACLs are not supported, ignoring. Sep 12 22:28:49.357009 systemd-tmpfiles[1223]: ACLs are not supported, ignoring. Sep 12 22:28:49.360550 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:28:49.382774 kernel: loop3: detected capacity change from 0 to 203944 Sep 12 22:28:49.388953 kernel: loop4: detected capacity change from 0 to 119368 Sep 12 22:28:49.394693 kernel: loop5: detected capacity change from 0 to 100632 Sep 12 22:28:49.398867 (sd-merge)[1229]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Sep 12 22:28:49.399311 (sd-merge)[1229]: Merged extensions into '/usr'. Sep 12 22:28:49.404170 systemd[1]: Reload requested from client PID 1207 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 22:28:49.404190 systemd[1]: Reloading... Sep 12 22:28:49.460384 zram_generator::config[1255]: No configuration found. Sep 12 22:28:49.509262 ldconfig[1202]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 22:28:49.598377 systemd[1]: Reloading finished in 193 ms. Sep 12 22:28:49.622241 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 22:28:49.623878 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 22:28:49.644060 systemd[1]: Starting ensure-sysext.service... Sep 12 22:28:49.647814 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Sep 12 22:28:49.655880 systemd[1]: Reload requested from client PID 1289 ('systemctl') (unit ensure-sysext.service)... Sep 12 22:28:49.655896 systemd[1]: Reloading... Sep 12 22:28:49.661025 systemd-tmpfiles[1290]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 22:28:49.661073 systemd-tmpfiles[1290]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 22:28:49.661301 systemd-tmpfiles[1290]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 22:28:49.661483 systemd-tmpfiles[1290]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 22:28:49.662123 systemd-tmpfiles[1290]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 22:28:49.662334 systemd-tmpfiles[1290]: ACLs are not supported, ignoring. Sep 12 22:28:49.662380 systemd-tmpfiles[1290]: ACLs are not supported, ignoring. Sep 12 22:28:49.665234 systemd-tmpfiles[1290]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 22:28:49.665247 systemd-tmpfiles[1290]: Skipping /boot Sep 12 22:28:49.671300 systemd-tmpfiles[1290]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 22:28:49.671315 systemd-tmpfiles[1290]: Skipping /boot Sep 12 22:28:49.708719 zram_generator::config[1317]: No configuration found. Sep 12 22:28:49.835092 systemd[1]: Reloading finished in 178 ms. Sep 12 22:28:49.855041 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 22:28:49.861714 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:28:49.874635 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 22:28:49.876957 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... 
Sep 12 22:28:49.879159 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 22:28:49.881855 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 22:28:49.886136 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:28:49.889651 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 22:28:49.897301 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 22:28:49.900524 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 22:28:49.901600 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 22:28:49.905942 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 22:28:49.908471 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 22:28:49.909617 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 22:28:49.909739 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:28:49.912728 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 22:28:49.914516 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 22:28:49.914762 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 22:28:49.921315 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 22:28:49.921481 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 22:28:49.925792 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Sep 12 22:28:49.926032 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 22:28:49.927121 systemd-udevd[1358]: Using default interface naming scheme 'v255'. Sep 12 22:28:49.931210 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 22:28:49.933262 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 22:28:49.938177 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 22:28:49.939374 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 22:28:49.939990 augenrules[1388]: No rules Sep 12 22:28:49.943807 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 22:28:49.955249 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 22:28:49.959704 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 22:28:49.960767 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 22:28:49.960813 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:28:49.962760 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 22:28:49.963808 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 22:28:49.964339 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:28:49.966217 systemd[1]: Started systemd-userdbd.service - User Database Manager. 
Sep 12 22:28:49.969447 systemd[1]: Finished ensure-sysext.service. Sep 12 22:28:49.970576 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 22:28:49.971371 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 22:28:49.973199 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 22:28:49.973357 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 22:28:49.975657 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 22:28:49.975820 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 22:28:49.977090 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 22:28:49.977229 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 22:28:49.980136 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 22:28:50.000720 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 22:28:50.002276 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 22:28:50.004461 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 12 22:28:50.008128 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 22:28:50.008857 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 22:28:50.011042 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Sep 12 22:28:50.011795 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 22:28:50.066472 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 12 22:28:50.069345 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
Sep 12 22:28:50.098664 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 22:28:50.115485 systemd-networkd[1437]: lo: Link UP Sep 12 22:28:50.115489 systemd-networkd[1437]: lo: Gained carrier Sep 12 22:28:50.116292 systemd-networkd[1437]: Enumeration completed Sep 12 22:28:50.116402 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 22:28:50.116709 systemd-networkd[1437]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:28:50.116718 systemd-networkd[1437]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 22:28:50.117211 systemd-networkd[1437]: eth0: Link UP Sep 12 22:28:50.117314 systemd-networkd[1437]: eth0: Gained carrier Sep 12 22:28:50.117331 systemd-networkd[1437]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:28:50.121131 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 12 22:28:50.123483 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 22:28:50.130731 systemd-networkd[1437]: eth0: DHCPv4 address 10.0.0.148/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 12 22:28:50.134345 systemd-resolved[1356]: Positive Trust Anchors: Sep 12 22:28:50.134361 systemd-resolved[1356]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 22:28:50.134392 systemd-resolved[1356]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 22:28:50.141385 systemd-resolved[1356]: Defaulting to hostname 'linux'. Sep 12 22:28:50.142982 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 22:28:50.145105 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 22:28:50.147937 systemd[1]: Reached target network.target - Network. Sep 12 22:28:50.149804 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:28:50.164003 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 12 22:28:50.165486 systemd-timesyncd[1439]: Contacted time server 10.0.0.1:123 (10.0.0.1). Sep 12 22:28:50.165534 systemd-timesyncd[1439]: Initial clock synchronization to Fri 2025-09-12 22:28:49.864149 UTC. Sep 12 22:28:50.165581 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 22:28:50.166775 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 22:28:50.168844 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 22:28:50.170122 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. 
Sep 12 22:28:50.171378 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 12 22:28:50.171414 systemd[1]: Reached target paths.target - Path Units.
Sep 12 22:28:50.172361 systemd[1]: Reached target time-set.target - System Time Set.
Sep 12 22:28:50.173521 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 12 22:28:50.175098 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 12 22:28:50.176487 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 22:28:50.178453 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 12 22:28:50.181335 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 12 22:28:50.184870 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 12 22:28:50.186447 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 12 22:28:50.187834 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 12 22:28:50.193353 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 12 22:28:50.194727 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 12 22:28:50.196370 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 12 22:28:50.203703 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 22:28:50.204657 systemd[1]: Reached target basic.target - Basic System.
Sep 12 22:28:50.205619 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 12 22:28:50.205650 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 12 22:28:50.206490 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 12 22:28:50.208459 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 12 22:28:50.222202 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 12 22:28:50.224424 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 12 22:28:50.226615 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 12 22:28:50.227690 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 12 22:28:50.228661 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 12 22:28:50.230949 jq[1474]: false
Sep 12 22:28:50.232779 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 12 22:28:50.234879 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 12 22:28:50.236999 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 12 22:28:50.241279 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 12 22:28:50.242114 extend-filesystems[1475]: Found /dev/vda6
Sep 12 22:28:50.243735 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 22:28:50.245801 extend-filesystems[1475]: Found /dev/vda9
Sep 12 22:28:50.245647 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 12 22:28:50.246041 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 12 22:28:50.247357 extend-filesystems[1475]: Checking size of /dev/vda9
Sep 12 22:28:50.248907 systemd[1]: Starting update-engine.service - Update Engine...
Sep 12 22:28:50.250911 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 12 22:28:50.261684 extend-filesystems[1475]: Resized partition /dev/vda9
Sep 12 22:28:50.262858 extend-filesystems[1504]: resize2fs 1.47.3 (8-Jul-2025)
Sep 12 22:28:50.263981 jq[1498]: true
Sep 12 22:28:50.265863 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 12 22:28:50.267904 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 12 22:28:50.268136 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 12 22:28:50.268516 systemd[1]: motdgen.service: Deactivated successfully.
Sep 12 22:28:50.270025 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Sep 12 22:28:50.268896 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 12 22:28:50.278266 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 12 22:28:50.278522 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 12 22:28:50.305692 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Sep 12 22:28:50.305761 jq[1507]: true
Sep 12 22:28:50.307346 (ntainerd)[1508]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 12 22:28:50.322998 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 12 22:28:50.322795 dbus-daemon[1472]: [system] SELinux support is enabled
Sep 12 22:28:50.324412 update_engine[1494]: I20250912 22:28:50.316648 1494 main.cc:92] Flatcar Update Engine starting
Sep 12 22:28:50.324558 tar[1506]: linux-arm64/helm
Sep 12 22:28:50.324668 extend-filesystems[1504]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Sep 12 22:28:50.324668 extend-filesystems[1504]: old_desc_blocks = 1, new_desc_blocks = 1
Sep 12 22:28:50.324668 extend-filesystems[1504]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Sep 12 22:28:50.337913 extend-filesystems[1475]: Resized filesystem in /dev/vda9
Sep 12 22:28:50.341829 update_engine[1494]: I20250912 22:28:50.331637 1494 update_check_scheduler.cc:74] Next update check in 4m16s
Sep 12 22:28:50.326166 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 12 22:28:50.326369 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 12 22:28:50.334274 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 22:28:50.340247 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 12 22:28:50.340285 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 12 22:28:50.341611 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 12 22:28:50.341627 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 12 22:28:50.343120 systemd[1]: Started update-engine.service - Update Engine.
Sep 12 22:28:50.346277 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 12 22:28:50.346923 systemd-logind[1485]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 12 22:28:50.347128 systemd-logind[1485]: New seat seat0.
Sep 12 22:28:50.348218 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 12 22:28:50.353217 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 12 22:28:50.354764 bash[1537]: Updated "/home/core/.ssh/authorized_keys"
Sep 12 22:28:50.355372 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 12 22:28:50.405125 locksmithd[1540]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 12 22:28:50.474226 containerd[1508]: time="2025-09-12T22:28:50Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 12 22:28:50.474863 containerd[1508]: time="2025-09-12T22:28:50.474827760Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 12 22:28:50.484242 containerd[1508]: time="2025-09-12T22:28:50.484203480Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.4µs"
Sep 12 22:28:50.484242 containerd[1508]: time="2025-09-12T22:28:50.484237120Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 12 22:28:50.484319 containerd[1508]: time="2025-09-12T22:28:50.484254880Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 12 22:28:50.484407 containerd[1508]: time="2025-09-12T22:28:50.484386360Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 12 22:28:50.484436 containerd[1508]: time="2025-09-12T22:28:50.484406920Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 12 22:28:50.484474 containerd[1508]: time="2025-09-12T22:28:50.484435360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 12 22:28:50.484496 containerd[1508]: time="2025-09-12T22:28:50.484483320Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 12 22:28:50.484515 containerd[1508]: time="2025-09-12T22:28:50.484495120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 22:28:50.484726 containerd[1508]: time="2025-09-12T22:28:50.484703360Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 22:28:50.484726 containerd[1508]: time="2025-09-12T22:28:50.484724120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 12 22:28:50.484779 containerd[1508]: time="2025-09-12T22:28:50.484735880Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 12 22:28:50.484779 containerd[1508]: time="2025-09-12T22:28:50.484743800Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 12 22:28:50.484870 containerd[1508]: time="2025-09-12T22:28:50.484835520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 12 22:28:50.485059 containerd[1508]: time="2025-09-12T22:28:50.485038200Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 12 22:28:50.485107 containerd[1508]: time="2025-09-12T22:28:50.485081800Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 12 22:28:50.485107 containerd[1508]: time="2025-09-12T22:28:50.485093840Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 12 22:28:50.485149 containerd[1508]: time="2025-09-12T22:28:50.485114480Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 12 22:28:50.485406 containerd[1508]: time="2025-09-12T22:28:50.485308880Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 12 22:28:50.485406 containerd[1508]: time="2025-09-12T22:28:50.485371960Z" level=info msg="metadata content store policy set" policy=shared
Sep 12 22:28:50.488894 containerd[1508]: time="2025-09-12T22:28:50.488862400Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 12 22:28:50.488964 containerd[1508]: time="2025-09-12T22:28:50.488921920Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 12 22:28:50.488964 containerd[1508]: time="2025-09-12T22:28:50.488936200Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 12 22:28:50.488964 containerd[1508]: time="2025-09-12T22:28:50.488947200Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 12 22:28:50.488964 containerd[1508]: time="2025-09-12T22:28:50.488957960Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.488996600Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489012800Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489024560Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489041760Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489051760Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489068560Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489084520Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489191440Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489210880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489225680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489237280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489246920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489256880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489268600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 12 22:28:50.489720 containerd[1508]: time="2025-09-12T22:28:50.489280640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 12 22:28:50.489977 containerd[1508]: time="2025-09-12T22:28:50.489296760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 12 22:28:50.489977 containerd[1508]: time="2025-09-12T22:28:50.489306840Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 12 22:28:50.489977 containerd[1508]: time="2025-09-12T22:28:50.489316520Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 12 22:28:50.489977 containerd[1508]: time="2025-09-12T22:28:50.489492600Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 12 22:28:50.489977 containerd[1508]: time="2025-09-12T22:28:50.489511200Z" level=info msg="Start snapshots syncer"
Sep 12 22:28:50.489977 containerd[1508]: time="2025-09-12T22:28:50.489545320Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 12 22:28:50.490086 containerd[1508]: time="2025-09-12T22:28:50.489876520Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 12 22:28:50.490086 containerd[1508]: time="2025-09-12T22:28:50.489922560Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 12 22:28:50.490186 containerd[1508]: time="2025-09-12T22:28:50.489982040Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 12 22:28:50.490186 containerd[1508]: time="2025-09-12T22:28:50.490092560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 12 22:28:50.490186 containerd[1508]: time="2025-09-12T22:28:50.490115760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 12 22:28:50.490186 containerd[1508]: time="2025-09-12T22:28:50.490127240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 12 22:28:50.490186 containerd[1508]: time="2025-09-12T22:28:50.490139600Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 12 22:28:50.490186 containerd[1508]: time="2025-09-12T22:28:50.490153520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 12 22:28:50.490186 containerd[1508]: time="2025-09-12T22:28:50.490163960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 12 22:28:50.490186 containerd[1508]: time="2025-09-12T22:28:50.490174600Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 12 22:28:50.490317 containerd[1508]: time="2025-09-12T22:28:50.490196400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 12 22:28:50.490317 containerd[1508]: time="2025-09-12T22:28:50.490206800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 12 22:28:50.490317 containerd[1508]: time="2025-09-12T22:28:50.490216200Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 12 22:28:50.490317 containerd[1508]: time="2025-09-12T22:28:50.490248640Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 12 22:28:50.490317 containerd[1508]: time="2025-09-12T22:28:50.490260600Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 12 22:28:50.490317 containerd[1508]: time="2025-09-12T22:28:50.490268960Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 12 22:28:50.490317 containerd[1508]: time="2025-09-12T22:28:50.490277920Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 12 22:28:50.490317 containerd[1508]: time="2025-09-12T22:28:50.490285960Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 12 22:28:50.490317 containerd[1508]: time="2025-09-12T22:28:50.490299640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 12 22:28:50.490317 containerd[1508]: time="2025-09-12T22:28:50.490309760Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 12 22:28:50.490479 containerd[1508]: time="2025-09-12T22:28:50.490390600Z" level=info msg="runtime interface created"
Sep 12 22:28:50.490479 containerd[1508]: time="2025-09-12T22:28:50.490395960Z" level=info msg="created NRI interface"
Sep 12 22:28:50.490479 containerd[1508]: time="2025-09-12T22:28:50.490404320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 12 22:28:50.490479 containerd[1508]: time="2025-09-12T22:28:50.490414440Z" level=info msg="Connect containerd service"
Sep 12 22:28:50.490479 containerd[1508]: time="2025-09-12T22:28:50.490438640Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 12 22:28:50.491717 containerd[1508]: time="2025-09-12T22:28:50.491568680Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 12 22:28:50.558875 containerd[1508]: time="2025-09-12T22:28:50.558835600Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 12 22:28:50.559050 containerd[1508]: time="2025-09-12T22:28:50.558896720Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 12 22:28:50.559050 containerd[1508]: time="2025-09-12T22:28:50.558918000Z" level=info msg="Start subscribing containerd event"
Sep 12 22:28:50.559050 containerd[1508]: time="2025-09-12T22:28:50.558942040Z" level=info msg="Start recovering state"
Sep 12 22:28:50.559050 containerd[1508]: time="2025-09-12T22:28:50.559014080Z" level=info msg="Start event monitor"
Sep 12 22:28:50.559050 containerd[1508]: time="2025-09-12T22:28:50.559025440Z" level=info msg="Start cni network conf syncer for default"
Sep 12 22:28:50.559050 containerd[1508]: time="2025-09-12T22:28:50.559032760Z" level=info msg="Start streaming server"
Sep 12 22:28:50.559050 containerd[1508]: time="2025-09-12T22:28:50.559040040Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 12 22:28:50.559050 containerd[1508]: time="2025-09-12T22:28:50.559046440Z" level=info msg="runtime interface starting up..."
Sep 12 22:28:50.559050 containerd[1508]: time="2025-09-12T22:28:50.559052360Z" level=info msg="starting plugins..."
Sep 12 22:28:50.559050 containerd[1508]: time="2025-09-12T22:28:50.559076000Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 12 22:28:50.559519 containerd[1508]: time="2025-09-12T22:28:50.559180560Z" level=info msg="containerd successfully booted in 0.086479s"
Sep 12 22:28:50.559291 systemd[1]: Started containerd.service - containerd container runtime.
Sep 12 22:28:50.602497 tar[1506]: linux-arm64/LICENSE
Sep 12 22:28:50.602582 tar[1506]: linux-arm64/README.md
Sep 12 22:28:50.621747 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 12 22:28:51.077214 sshd_keygen[1496]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 12 22:28:51.095560 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 12 22:28:51.098962 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 12 22:28:51.118627 systemd[1]: issuegen.service: Deactivated successfully.
Sep 12 22:28:51.118838 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 12 22:28:51.121816 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 12 22:28:51.130658 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 12 22:28:51.133659 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 12 22:28:51.136261 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Sep 12 22:28:51.137593 systemd[1]: Reached target getty.target - Login Prompts.
Sep 12 22:28:51.975818 systemd-networkd[1437]: eth0: Gained IPv6LL
Sep 12 22:28:51.978233 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 12 22:28:51.979967 systemd[1]: Reached target network-online.target - Network is Online.
Sep 12 22:28:51.982260 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
Sep 12 22:28:51.984632 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:28:51.992181 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 12 22:28:52.017932 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 12 22:28:52.020285 systemd[1]: coreos-metadata.service: Deactivated successfully.
Sep 12 22:28:52.020659 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
Sep 12 22:28:52.022518 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 12 22:28:52.503400 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:28:52.504972 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 12 22:28:52.507952 (kubelet)[1611]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:28:52.509795 systemd[1]: Startup finished in 2.039s (kernel) + 5.009s (initrd) + 3.948s (userspace) = 10.997s.
Sep 12 22:28:52.847386 kubelet[1611]: E0912 22:28:52.847259 1611 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:28:52.849418 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:28:52.849546 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:28:52.850074 systemd[1]: kubelet.service: Consumed 758ms CPU time, 256.3M memory peak.
Sep 12 22:28:55.888530 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 12 22:28:55.889973 systemd[1]: Started sshd@0-10.0.0.148:22-10.0.0.1:43812.service - OpenSSH per-connection server daemon (10.0.0.1:43812).
Sep 12 22:28:55.945382 sshd[1625]: Accepted publickey for core from 10.0.0.1 port 43812 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:28:55.946860 sshd-session[1625]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:28:55.952156 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 12 22:28:55.952955 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 12 22:28:55.959428 systemd-logind[1485]: New session 1 of user core.
Sep 12 22:28:55.970049 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 12 22:28:55.972248 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 12 22:28:55.987605 (systemd)[1630]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 12 22:28:55.989893 systemd-logind[1485]: New session c1 of user core.
Sep 12 22:28:56.092180 systemd[1630]: Queued start job for default target default.target.
Sep 12 22:28:56.114618 systemd[1630]: Created slice app.slice - User Application Slice.
Sep 12 22:28:56.114644 systemd[1630]: Reached target paths.target - Paths.
Sep 12 22:28:56.114708 systemd[1630]: Reached target timers.target - Timers.
Sep 12 22:28:56.115860 systemd[1630]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 12 22:28:56.124929 systemd[1630]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 12 22:28:56.124986 systemd[1630]: Reached target sockets.target - Sockets.
Sep 12 22:28:56.125021 systemd[1630]: Reached target basic.target - Basic System.
Sep 12 22:28:56.125061 systemd[1630]: Reached target default.target - Main User Target.
Sep 12 22:28:56.125090 systemd[1630]: Startup finished in 130ms.
Sep 12 22:28:56.125243 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 12 22:28:56.126817 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 12 22:28:56.187300 systemd[1]: Started sshd@1-10.0.0.148:22-10.0.0.1:43820.service - OpenSSH per-connection server daemon (10.0.0.1:43820).
Sep 12 22:28:56.231040 sshd[1642]: Accepted publickey for core from 10.0.0.1 port 43820 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:28:56.232162 sshd-session[1642]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:28:56.236200 systemd-logind[1485]: New session 2 of user core.
Sep 12 22:28:56.241880 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 12 22:28:56.291148 sshd[1645]: Connection closed by 10.0.0.1 port 43820
Sep 12 22:28:56.291501 sshd-session[1642]: pam_unix(sshd:session): session closed for user core
Sep 12 22:28:56.305514 systemd[1]: sshd@1-10.0.0.148:22-10.0.0.1:43820.service: Deactivated successfully.
Sep 12 22:28:56.307199 systemd[1]: session-2.scope: Deactivated successfully.
Sep 12 22:28:56.309186 systemd-logind[1485]: Session 2 logged out. Waiting for processes to exit.
Sep 12 22:28:56.311866 systemd[1]: Started sshd@2-10.0.0.148:22-10.0.0.1:43830.service - OpenSSH per-connection server daemon (10.0.0.1:43830).
Sep 12 22:28:56.312481 systemd-logind[1485]: Removed session 2.
Sep 12 22:28:56.355233 sshd[1651]: Accepted publickey for core from 10.0.0.1 port 43830 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:28:56.356232 sshd-session[1651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:28:56.360549 systemd-logind[1485]: New session 3 of user core.
Sep 12 22:28:56.365791 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 12 22:28:56.413685 sshd[1654]: Connection closed by 10.0.0.1 port 43830
Sep 12 22:28:56.414097 sshd-session[1651]: pam_unix(sshd:session): session closed for user core
Sep 12 22:28:56.433451 systemd[1]: sshd@2-10.0.0.148:22-10.0.0.1:43830.service: Deactivated successfully.
Sep 12 22:28:56.434824 systemd[1]: session-3.scope: Deactivated successfully.
Sep 12 22:28:56.436253 systemd-logind[1485]: Session 3 logged out. Waiting for processes to exit.
Sep 12 22:28:56.438221 systemd[1]: Started sshd@3-10.0.0.148:22-10.0.0.1:43844.service - OpenSSH per-connection server daemon (10.0.0.1:43844).
Sep 12 22:28:56.438987 systemd-logind[1485]: Removed session 3.
Sep 12 22:28:56.500474 sshd[1660]: Accepted publickey for core from 10.0.0.1 port 43844 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:28:56.501524 sshd-session[1660]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:28:56.504937 systemd-logind[1485]: New session 4 of user core.
Sep 12 22:28:56.523808 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 12 22:28:56.573038 sshd[1663]: Connection closed by 10.0.0.1 port 43844
Sep 12 22:28:56.573839 sshd-session[1660]: pam_unix(sshd:session): session closed for user core
Sep 12 22:28:56.583472 systemd[1]: sshd@3-10.0.0.148:22-10.0.0.1:43844.service: Deactivated successfully.
Sep 12 22:28:56.585841 systemd[1]: session-4.scope: Deactivated successfully.
Sep 12 22:28:56.586495 systemd-logind[1485]: Session 4 logged out. Waiting for processes to exit.
Sep 12 22:28:56.588108 systemd[1]: Started sshd@4-10.0.0.148:22-10.0.0.1:43852.service - OpenSSH per-connection server daemon (10.0.0.1:43852).
Sep 12 22:28:56.588905 systemd-logind[1485]: Removed session 4.
Sep 12 22:28:56.643086 sshd[1669]: Accepted publickey for core from 10.0.0.1 port 43852 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:28:56.644200 sshd-session[1669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:28:56.647727 systemd-logind[1485]: New session 5 of user core.
Sep 12 22:28:56.659809 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 12 22:28:56.714927 sudo[1673]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 12 22:28:56.715181 sudo[1673]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 22:28:56.728698 sudo[1673]: pam_unix(sudo:session): session closed for user root
Sep 12 22:28:56.729952 sshd[1672]: Connection closed by 10.0.0.1 port 43852
Sep 12 22:28:56.730571 sshd-session[1669]: pam_unix(sshd:session): session closed for user core
Sep 12 22:28:56.743415 systemd[1]: sshd@4-10.0.0.148:22-10.0.0.1:43852.service: Deactivated successfully.
Sep 12 22:28:56.745765 systemd[1]: session-5.scope: Deactivated successfully.
Sep 12 22:28:56.746368 systemd-logind[1485]: Session 5 logged out. Waiting for processes to exit.
Sep 12 22:28:56.748301 systemd[1]: Started sshd@5-10.0.0.148:22-10.0.0.1:43858.service - OpenSSH per-connection server daemon (10.0.0.1:43858).
Sep 12 22:28:56.749004 systemd-logind[1485]: Removed session 5.
Sep 12 22:28:56.807656 sshd[1679]: Accepted publickey for core from 10.0.0.1 port 43858 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:28:56.808838 sshd-session[1679]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:28:56.812166 systemd-logind[1485]: New session 6 of user core.
Sep 12 22:28:56.827855 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 12 22:28:56.876503 sudo[1685]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 12 22:28:56.876761 sudo[1685]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 22:28:56.881053 sudo[1685]: pam_unix(sudo:session): session closed for user root
Sep 12 22:28:56.885201 sudo[1684]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 12 22:28:56.885433 sudo[1684]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 22:28:56.893446 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 22:28:56.927803 augenrules[1707]: No rules
Sep 12 22:28:56.928838 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 22:28:56.929755 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 22:28:56.930512 sudo[1684]: pam_unix(sudo:session): session closed for user root
Sep 12 22:28:56.931707 sshd[1683]: Connection closed by 10.0.0.1 port 43858
Sep 12 22:28:56.932043 sshd-session[1679]: pam_unix(sshd:session): session closed for user core
Sep 12 22:28:56.937426 systemd[1]: sshd@5-10.0.0.148:22-10.0.0.1:43858.service: Deactivated successfully.
Sep 12 22:28:56.939825 systemd[1]: session-6.scope: Deactivated successfully.
Sep 12 22:28:56.940949 systemd-logind[1485]: Session 6 logged out. Waiting for processes to exit.
Sep 12 22:28:56.942390 systemd[1]: Started sshd@6-10.0.0.148:22-10.0.0.1:43864.service - OpenSSH per-connection server daemon (10.0.0.1:43864).
Sep 12 22:28:56.943132 systemd-logind[1485]: Removed session 6.
Sep 12 22:28:56.993466 sshd[1716]: Accepted publickey for core from 10.0.0.1 port 43864 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:28:56.994465 sshd-session[1716]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:28:56.997712 systemd-logind[1485]: New session 7 of user core.
Sep 12 22:28:57.010830 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 12 22:28:57.059796 sudo[1720]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 12 22:28:57.060048 sudo[1720]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 22:28:57.313059 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 12 22:28:57.326942 (dockerd)[1740]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 12 22:28:57.517917 dockerd[1740]: time="2025-09-12T22:28:57.517859369Z" level=info msg="Starting up"
Sep 12 22:28:57.519055 dockerd[1740]: time="2025-09-12T22:28:57.518977545Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 12 22:28:57.528650 dockerd[1740]: time="2025-09-12T22:28:57.528619383Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 12 22:28:57.628509 dockerd[1740]: time="2025-09-12T22:28:57.628408136Z" level=info msg="Loading containers: start."
Sep 12 22:28:57.636688 kernel: Initializing XFRM netlink socket
Sep 12 22:28:57.834498 systemd-networkd[1437]: docker0: Link UP
Sep 12 22:28:57.839886 dockerd[1740]: time="2025-09-12T22:28:57.839807599Z" level=info msg="Loading containers: done."
Sep 12 22:28:57.856726 dockerd[1740]: time="2025-09-12T22:28:57.856678868Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 12 22:28:57.856862 dockerd[1740]: time="2025-09-12T22:28:57.856760544Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 12 22:28:57.856862 dockerd[1740]: time="2025-09-12T22:28:57.856839899Z" level=info msg="Initializing buildkit"
Sep 12 22:28:57.877652 dockerd[1740]: time="2025-09-12T22:28:57.877587330Z" level=info msg="Completed buildkit initialization"
Sep 12 22:28:57.882032 dockerd[1740]: time="2025-09-12T22:28:57.881940333Z" level=info msg="Daemon has completed initialization"
Sep 12 22:28:57.882118 dockerd[1740]: time="2025-09-12T22:28:57.881980679Z" level=info msg="API listen on /run/docker.sock"
Sep 12 22:28:57.882226 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 12 22:28:58.408650 containerd[1508]: time="2025-09-12T22:28:58.408609097Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\""
Sep 12 22:28:58.970008 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2503282861.mount: Deactivated successfully.
Sep 12 22:28:59.796797 containerd[1508]: time="2025-09-12T22:28:59.796750100Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:28:59.797750 containerd[1508]: time="2025-09-12T22:28:59.797719472Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=25687327"
Sep 12 22:28:59.798813 containerd[1508]: time="2025-09-12T22:28:59.798470901Z" level=info msg="ImageCreate event name:\"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:28:59.800827 containerd[1508]: time="2025-09-12T22:28:59.800804350Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:28:59.802488 containerd[1508]: time="2025-09-12T22:28:59.802464664Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"25683924\" in 1.393814644s"
Sep 12 22:28:59.802530 containerd[1508]: time="2025-09-12T22:28:59.802498105Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\""
Sep 12 22:28:59.803662 containerd[1508]: time="2025-09-12T22:28:59.803635869Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\""
Sep 12 22:29:00.769336 containerd[1508]: time="2025-09-12T22:29:00.769285572Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:00.770050 containerd[1508]: time="2025-09-12T22:29:00.770017114Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=22459769"
Sep 12 22:29:00.770658 containerd[1508]: time="2025-09-12T22:29:00.770604705Z" level=info msg="ImageCreate event name:\"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:00.773782 containerd[1508]: time="2025-09-12T22:29:00.773733948Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:00.775603 containerd[1508]: time="2025-09-12T22:29:00.775565967Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"24028542\" in 971.901027ms"
Sep 12 22:29:00.775708 containerd[1508]: time="2025-09-12T22:29:00.775691846Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\""
Sep 12 22:29:00.776190 containerd[1508]: time="2025-09-12T22:29:00.776164038Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\""
Sep 12 22:29:01.756513 containerd[1508]: time="2025-09-12T22:29:01.756460977Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:01.758606 containerd[1508]: time="2025-09-12T22:29:01.758554191Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=17127508"
Sep 12 22:29:01.759864 containerd[1508]: time="2025-09-12T22:29:01.759836480Z" level=info msg="ImageCreate event name:\"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:01.762768 containerd[1508]: time="2025-09-12T22:29:01.762727115Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:01.763658 containerd[1508]: time="2025-09-12T22:29:01.763628852Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"18696299\" in 987.433719ms"
Sep 12 22:29:01.763741 containerd[1508]: time="2025-09-12T22:29:01.763660575Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\""
Sep 12 22:29:01.764315 containerd[1508]: time="2025-09-12T22:29:01.764291858Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\""
Sep 12 22:29:02.713660 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount885035859.mount: Deactivated successfully.
Sep 12 22:29:02.982065 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 12 22:29:02.983427 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:29:03.137119 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:29:03.140795 (kubelet)[2043]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:29:03.143323 containerd[1508]: time="2025-09-12T22:29:03.143266578Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:03.143934 containerd[1508]: time="2025-09-12T22:29:03.143881041Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=26954909"
Sep 12 22:29:03.144727 containerd[1508]: time="2025-09-12T22:29:03.144696608Z" level=info msg="ImageCreate event name:\"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:03.146852 containerd[1508]: time="2025-09-12T22:29:03.146821169Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:03.147432 containerd[1508]: time="2025-09-12T22:29:03.147368982Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"26953926\" in 1.382964671s"
Sep 12 22:29:03.147432 containerd[1508]: time="2025-09-12T22:29:03.147429520Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\""
Sep 12 22:29:03.148003 containerd[1508]: time="2025-09-12T22:29:03.147978206Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 12 22:29:03.186244 kubelet[2043]: E0912 22:29:03.186186 2043 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:29:03.189551 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:29:03.189715 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:29:03.190078 systemd[1]: kubelet.service: Consumed 153ms CPU time, 108.9M memory peak.
Sep 12 22:29:03.628377 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3565868673.mount: Deactivated successfully.
Sep 12 22:29:04.232576 containerd[1508]: time="2025-09-12T22:29:04.232519452Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:04.233896 containerd[1508]: time="2025-09-12T22:29:04.233864783Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951624"
Sep 12 22:29:04.234992 containerd[1508]: time="2025-09-12T22:29:04.234948026Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:04.238406 containerd[1508]: time="2025-09-12T22:29:04.238360187Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:04.240031 containerd[1508]: time="2025-09-12T22:29:04.239921913Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.091904988s"
Sep 12 22:29:04.240031 containerd[1508]: time="2025-09-12T22:29:04.239977064Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 12 22:29:04.240465 containerd[1508]: time="2025-09-12T22:29:04.240388078Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 12 22:29:04.714556 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1765462764.mount: Deactivated successfully.
Sep 12 22:29:04.720027 containerd[1508]: time="2025-09-12T22:29:04.719990830Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 22:29:04.720573 containerd[1508]: time="2025-09-12T22:29:04.720548742Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705"
Sep 12 22:29:04.721575 containerd[1508]: time="2025-09-12T22:29:04.721532926Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 22:29:04.723829 containerd[1508]: time="2025-09-12T22:29:04.723784363Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 22:29:04.725208 containerd[1508]: time="2025-09-12T22:29:04.725105417Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 484.541672ms"
Sep 12 22:29:04.725208 containerd[1508]: time="2025-09-12T22:29:04.725132794Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 12 22:29:04.725587 containerd[1508]: time="2025-09-12T22:29:04.725544563Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 12 22:29:05.187548 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount825494976.mount: Deactivated successfully.
Sep 12 22:29:06.696639 containerd[1508]: time="2025-09-12T22:29:06.696567985Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:06.697574 containerd[1508]: time="2025-09-12T22:29:06.697532339Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66537163"
Sep 12 22:29:06.697982 containerd[1508]: time="2025-09-12T22:29:06.697958244Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:06.701308 containerd[1508]: time="2025-09-12T22:29:06.701272437Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:29:06.701960 containerd[1508]: time="2025-09-12T22:29:06.701923034Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 1.97622302s"
Sep 12 22:29:06.702006 containerd[1508]: time="2025-09-12T22:29:06.701964025Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\""
Sep 12 22:29:13.232243 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 12 22:29:13.233611 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:29:13.392906 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:29:13.396812 (kubelet)[2192]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:29:13.434272 kubelet[2192]: E0912 22:29:13.434232 2192 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:29:13.436523 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:29:13.436646 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:29:13.437094 systemd[1]: kubelet.service: Consumed 131ms CPU time, 107M memory peak.
Sep 12 22:29:13.541004 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:29:13.541142 systemd[1]: kubelet.service: Consumed 131ms CPU time, 107M memory peak.
Sep 12 22:29:13.543499 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:29:13.565888 systemd[1]: Reload requested from client PID 2208 ('systemctl') (unit session-7.scope)...
Sep 12 22:29:13.565900 systemd[1]: Reloading...
Sep 12 22:29:13.633712 zram_generator::config[2252]: No configuration found.
Sep 12 22:29:13.867183 systemd[1]: Reloading finished in 300 ms.
Sep 12 22:29:13.909027 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:29:13.911633 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:29:13.912352 systemd[1]: kubelet.service: Deactivated successfully.
Sep 12 22:29:13.913709 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:29:13.913744 systemd[1]: kubelet.service: Consumed 91ms CPU time, 95.1M memory peak.
Sep 12 22:29:13.914961 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:29:14.022060 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:29:14.026604 (kubelet)[2298]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 22:29:14.060417 kubelet[2298]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 22:29:14.060748 kubelet[2298]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 12 22:29:14.060795 kubelet[2298]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 22:29:14.060962 kubelet[2298]: I0912 22:29:14.060928 2298 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 22:29:14.733701 kubelet[2298]: I0912 22:29:14.732908 2298 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 12 22:29:14.733701 kubelet[2298]: I0912 22:29:14.732940 2298 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 22:29:14.733701 kubelet[2298]: I0912 22:29:14.733216 2298 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 12 22:29:14.749156 kubelet[2298]: E0912 22:29:14.749117 2298 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.148:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.148:6443: connect: connection refused" logger="UnhandledError"
Sep 12 22:29:14.750051 kubelet[2298]: I0912 22:29:14.750019 2298 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 22:29:14.757808 kubelet[2298]: I0912 22:29:14.757776 2298 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 12 22:29:14.761272 kubelet[2298]: I0912 22:29:14.761249 2298 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 22:29:14.762041 kubelet[2298]: I0912 22:29:14.762021 2298 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 12 22:29:14.762182 kubelet[2298]: I0912 22:29:14.762154 2298 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 22:29:14.762405 kubelet[2298]: I0912 22:29:14.762189 2298 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 22:29:14.762489 kubelet[2298]: I0912 22:29:14.762470 2298 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 22:29:14.762489 kubelet[2298]: I0912 22:29:14.762479 2298 container_manager_linux.go:300] "Creating device plugin manager"
Sep 12 22:29:14.762754 kubelet[2298]: I0912 22:29:14.762741 2298 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 22:29:14.764708 kubelet[2298]: I0912 22:29:14.764636 2298 kubelet.go:408] "Attempting to sync node with API server"
Sep 12 22:29:14.764708 kubelet[2298]: I0912 22:29:14.764666 2298 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 22:29:14.764708 kubelet[2298]: I0912 22:29:14.764696 2298 kubelet.go:314] "Adding apiserver pod source"
Sep 12 22:29:14.764708 kubelet[2298]: I0912 22:29:14.764706 2298 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 22:29:14.768133 kubelet[2298]: I0912 22:29:14.768110 2298 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 12 22:29:14.769700 kubelet[2298]: I0912 22:29:14.769050 2298 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 12 22:29:14.769700 kubelet[2298]: W0912 22:29:14.769367 2298 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 12 22:29:14.769700 kubelet[2298]: W0912 22:29:14.769569 2298 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.148:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.148:6443: connect: connection refused
Sep 12 22:29:14.769700 kubelet[2298]: W0912 22:29:14.769594 2298 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.148:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.148:6443: connect: connection refused
Sep 12 22:29:14.769700 kubelet[2298]: E0912 22:29:14.769632 2298 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.148:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.148:6443: connect: connection refused" logger="UnhandledError"
Sep 12 22:29:14.769700 kubelet[2298]: E0912 22:29:14.769645 2298 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.148:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.148:6443: connect: connection refused" logger="UnhandledError"
Sep 12 22:29:14.770379 kubelet[2298]: I0912 22:29:14.770359 2298 server.go:1274] "Started kubelet"
Sep 12 22:29:14.771382 kubelet[2298]: I0912 22:29:14.771327 2298 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 22:29:14.772420 kubelet[2298]: I0912 22:29:14.772383 2298 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 22:29:14.772569 kubelet[2298]: I0912 22:29:14.772541 2298 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 22:29:14.772867 kubelet[2298]: I0912 22:29:14.772851 2298 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 22:29:14.773100 kubelet[2298]: I0912 22:29:14.773080 2298 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 22:29:14.773624 kubelet[2298]: I0912 22:29:14.773595 2298 server.go:449] "Adding debug handlers to kubelet server"
Sep 12 22:29:14.773834 kubelet[2298]: E0912 22:29:14.772883 2298 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.148:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.148:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1864a987ad0d233d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-12 22:29:14.770334525 +0000 UTC m=+0.740264511,LastTimestamp:2025-09-12 22:29:14.770334525 +0000 UTC m=+0.740264511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 12 22:29:14.774267 kubelet[2298]: E0912 22:29:14.774245 2298 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 12 22:29:14.774316 kubelet[2298]: I0912 22:29:14.774280 2298 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 12 22:29:14.774658 kubelet[2298]: I0912 22:29:14.774429 2298 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 12 22:29:14.774658 kubelet[2298]: I0912 22:29:14.774491 2298 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 22:29:14.774939 kubelet[2298]: W0912 22:29:14.774884 2298 reflector.go:561]
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.148:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.148:6443: connect: connection refused Sep 12 22:29:14.774996 kubelet[2298]: E0912 22:29:14.774941 2298 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.148:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.148:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:29:14.775163 kubelet[2298]: I0912 22:29:14.775142 2298 factory.go:221] Registration of the systemd container factory successfully Sep 12 22:29:14.775163 kubelet[2298]: E0912 22:29:14.775136 2298 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.148:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.148:6443: connect: connection refused" interval="200ms" Sep 12 22:29:14.775223 kubelet[2298]: I0912 22:29:14.775213 2298 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 22:29:14.776279 kubelet[2298]: I0912 22:29:14.776248 2298 factory.go:221] Registration of the containerd container factory successfully Sep 12 22:29:14.776434 kubelet[2298]: E0912 22:29:14.776415 2298 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 22:29:14.787977 kubelet[2298]: I0912 22:29:14.787925 2298 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 22:29:14.789173 kubelet[2298]: I0912 22:29:14.789137 2298 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 12 22:29:14.789173 kubelet[2298]: I0912 22:29:14.789167 2298 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 22:29:14.789264 kubelet[2298]: I0912 22:29:14.789185 2298 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 22:29:14.789264 kubelet[2298]: E0912 22:29:14.789227 2298 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 22:29:14.790395 kubelet[2298]: W0912 22:29:14.790185 2298 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.148:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.148:6443: connect: connection refused Sep 12 22:29:14.790395 kubelet[2298]: E0912 22:29:14.790228 2298 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.148:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.148:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:29:14.791434 kubelet[2298]: I0912 22:29:14.791180 2298 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 22:29:14.791434 kubelet[2298]: I0912 22:29:14.791198 2298 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 22:29:14.791434 kubelet[2298]: I0912 22:29:14.791215 2298 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:29:14.869187 kubelet[2298]: I0912 22:29:14.869155 2298 policy_none.go:49] "None policy: Start" Sep 12 22:29:14.870077 kubelet[2298]: I0912 22:29:14.870053 2298 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 22:29:14.870077 kubelet[2298]: I0912 22:29:14.870081 2298 state_mem.go:35] "Initializing new in-memory state store" Sep 12 22:29:14.874870 kubelet[2298]: E0912 22:29:14.874836 2298 
kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 22:29:14.877746 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 22:29:14.889608 kubelet[2298]: E0912 22:29:14.889571 2298 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 12 22:29:14.904743 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 22:29:14.908057 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 22:29:14.924465 kubelet[2298]: I0912 22:29:14.924417 2298 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 22:29:14.924726 kubelet[2298]: I0912 22:29:14.924622 2298 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 22:29:14.924726 kubelet[2298]: I0912 22:29:14.924646 2298 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 22:29:14.924882 kubelet[2298]: I0912 22:29:14.924856 2298 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 22:29:14.926563 kubelet[2298]: E0912 22:29:14.926521 2298 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 12 22:29:14.976176 kubelet[2298]: E0912 22:29:14.976112 2298 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.148:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.148:6443: connect: connection refused" interval="400ms" Sep 12 22:29:15.026429 kubelet[2298]: I0912 22:29:15.026327 2298 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 12 22:29:15.027537 kubelet[2298]: E0912 22:29:15.026877 2298 
kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.148:6443/api/v1/nodes\": dial tcp 10.0.0.148:6443: connect: connection refused" node="localhost" Sep 12 22:29:15.099160 systemd[1]: Created slice kubepods-burstable-pod2b9afaba699092826ab9eae314e8fc60.slice - libcontainer container kubepods-burstable-pod2b9afaba699092826ab9eae314e8fc60.slice. Sep 12 22:29:15.123488 systemd[1]: Created slice kubepods-burstable-pod71d8bf7bd9b7c7432927bee9d50592b5.slice - libcontainer container kubepods-burstable-pod71d8bf7bd9b7c7432927bee9d50592b5.slice. Sep 12 22:29:15.127417 systemd[1]: Created slice kubepods-burstable-podfe5e332fba00ba0b5b33a25fe2e8fd7b.slice - libcontainer container kubepods-burstable-podfe5e332fba00ba0b5b33a25fe2e8fd7b.slice. Sep 12 22:29:15.177556 kubelet[2298]: I0912 22:29:15.177402 2298 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2b9afaba699092826ab9eae314e8fc60-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"2b9afaba699092826ab9eae314e8fc60\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:29:15.177556 kubelet[2298]: I0912 22:29:15.177449 2298 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:29:15.177556 kubelet[2298]: I0912 22:29:15.177471 2298 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fe5e332fba00ba0b5b33a25fe2e8fd7b-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"fe5e332fba00ba0b5b33a25fe2e8fd7b\") " pod="kube-system/kube-scheduler-localhost" Sep 12 
22:29:15.177556 kubelet[2298]: I0912 22:29:15.177488 2298 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2b9afaba699092826ab9eae314e8fc60-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"2b9afaba699092826ab9eae314e8fc60\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:29:15.177556 kubelet[2298]: I0912 22:29:15.177503 2298 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:29:15.178004 kubelet[2298]: I0912 22:29:15.177518 2298 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:29:15.178004 kubelet[2298]: I0912 22:29:15.177561 2298 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:29:15.178004 kubelet[2298]: I0912 22:29:15.177590 2298 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 
22:29:15.178004 kubelet[2298]: I0912 22:29:15.177613 2298 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2b9afaba699092826ab9eae314e8fc60-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"2b9afaba699092826ab9eae314e8fc60\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:29:15.228498 kubelet[2298]: I0912 22:29:15.228467 2298 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 12 22:29:15.228843 kubelet[2298]: E0912 22:29:15.228795 2298 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.148:6443/api/v1/nodes\": dial tcp 10.0.0.148:6443: connect: connection refused" node="localhost" Sep 12 22:29:15.377431 kubelet[2298]: E0912 22:29:15.377305 2298 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.148:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.148:6443: connect: connection refused" interval="800ms" Sep 12 22:29:15.422465 containerd[1508]: time="2025-09-12T22:29:15.422339461Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:2b9afaba699092826ab9eae314e8fc60,Namespace:kube-system,Attempt:0,}" Sep 12 22:29:15.427059 containerd[1508]: time="2025-09-12T22:29:15.427029844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:71d8bf7bd9b7c7432927bee9d50592b5,Namespace:kube-system,Attempt:0,}" Sep 12 22:29:15.430618 containerd[1508]: time="2025-09-12T22:29:15.430575579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:fe5e332fba00ba0b5b33a25fe2e8fd7b,Namespace:kube-system,Attempt:0,}" Sep 12 22:29:15.443821 containerd[1508]: time="2025-09-12T22:29:15.443760085Z" level=info msg="connecting to shim 
5a8db3427eabd596dd598fbdc70cde1568655c9f7936a8bc30c632aa5726d940" address="unix:///run/containerd/s/0830ff93496d20f1a2f9b0b72d95e4ef1527808fdd516e2aadbf136b7ab32840" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:29:15.460334 containerd[1508]: time="2025-09-12T22:29:15.460278969Z" level=info msg="connecting to shim 329e093f0dca4bdfedb482f1e5db37a0f593d58cab52825035a15780b9a6bd6c" address="unix:///run/containerd/s/dd947400f2bc6e53817179f85cce13eca6939c6d25e7cbfff74293fa9e246973" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:29:15.463929 containerd[1508]: time="2025-09-12T22:29:15.463747701Z" level=info msg="connecting to shim 5d1b0ef63f561800475c2449e5b5fc49bd330f7f1180b45793eb3627c38f72ae" address="unix:///run/containerd/s/642ff1cb4f924e764d27975cabcb5aa7ebedffe7d28cef79d78c218780c2db26" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:29:15.473854 systemd[1]: Started cri-containerd-5a8db3427eabd596dd598fbdc70cde1568655c9f7936a8bc30c632aa5726d940.scope - libcontainer container 5a8db3427eabd596dd598fbdc70cde1568655c9f7936a8bc30c632aa5726d940. Sep 12 22:29:15.491837 systemd[1]: Started cri-containerd-329e093f0dca4bdfedb482f1e5db37a0f593d58cab52825035a15780b9a6bd6c.scope - libcontainer container 329e093f0dca4bdfedb482f1e5db37a0f593d58cab52825035a15780b9a6bd6c. Sep 12 22:29:15.496032 systemd[1]: Started cri-containerd-5d1b0ef63f561800475c2449e5b5fc49bd330f7f1180b45793eb3627c38f72ae.scope - libcontainer container 5d1b0ef63f561800475c2449e5b5fc49bd330f7f1180b45793eb3627c38f72ae. 
Sep 12 22:29:15.528337 containerd[1508]: time="2025-09-12T22:29:15.528292977Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:2b9afaba699092826ab9eae314e8fc60,Namespace:kube-system,Attempt:0,} returns sandbox id \"5a8db3427eabd596dd598fbdc70cde1568655c9f7936a8bc30c632aa5726d940\"" Sep 12 22:29:15.532337 containerd[1508]: time="2025-09-12T22:29:15.532306756Z" level=info msg="CreateContainer within sandbox \"5a8db3427eabd596dd598fbdc70cde1568655c9f7936a8bc30c632aa5726d940\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 22:29:15.537179 containerd[1508]: time="2025-09-12T22:29:15.537146910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:71d8bf7bd9b7c7432927bee9d50592b5,Namespace:kube-system,Attempt:0,} returns sandbox id \"329e093f0dca4bdfedb482f1e5db37a0f593d58cab52825035a15780b9a6bd6c\"" Sep 12 22:29:15.540151 containerd[1508]: time="2025-09-12T22:29:15.540123195Z" level=info msg="CreateContainer within sandbox \"329e093f0dca4bdfedb482f1e5db37a0f593d58cab52825035a15780b9a6bd6c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 22:29:15.542399 containerd[1508]: time="2025-09-12T22:29:15.542374870Z" level=info msg="Container 9c9560e9b9b10985eb3540de72ab6f1c96d9a0b3c6bd5862f67e5496f508b52e: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:29:15.542938 containerd[1508]: time="2025-09-12T22:29:15.542917200Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:fe5e332fba00ba0b5b33a25fe2e8fd7b,Namespace:kube-system,Attempt:0,} returns sandbox id \"5d1b0ef63f561800475c2449e5b5fc49bd330f7f1180b45793eb3627c38f72ae\"" Sep 12 22:29:15.545199 containerd[1508]: time="2025-09-12T22:29:15.545146030Z" level=info msg="CreateContainer within sandbox \"5d1b0ef63f561800475c2449e5b5fc49bd330f7f1180b45793eb3627c38f72ae\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 
22:29:15.550339 containerd[1508]: time="2025-09-12T22:29:15.550313123Z" level=info msg="Container 688dad07e35641234ad8004a7fed83b0f8105eb560fa70a789c2214f8d3d3026: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:29:15.553166 containerd[1508]: time="2025-09-12T22:29:15.553123224Z" level=info msg="CreateContainer within sandbox \"5a8db3427eabd596dd598fbdc70cde1568655c9f7936a8bc30c632aa5726d940\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"9c9560e9b9b10985eb3540de72ab6f1c96d9a0b3c6bd5862f67e5496f508b52e\"" Sep 12 22:29:15.553631 containerd[1508]: time="2025-09-12T22:29:15.553594383Z" level=info msg="StartContainer for \"9c9560e9b9b10985eb3540de72ab6f1c96d9a0b3c6bd5862f67e5496f508b52e\"" Sep 12 22:29:15.555211 containerd[1508]: time="2025-09-12T22:29:15.555179597Z" level=info msg="connecting to shim 9c9560e9b9b10985eb3540de72ab6f1c96d9a0b3c6bd5862f67e5496f508b52e" address="unix:///run/containerd/s/0830ff93496d20f1a2f9b0b72d95e4ef1527808fdd516e2aadbf136b7ab32840" protocol=ttrpc version=3 Sep 12 22:29:15.556171 containerd[1508]: time="2025-09-12T22:29:15.556138570Z" level=info msg="CreateContainer within sandbox \"329e093f0dca4bdfedb482f1e5db37a0f593d58cab52825035a15780b9a6bd6c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"688dad07e35641234ad8004a7fed83b0f8105eb560fa70a789c2214f8d3d3026\"" Sep 12 22:29:15.558216 containerd[1508]: time="2025-09-12T22:29:15.556374608Z" level=info msg="Container 60f90d1f9ce8944e3f15524155787d9746f9c5faa279b95eab8879a805ff7778: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:29:15.558216 containerd[1508]: time="2025-09-12T22:29:15.556538837Z" level=info msg="StartContainer for \"688dad07e35641234ad8004a7fed83b0f8105eb560fa70a789c2214f8d3d3026\"" Sep 12 22:29:15.558216 containerd[1508]: time="2025-09-12T22:29:15.557648699Z" level=info msg="connecting to shim 688dad07e35641234ad8004a7fed83b0f8105eb560fa70a789c2214f8d3d3026" 
address="unix:///run/containerd/s/dd947400f2bc6e53817179f85cce13eca6939c6d25e7cbfff74293fa9e246973" protocol=ttrpc version=3 Sep 12 22:29:15.561992 containerd[1508]: time="2025-09-12T22:29:15.561960421Z" level=info msg="CreateContainer within sandbox \"5d1b0ef63f561800475c2449e5b5fc49bd330f7f1180b45793eb3627c38f72ae\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"60f90d1f9ce8944e3f15524155787d9746f9c5faa279b95eab8879a805ff7778\"" Sep 12 22:29:15.562532 containerd[1508]: time="2025-09-12T22:29:15.562481224Z" level=info msg="StartContainer for \"60f90d1f9ce8944e3f15524155787d9746f9c5faa279b95eab8879a805ff7778\"" Sep 12 22:29:15.563499 containerd[1508]: time="2025-09-12T22:29:15.563467875Z" level=info msg="connecting to shim 60f90d1f9ce8944e3f15524155787d9746f9c5faa279b95eab8879a805ff7778" address="unix:///run/containerd/s/642ff1cb4f924e764d27975cabcb5aa7ebedffe7d28cef79d78c218780c2db26" protocol=ttrpc version=3 Sep 12 22:29:15.571831 systemd[1]: Started cri-containerd-9c9560e9b9b10985eb3540de72ab6f1c96d9a0b3c6bd5862f67e5496f508b52e.scope - libcontainer container 9c9560e9b9b10985eb3540de72ab6f1c96d9a0b3c6bd5862f67e5496f508b52e. Sep 12 22:29:15.575163 systemd[1]: Started cri-containerd-688dad07e35641234ad8004a7fed83b0f8105eb560fa70a789c2214f8d3d3026.scope - libcontainer container 688dad07e35641234ad8004a7fed83b0f8105eb560fa70a789c2214f8d3d3026. 
Sep 12 22:29:15.582760 kubelet[2298]: W0912 22:29:15.582702 2298 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.148:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.148:6443: connect: connection refused Sep 12 22:29:15.582834 kubelet[2298]: E0912 22:29:15.582767 2298 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.148:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.148:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:29:15.594832 systemd[1]: Started cri-containerd-60f90d1f9ce8944e3f15524155787d9746f9c5faa279b95eab8879a805ff7778.scope - libcontainer container 60f90d1f9ce8944e3f15524155787d9746f9c5faa279b95eab8879a805ff7778. Sep 12 22:29:15.632325 kubelet[2298]: I0912 22:29:15.632159 2298 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 12 22:29:15.633346 kubelet[2298]: E0912 22:29:15.632838 2298 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.148:6443/api/v1/nodes\": dial tcp 10.0.0.148:6443: connect: connection refused" node="localhost" Sep 12 22:29:15.639857 containerd[1508]: time="2025-09-12T22:29:15.639823639Z" level=info msg="StartContainer for \"9c9560e9b9b10985eb3540de72ab6f1c96d9a0b3c6bd5862f67e5496f508b52e\" returns successfully" Sep 12 22:29:15.640064 containerd[1508]: time="2025-09-12T22:29:15.640044102Z" level=info msg="StartContainer for \"688dad07e35641234ad8004a7fed83b0f8105eb560fa70a789c2214f8d3d3026\" returns successfully" Sep 12 22:29:15.651107 containerd[1508]: time="2025-09-12T22:29:15.651052538Z" level=info msg="StartContainer for \"60f90d1f9ce8944e3f15524155787d9746f9c5faa279b95eab8879a805ff7778\" returns successfully" Sep 12 22:29:15.663690 kubelet[2298]: W0912 
22:29:15.663601 2298 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.148:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.148:6443: connect: connection refused Sep 12 22:29:15.665944 kubelet[2298]: E0912 22:29:15.665727 2298 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.148:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.148:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:29:16.435032 kubelet[2298]: I0912 22:29:16.435002 2298 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 12 22:29:16.953142 kubelet[2298]: E0912 22:29:16.953105 2298 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 12 22:29:17.015042 kubelet[2298]: I0912 22:29:17.014999 2298 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Sep 12 22:29:17.766468 kubelet[2298]: I0912 22:29:17.766388 2298 apiserver.go:52] "Watching apiserver" Sep 12 22:29:17.774908 kubelet[2298]: I0912 22:29:17.774829 2298 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 22:29:18.961207 systemd[1]: Reload requested from client PID 2573 ('systemctl') (unit session-7.scope)... Sep 12 22:29:18.961222 systemd[1]: Reloading... Sep 12 22:29:19.027726 zram_generator::config[2619]: No configuration found. Sep 12 22:29:19.260973 systemd[1]: Reloading finished in 299 ms. Sep 12 22:29:19.284926 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:29:19.296929 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 22:29:19.297114 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 22:29:19.297167 systemd[1]: kubelet.service: Consumed 1.105s CPU time, 126.5M memory peak. Sep 12 22:29:19.299080 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:29:19.435942 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:29:19.439180 (kubelet)[2658]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 22:29:19.476271 kubelet[2658]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:29:19.476271 kubelet[2658]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 12 22:29:19.476271 kubelet[2658]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:29:19.476578 kubelet[2658]: I0912 22:29:19.476325 2658 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 22:29:19.481701 kubelet[2658]: I0912 22:29:19.481588 2658 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 12 22:29:19.481701 kubelet[2658]: I0912 22:29:19.481620 2658 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 22:29:19.481917 kubelet[2658]: I0912 22:29:19.481899 2658 server.go:934] "Client rotation is on, will bootstrap in background" Sep 12 22:29:19.483321 kubelet[2658]: I0912 22:29:19.483293 2658 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 12 22:29:19.486380 kubelet[2658]: I0912 22:29:19.486355 2658 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 22:29:19.490139 kubelet[2658]: I0912 22:29:19.490121 2658 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 22:29:19.492685 kubelet[2658]: I0912 22:29:19.492576 2658 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 12 22:29:19.492805 kubelet[2658]: I0912 22:29:19.492792 2658 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 12 22:29:19.492989 kubelet[2658]: I0912 22:29:19.492954 2658 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 22:29:19.493197 kubelet[2658]: I0912 22:29:19.493043 2658 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagef
s.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 22:29:19.493317 kubelet[2658]: I0912 22:29:19.493304 2658 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 22:29:19.493363 kubelet[2658]: I0912 22:29:19.493356 2658 container_manager_linux.go:300] "Creating device plugin manager" Sep 12 22:29:19.493440 kubelet[2658]: I0912 22:29:19.493431 2658 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:29:19.493595 kubelet[2658]: I0912 22:29:19.493582 2658 kubelet.go:408] "Attempting to sync node with API server" Sep 12 22:29:19.493723 kubelet[2658]: I0912 22:29:19.493668 2658 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 22:29:19.493869 kubelet[2658]: I0912 22:29:19.493820 2658 kubelet.go:314] "Adding apiserver pod source" Sep 12 22:29:19.493869 kubelet[2658]: I0912 22:29:19.493840 2658 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 22:29:19.495701 kubelet[2658]: I0912 22:29:19.494958 2658 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 22:29:19.495701 kubelet[2658]: I0912 22:29:19.495398 2658 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 22:29:19.496789 kubelet[2658]: I0912 22:29:19.496768 2658 server.go:1274] "Started kubelet" Sep 12 22:29:19.497095 kubelet[2658]: I0912 22:29:19.497020 2658 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 
22:29:19.497286 kubelet[2658]: I0912 22:29:19.497246 2658 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 22:29:19.498643 kubelet[2658]: I0912 22:29:19.497528 2658 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 22:29:19.500679 kubelet[2658]: I0912 22:29:19.497753 2658 server.go:449] "Adding debug handlers to kubelet server" Sep 12 22:29:19.500679 kubelet[2658]: I0912 22:29:19.499138 2658 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 22:29:19.503245 kubelet[2658]: I0912 22:29:19.503220 2658 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 12 22:29:19.503404 kubelet[2658]: I0912 22:29:19.503381 2658 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 22:29:19.503434 kubelet[2658]: E0912 22:29:19.503404 2658 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 22:29:19.504224 kubelet[2658]: I0912 22:29:19.504195 2658 factory.go:221] Registration of the systemd container factory successfully Sep 12 22:29:19.504404 kubelet[2658]: I0912 22:29:19.504351 2658 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 22:29:19.505268 kubelet[2658]: I0912 22:29:19.505239 2658 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 12 22:29:19.505448 kubelet[2658]: I0912 22:29:19.505428 2658 reconciler.go:26] "Reconciler: start to sync state" Sep 12 22:29:19.507917 kubelet[2658]: I0912 22:29:19.507886 2658 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Sep 12 22:29:19.509429 kubelet[2658]: I0912 22:29:19.509408 2658 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 22:29:19.509429 kubelet[2658]: I0912 22:29:19.509430 2658 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 22:29:19.509500 kubelet[2658]: I0912 22:29:19.509446 2658 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 22:29:19.509500 kubelet[2658]: E0912 22:29:19.509479 2658 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 22:29:19.515002 kubelet[2658]: I0912 22:29:19.514921 2658 factory.go:221] Registration of the containerd container factory successfully Sep 12 22:29:19.555145 kubelet[2658]: I0912 22:29:19.555110 2658 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 22:29:19.555145 kubelet[2658]: I0912 22:29:19.555129 2658 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 22:29:19.555145 kubelet[2658]: I0912 22:29:19.555147 2658 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:29:19.555298 kubelet[2658]: I0912 22:29:19.555269 2658 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 22:29:19.555298 kubelet[2658]: I0912 22:29:19.555279 2658 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 22:29:19.555298 kubelet[2658]: I0912 22:29:19.555295 2658 policy_none.go:49] "None policy: Start" Sep 12 22:29:19.555934 kubelet[2658]: I0912 22:29:19.555920 2658 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 22:29:19.555980 kubelet[2658]: I0912 22:29:19.555939 2658 state_mem.go:35] "Initializing new in-memory state store" Sep 12 22:29:19.556057 kubelet[2658]: I0912 22:29:19.556044 2658 state_mem.go:75] "Updated machine memory state" Sep 12 22:29:19.559706 kubelet[2658]: I0912 22:29:19.559669 2658 manager.go:513] "Failed to read data from checkpoint" 
checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 22:29:19.559888 kubelet[2658]: I0912 22:29:19.559857 2658 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 22:29:19.559940 kubelet[2658]: I0912 22:29:19.559883 2658 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 22:29:19.560494 kubelet[2658]: I0912 22:29:19.560314 2658 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 22:29:19.616035 kubelet[2658]: E0912 22:29:19.615780 2658 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 12 22:29:19.662372 kubelet[2658]: I0912 22:29:19.662346 2658 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 12 22:29:19.667252 kubelet[2658]: I0912 22:29:19.667230 2658 kubelet_node_status.go:111] "Node was previously registered" node="localhost" Sep 12 22:29:19.667327 kubelet[2658]: I0912 22:29:19.667292 2658 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Sep 12 22:29:19.807435 kubelet[2658]: I0912 22:29:19.806829 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:29:19.807435 kubelet[2658]: I0912 22:29:19.806859 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2b9afaba699092826ab9eae314e8fc60-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"2b9afaba699092826ab9eae314e8fc60\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:29:19.807435 kubelet[2658]: I0912 22:29:19.806879 2658 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fe5e332fba00ba0b5b33a25fe2e8fd7b-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"fe5e332fba00ba0b5b33a25fe2e8fd7b\") " pod="kube-system/kube-scheduler-localhost" Sep 12 22:29:19.807435 kubelet[2658]: I0912 22:29:19.806949 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2b9afaba699092826ab9eae314e8fc60-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"2b9afaba699092826ab9eae314e8fc60\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:29:19.807435 kubelet[2658]: I0912 22:29:19.806969 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2b9afaba699092826ab9eae314e8fc60-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"2b9afaba699092826ab9eae314e8fc60\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:29:19.807586 kubelet[2658]: I0912 22:29:19.807003 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:29:19.807586 kubelet[2658]: I0912 22:29:19.807021 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:29:19.807586 kubelet[2658]: I0912 22:29:19.807035 2658 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:29:19.807586 kubelet[2658]: I0912 22:29:19.807086 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:29:20.495057 kubelet[2658]: I0912 22:29:20.495021 2658 apiserver.go:52] "Watching apiserver" Sep 12 22:29:20.505830 kubelet[2658]: I0912 22:29:20.505804 2658 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 22:29:20.544701 kubelet[2658]: E0912 22:29:20.544363 2658 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Sep 12 22:29:20.545524 kubelet[2658]: E0912 22:29:20.545488 2658 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 12 22:29:20.556653 kubelet[2658]: I0912 22:29:20.556601 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.556575566 podStartE2EDuration="1.556575566s" podCreationTimestamp="2025-09-12 22:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:29:20.55632936 +0000 UTC m=+1.114588101" watchObservedRunningTime="2025-09-12 
22:29:20.556575566 +0000 UTC m=+1.114834307" Sep 12 22:29:20.569715 kubelet[2658]: I0912 22:29:20.569591 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.5695808470000001 podStartE2EDuration="1.569580847s" podCreationTimestamp="2025-09-12 22:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:29:20.569532565 +0000 UTC m=+1.127791346" watchObservedRunningTime="2025-09-12 22:29:20.569580847 +0000 UTC m=+1.127839588" Sep 12 22:29:20.569819 kubelet[2658]: I0912 22:29:20.569715 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=2.5697101460000003 podStartE2EDuration="2.569710146s" podCreationTimestamp="2025-09-12 22:29:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:29:20.563019513 +0000 UTC m=+1.121278254" watchObservedRunningTime="2025-09-12 22:29:20.569710146 +0000 UTC m=+1.127968887" Sep 12 22:29:24.672305 kubelet[2658]: I0912 22:29:24.672206 2658 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 22:29:24.673047 containerd[1508]: time="2025-09-12T22:29:24.672926613Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 22:29:24.673397 kubelet[2658]: I0912 22:29:24.673115 2658 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 22:29:25.573802 systemd[1]: Created slice kubepods-besteffort-pod67e209f5_1457_4322_8f51_fd33dd175936.slice - libcontainer container kubepods-besteffort-pod67e209f5_1457_4322_8f51_fd33dd175936.slice. 
Sep 12 22:29:25.646164 kubelet[2658]: I0912 22:29:25.646107 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/67e209f5-1457-4322-8f51-fd33dd175936-kube-proxy\") pod \"kube-proxy-rgw2r\" (UID: \"67e209f5-1457-4322-8f51-fd33dd175936\") " pod="kube-system/kube-proxy-rgw2r" Sep 12 22:29:25.646277 kubelet[2658]: I0912 22:29:25.646174 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/67e209f5-1457-4322-8f51-fd33dd175936-xtables-lock\") pod \"kube-proxy-rgw2r\" (UID: \"67e209f5-1457-4322-8f51-fd33dd175936\") " pod="kube-system/kube-proxy-rgw2r" Sep 12 22:29:25.646277 kubelet[2658]: I0912 22:29:25.646193 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkpd8\" (UniqueName: \"kubernetes.io/projected/67e209f5-1457-4322-8f51-fd33dd175936-kube-api-access-dkpd8\") pod \"kube-proxy-rgw2r\" (UID: \"67e209f5-1457-4322-8f51-fd33dd175936\") " pod="kube-system/kube-proxy-rgw2r" Sep 12 22:29:25.646277 kubelet[2658]: I0912 22:29:25.646211 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/67e209f5-1457-4322-8f51-fd33dd175936-lib-modules\") pod \"kube-proxy-rgw2r\" (UID: \"67e209f5-1457-4322-8f51-fd33dd175936\") " pod="kube-system/kube-proxy-rgw2r" Sep 12 22:29:25.785104 systemd[1]: Created slice kubepods-besteffort-pod649b8b3d_586a_4f02_9d18_fe63b11e77c0.slice - libcontainer container kubepods-besteffort-pod649b8b3d_586a_4f02_9d18_fe63b11e77c0.slice. 
Sep 12 22:29:25.847750 kubelet[2658]: I0912 22:29:25.847637 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttk69\" (UniqueName: \"kubernetes.io/projected/649b8b3d-586a-4f02-9d18-fe63b11e77c0-kube-api-access-ttk69\") pod \"tigera-operator-58fc44c59b-tq275\" (UID: \"649b8b3d-586a-4f02-9d18-fe63b11e77c0\") " pod="tigera-operator/tigera-operator-58fc44c59b-tq275" Sep 12 22:29:25.847750 kubelet[2658]: I0912 22:29:25.847696 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/649b8b3d-586a-4f02-9d18-fe63b11e77c0-var-lib-calico\") pod \"tigera-operator-58fc44c59b-tq275\" (UID: \"649b8b3d-586a-4f02-9d18-fe63b11e77c0\") " pod="tigera-operator/tigera-operator-58fc44c59b-tq275" Sep 12 22:29:25.890305 containerd[1508]: time="2025-09-12T22:29:25.890261078Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rgw2r,Uid:67e209f5-1457-4322-8f51-fd33dd175936,Namespace:kube-system,Attempt:0,}" Sep 12 22:29:25.908090 containerd[1508]: time="2025-09-12T22:29:25.908056962Z" level=info msg="connecting to shim 3ef01087172b43256fb442324f83fa41f441970041b85f005f1eb86aef74da3e" address="unix:///run/containerd/s/cb31c7f2e70396c97d449eaa6fb0325dc6da464b8a5880a3f92252e0b65ee035" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:29:25.931846 systemd[1]: Started cri-containerd-3ef01087172b43256fb442324f83fa41f441970041b85f005f1eb86aef74da3e.scope - libcontainer container 3ef01087172b43256fb442324f83fa41f441970041b85f005f1eb86aef74da3e. 
Sep 12 22:29:25.950412 containerd[1508]: time="2025-09-12T22:29:25.950376935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rgw2r,Uid:67e209f5-1457-4322-8f51-fd33dd175936,Namespace:kube-system,Attempt:0,} returns sandbox id \"3ef01087172b43256fb442324f83fa41f441970041b85f005f1eb86aef74da3e\"" Sep 12 22:29:25.956058 containerd[1508]: time="2025-09-12T22:29:25.956028565Z" level=info msg="CreateContainer within sandbox \"3ef01087172b43256fb442324f83fa41f441970041b85f005f1eb86aef74da3e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 22:29:25.969288 containerd[1508]: time="2025-09-12T22:29:25.969251556Z" level=info msg="Container 1c024eb86340c51a768a11b4a9cfd7c94bd67b886e97eb0e0e28b47360f9d04a: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:29:25.976803 containerd[1508]: time="2025-09-12T22:29:25.976764193Z" level=info msg="CreateContainer within sandbox \"3ef01087172b43256fb442324f83fa41f441970041b85f005f1eb86aef74da3e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"1c024eb86340c51a768a11b4a9cfd7c94bd67b886e97eb0e0e28b47360f9d04a\"" Sep 12 22:29:25.978350 containerd[1508]: time="2025-09-12T22:29:25.977224515Z" level=info msg="StartContainer for \"1c024eb86340c51a768a11b4a9cfd7c94bd67b886e97eb0e0e28b47360f9d04a\"" Sep 12 22:29:25.980593 containerd[1508]: time="2025-09-12T22:29:25.980556895Z" level=info msg="connecting to shim 1c024eb86340c51a768a11b4a9cfd7c94bd67b886e97eb0e0e28b47360f9d04a" address="unix:///run/containerd/s/cb31c7f2e70396c97d449eaa6fb0325dc6da464b8a5880a3f92252e0b65ee035" protocol=ttrpc version=3 Sep 12 22:29:26.003821 systemd[1]: Started cri-containerd-1c024eb86340c51a768a11b4a9cfd7c94bd67b886e97eb0e0e28b47360f9d04a.scope - libcontainer container 1c024eb86340c51a768a11b4a9cfd7c94bd67b886e97eb0e0e28b47360f9d04a. 
Sep 12 22:29:26.040554 containerd[1508]: time="2025-09-12T22:29:26.040512266Z" level=info msg="StartContainer for \"1c024eb86340c51a768a11b4a9cfd7c94bd67b886e97eb0e0e28b47360f9d04a\" returns successfully" Sep 12 22:29:26.088306 containerd[1508]: time="2025-09-12T22:29:26.088262213Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-tq275,Uid:649b8b3d-586a-4f02-9d18-fe63b11e77c0,Namespace:tigera-operator,Attempt:0,}" Sep 12 22:29:26.103496 containerd[1508]: time="2025-09-12T22:29:26.103124799Z" level=info msg="connecting to shim ecdcf33c974e14f799cd4012db6f2add33bc7c26a6e6bbbb5ed05acf04400f99" address="unix:///run/containerd/s/c51b5150e11d11f42cb13b7eb853b256a02a19cf159141f3f7fad8190b505716" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:29:26.123856 systemd[1]: Started cri-containerd-ecdcf33c974e14f799cd4012db6f2add33bc7c26a6e6bbbb5ed05acf04400f99.scope - libcontainer container ecdcf33c974e14f799cd4012db6f2add33bc7c26a6e6bbbb5ed05acf04400f99. Sep 12 22:29:26.158290 containerd[1508]: time="2025-09-12T22:29:26.158246534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-tq275,Uid:649b8b3d-586a-4f02-9d18-fe63b11e77c0,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"ecdcf33c974e14f799cd4012db6f2add33bc7c26a6e6bbbb5ed05acf04400f99\"" Sep 12 22:29:26.161875 containerd[1508]: time="2025-09-12T22:29:26.161839720Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 22:29:26.560131 kubelet[2658]: I0912 22:29:26.560066 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-rgw2r" podStartSLOduration=1.560050238 podStartE2EDuration="1.560050238s" podCreationTimestamp="2025-09-12 22:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:29:26.559913587 +0000 UTC m=+7.118172328" watchObservedRunningTime="2025-09-12 
22:29:26.560050238 +0000 UTC m=+7.118308979" Sep 12 22:29:27.895314 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2612926361.mount: Deactivated successfully. Sep 12 22:29:28.217236 containerd[1508]: time="2025-09-12T22:29:28.217125111Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:28.217830 containerd[1508]: time="2025-09-12T22:29:28.217798443Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 12 22:29:28.218554 containerd[1508]: time="2025-09-12T22:29:28.218504697Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:28.220549 containerd[1508]: time="2025-09-12T22:29:28.220514130Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:28.221180 containerd[1508]: time="2025-09-12T22:29:28.221145418Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.059268414s" Sep 12 22:29:28.221180 containerd[1508]: time="2025-09-12T22:29:28.221176060Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 12 22:29:28.225089 containerd[1508]: time="2025-09-12T22:29:28.225058956Z" level=info msg="CreateContainer within sandbox \"ecdcf33c974e14f799cd4012db6f2add33bc7c26a6e6bbbb5ed05acf04400f99\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 22:29:28.230836 containerd[1508]: time="2025-09-12T22:29:28.230803794Z" level=info msg="Container 5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:29:28.236958 containerd[1508]: time="2025-09-12T22:29:28.236921780Z" level=info msg="CreateContainer within sandbox \"ecdcf33c974e14f799cd4012db6f2add33bc7c26a6e6bbbb5ed05acf04400f99\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b\"" Sep 12 22:29:28.237841 containerd[1508]: time="2025-09-12T22:29:28.237814688Z" level=info msg="StartContainer for \"5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b\"" Sep 12 22:29:28.238580 containerd[1508]: time="2025-09-12T22:29:28.238547544Z" level=info msg="connecting to shim 5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b" address="unix:///run/containerd/s/c51b5150e11d11f42cb13b7eb853b256a02a19cf159141f3f7fad8190b505716" protocol=ttrpc version=3 Sep 12 22:29:28.265821 systemd[1]: Started cri-containerd-5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b.scope - libcontainer container 5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b. 
Sep 12 22:29:28.289355 containerd[1508]: time="2025-09-12T22:29:28.289281131Z" level=info msg="StartContainer for \"5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b\" returns successfully" Sep 12 22:29:28.577386 kubelet[2658]: I0912 22:29:28.577226 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-tq275" podStartSLOduration=1.5143490210000001 podStartE2EDuration="3.577067507s" podCreationTimestamp="2025-09-12 22:29:25 +0000 UTC" firstStartedPulling="2025-09-12 22:29:26.160983207 +0000 UTC m=+6.719241948" lastFinishedPulling="2025-09-12 22:29:28.223701733 +0000 UTC m=+8.781960434" observedRunningTime="2025-09-12 22:29:28.576959139 +0000 UTC m=+9.135217880" watchObservedRunningTime="2025-09-12 22:29:28.577067507 +0000 UTC m=+9.135326248" Sep 12 22:29:30.350920 systemd[1]: cri-containerd-5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b.scope: Deactivated successfully. Sep 12 22:29:30.403349 containerd[1508]: time="2025-09-12T22:29:30.403244965Z" level=info msg="received exit event container_id:\"5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b\" id:\"5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b\" pid:2978 exit_status:1 exited_at:{seconds:1757716170 nanos:389640875}" Sep 12 22:29:30.403349 containerd[1508]: time="2025-09-12T22:29:30.403345492Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b\" id:\"5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b\" pid:2978 exit_status:1 exited_at:{seconds:1757716170 nanos:389640875}" Sep 12 22:29:30.486158 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b-rootfs.mount: Deactivated successfully. 
Sep 12 22:29:31.569220 kubelet[2658]: I0912 22:29:31.569189 2658 scope.go:117] "RemoveContainer" containerID="5ade2aee2faaf5e6c09ac1e1d503bfb6ccd1788428de0ad61ea7b18959301d7b" Sep 12 22:29:31.573947 containerd[1508]: time="2025-09-12T22:29:31.573908695Z" level=info msg="CreateContainer within sandbox \"ecdcf33c974e14f799cd4012db6f2add33bc7c26a6e6bbbb5ed05acf04400f99\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 12 22:29:31.582697 containerd[1508]: time="2025-09-12T22:29:31.580505322Z" level=info msg="Container 2df07c9528c690b75015ddd4ca5dcc36a8fe02254aa11ea8fb6c0fdde57d7e17: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:29:31.590299 containerd[1508]: time="2025-09-12T22:29:31.590265114Z" level=info msg="CreateContainer within sandbox \"ecdcf33c974e14f799cd4012db6f2add33bc7c26a6e6bbbb5ed05acf04400f99\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"2df07c9528c690b75015ddd4ca5dcc36a8fe02254aa11ea8fb6c0fdde57d7e17\"" Sep 12 22:29:31.590829 containerd[1508]: time="2025-09-12T22:29:31.590799309Z" level=info msg="StartContainer for \"2df07c9528c690b75015ddd4ca5dcc36a8fe02254aa11ea8fb6c0fdde57d7e17\"" Sep 12 22:29:31.591746 containerd[1508]: time="2025-09-12T22:29:31.591717448Z" level=info msg="connecting to shim 2df07c9528c690b75015ddd4ca5dcc36a8fe02254aa11ea8fb6c0fdde57d7e17" address="unix:///run/containerd/s/c51b5150e11d11f42cb13b7eb853b256a02a19cf159141f3f7fad8190b505716" protocol=ttrpc version=3 Sep 12 22:29:31.611834 systemd[1]: Started cri-containerd-2df07c9528c690b75015ddd4ca5dcc36a8fe02254aa11ea8fb6c0fdde57d7e17.scope - libcontainer container 2df07c9528c690b75015ddd4ca5dcc36a8fe02254aa11ea8fb6c0fdde57d7e17. 
Sep 12 22:29:31.638254 containerd[1508]: time="2025-09-12T22:29:31.638215900Z" level=info msg="StartContainer for \"2df07c9528c690b75015ddd4ca5dcc36a8fe02254aa11ea8fb6c0fdde57d7e17\" returns successfully" Sep 12 22:29:33.360438 sudo[1720]: pam_unix(sudo:session): session closed for user root Sep 12 22:29:33.361613 sshd[1719]: Connection closed by 10.0.0.1 port 43864 Sep 12 22:29:33.362141 sshd-session[1716]: pam_unix(sshd:session): session closed for user core Sep 12 22:29:33.365697 systemd[1]: sshd@6-10.0.0.148:22-10.0.0.1:43864.service: Deactivated successfully. Sep 12 22:29:33.369295 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 22:29:33.369530 systemd[1]: session-7.scope: Consumed 8.690s CPU time, 216.7M memory peak. Sep 12 22:29:33.370396 systemd-logind[1485]: Session 7 logged out. Waiting for processes to exit. Sep 12 22:29:33.371607 systemd-logind[1485]: Removed session 7. Sep 12 22:29:35.514410 update_engine[1494]: I20250912 22:29:35.514344 1494 update_attempter.cc:509] Updating boot flags... Sep 12 22:29:38.256513 systemd[1]: Created slice kubepods-besteffort-podff3d4f1c_7c43_43ba_940a_3676d6d9c515.slice - libcontainer container kubepods-besteffort-podff3d4f1c_7c43_43ba_940a_3676d6d9c515.slice. 
Sep 12 22:29:38.428435 kubelet[2658]: I0912 22:29:38.428374 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/ff3d4f1c-7c43-43ba-940a-3676d6d9c515-typha-certs\") pod \"calico-typha-6cf4d44dcd-ldb57\" (UID: \"ff3d4f1c-7c43-43ba-940a-3676d6d9c515\") " pod="calico-system/calico-typha-6cf4d44dcd-ldb57" Sep 12 22:29:38.428435 kubelet[2658]: I0912 22:29:38.428428 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff3d4f1c-7c43-43ba-940a-3676d6d9c515-tigera-ca-bundle\") pod \"calico-typha-6cf4d44dcd-ldb57\" (UID: \"ff3d4f1c-7c43-43ba-940a-3676d6d9c515\") " pod="calico-system/calico-typha-6cf4d44dcd-ldb57" Sep 12 22:29:38.428917 kubelet[2658]: I0912 22:29:38.428711 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48hwp\" (UniqueName: \"kubernetes.io/projected/ff3d4f1c-7c43-43ba-940a-3676d6d9c515-kube-api-access-48hwp\") pod \"calico-typha-6cf4d44dcd-ldb57\" (UID: \"ff3d4f1c-7c43-43ba-940a-3676d6d9c515\") " pod="calico-system/calico-typha-6cf4d44dcd-ldb57" Sep 12 22:29:38.579794 containerd[1508]: time="2025-09-12T22:29:38.579145428Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6cf4d44dcd-ldb57,Uid:ff3d4f1c-7c43-43ba-940a-3676d6d9c515,Namespace:calico-system,Attempt:0,}" Sep 12 22:29:38.580819 systemd[1]: Created slice kubepods-besteffort-pod84e20a2f_f47b_4958_b070_022240ecd62d.slice - libcontainer container kubepods-besteffort-pod84e20a2f_f47b_4958_b070_022240ecd62d.slice. 
Sep 12 22:29:38.637696 containerd[1508]: time="2025-09-12T22:29:38.637268702Z" level=info msg="connecting to shim cc168d8805bb3fc2369b4a336ad151d59e544bfe3114badde249b4a92e96bd95" address="unix:///run/containerd/s/65b60912129086bebe39bb8bbe8c319c52a8c65b5a25e948d91d0d6dceedfbe0" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:29:38.667843 systemd[1]: Started cri-containerd-cc168d8805bb3fc2369b4a336ad151d59e544bfe3114badde249b4a92e96bd95.scope - libcontainer container cc168d8805bb3fc2369b4a336ad151d59e544bfe3114badde249b4a92e96bd95. Sep 12 22:29:38.731311 kubelet[2658]: I0912 22:29:38.731271 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/84e20a2f-f47b-4958-b070-022240ecd62d-cni-net-dir\") pod \"calico-node-2p8jm\" (UID: \"84e20a2f-f47b-4958-b070-022240ecd62d\") " pod="calico-system/calico-node-2p8jm" Sep 12 22:29:38.731311 kubelet[2658]: I0912 22:29:38.731315 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/84e20a2f-f47b-4958-b070-022240ecd62d-flexvol-driver-host\") pod \"calico-node-2p8jm\" (UID: \"84e20a2f-f47b-4958-b070-022240ecd62d\") " pod="calico-system/calico-node-2p8jm" Sep 12 22:29:38.731461 kubelet[2658]: I0912 22:29:38.731333 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84e20a2f-f47b-4958-b070-022240ecd62d-tigera-ca-bundle\") pod \"calico-node-2p8jm\" (UID: \"84e20a2f-f47b-4958-b070-022240ecd62d\") " pod="calico-system/calico-node-2p8jm" Sep 12 22:29:38.731461 kubelet[2658]: I0912 22:29:38.731353 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/84e20a2f-f47b-4958-b070-022240ecd62d-xtables-lock\") pod 
\"calico-node-2p8jm\" (UID: \"84e20a2f-f47b-4958-b070-022240ecd62d\") " pod="calico-system/calico-node-2p8jm"
Sep 12 22:29:38.731461 kubelet[2658]: I0912 22:29:38.731369 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/84e20a2f-f47b-4958-b070-022240ecd62d-cni-bin-dir\") pod \"calico-node-2p8jm\" (UID: \"84e20a2f-f47b-4958-b070-022240ecd62d\") " pod="calico-system/calico-node-2p8jm"
Sep 12 22:29:38.731461 kubelet[2658]: I0912 22:29:38.731383 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/84e20a2f-f47b-4958-b070-022240ecd62d-cni-log-dir\") pod \"calico-node-2p8jm\" (UID: \"84e20a2f-f47b-4958-b070-022240ecd62d\") " pod="calico-system/calico-node-2p8jm"
Sep 12 22:29:38.731461 kubelet[2658]: I0912 22:29:38.731398 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84e20a2f-f47b-4958-b070-022240ecd62d-lib-modules\") pod \"calico-node-2p8jm\" (UID: \"84e20a2f-f47b-4958-b070-022240ecd62d\") " pod="calico-system/calico-node-2p8jm"
Sep 12 22:29:38.731586 kubelet[2658]: I0912 22:29:38.731417 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/84e20a2f-f47b-4958-b070-022240ecd62d-node-certs\") pod \"calico-node-2p8jm\" (UID: \"84e20a2f-f47b-4958-b070-022240ecd62d\") " pod="calico-system/calico-node-2p8jm"
Sep 12 22:29:38.731586 kubelet[2658]: I0912 22:29:38.731432 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwnwb\" (UniqueName: \"kubernetes.io/projected/84e20a2f-f47b-4958-b070-022240ecd62d-kube-api-access-pwnwb\") pod \"calico-node-2p8jm\" (UID: \"84e20a2f-f47b-4958-b070-022240ecd62d\") " pod="calico-system/calico-node-2p8jm"
Sep 12 22:29:38.731586 kubelet[2658]: I0912 22:29:38.731447 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/84e20a2f-f47b-4958-b070-022240ecd62d-var-lib-calico\") pod \"calico-node-2p8jm\" (UID: \"84e20a2f-f47b-4958-b070-022240ecd62d\") " pod="calico-system/calico-node-2p8jm"
Sep 12 22:29:38.731586 kubelet[2658]: I0912 22:29:38.731462 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/84e20a2f-f47b-4958-b070-022240ecd62d-var-run-calico\") pod \"calico-node-2p8jm\" (UID: \"84e20a2f-f47b-4958-b070-022240ecd62d\") " pod="calico-system/calico-node-2p8jm"
Sep 12 22:29:38.731586 kubelet[2658]: I0912 22:29:38.731486 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/84e20a2f-f47b-4958-b070-022240ecd62d-policysync\") pod \"calico-node-2p8jm\" (UID: \"84e20a2f-f47b-4958-b070-022240ecd62d\") " pod="calico-system/calico-node-2p8jm"
Sep 12 22:29:38.751538 containerd[1508]: time="2025-09-12T22:29:38.751419354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6cf4d44dcd-ldb57,Uid:ff3d4f1c-7c43-43ba-940a-3676d6d9c515,Namespace:calico-system,Attempt:0,} returns sandbox id \"cc168d8805bb3fc2369b4a336ad151d59e544bfe3114badde249b4a92e96bd95\""
Sep 12 22:29:38.757709 containerd[1508]: time="2025-09-12T22:29:38.757679478Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 12 22:29:38.833118 kubelet[2658]: E0912 22:29:38.832979 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.833118 kubelet[2658]: W0912 22:29:38.833002 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.833118 kubelet[2658]: E0912 22:29:38.833031 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.833347 kubelet[2658]: E0912 22:29:38.833278 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.833347 kubelet[2658]: W0912 22:29:38.833287 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.833347 kubelet[2658]: E0912 22:29:38.833297 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.842914 kubelet[2658]: E0912 22:29:38.842886 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.842914 kubelet[2658]: W0912 22:29:38.842907 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.843016 kubelet[2658]: E0912 22:29:38.842922 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.846327 kubelet[2658]: E0912 22:29:38.846212 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.846327 kubelet[2658]: W0912 22:29:38.846267 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.846327 kubelet[2658]: E0912 22:29:38.846317 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.864992 kubelet[2658]: E0912 22:29:38.864940 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-69gts" podUID="ca323b23-e1f4-4eab-9ac9-c732b46b6287"
Sep 12 22:29:38.885121 containerd[1508]: time="2025-09-12T22:29:38.885084811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-2p8jm,Uid:84e20a2f-f47b-4958-b070-022240ecd62d,Namespace:calico-system,Attempt:0,}"
Sep 12 22:29:38.913991 containerd[1508]: time="2025-09-12T22:29:38.913944559Z" level=info msg="connecting to shim a818f98e6a179be1c558e0d14b5057f7648ba68c2982ec3c738a3e67432897bd" address="unix:///run/containerd/s/3ca69c1b968f281b37752fa5d417493e81d3d89f08193ab558b1a9f292200a06" namespace=k8s.io protocol=ttrpc version=3
Sep 12 22:29:38.937851 systemd[1]: Started cri-containerd-a818f98e6a179be1c558e0d14b5057f7648ba68c2982ec3c738a3e67432897bd.scope - libcontainer container a818f98e6a179be1c558e0d14b5057f7648ba68c2982ec3c738a3e67432897bd.
Sep 12 22:29:38.942726 kubelet[2658]: E0912 22:29:38.942697 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.942726 kubelet[2658]: W0912 22:29:38.942722 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.942850 kubelet[2658]: E0912 22:29:38.942740 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.943340 kubelet[2658]: E0912 22:29:38.943315 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.943379 kubelet[2658]: W0912 22:29:38.943349 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.943379 kubelet[2658]: E0912 22:29:38.943362 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.943754 kubelet[2658]: E0912 22:29:38.943736 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.943754 kubelet[2658]: W0912 22:29:38.943753 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.943820 kubelet[2658]: E0912 22:29:38.943765 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.944850 kubelet[2658]: E0912 22:29:38.944815 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.944850 kubelet[2658]: W0912 22:29:38.944835 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.944850 kubelet[2658]: E0912 22:29:38.944849 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.945510 kubelet[2658]: E0912 22:29:38.945480 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.945510 kubelet[2658]: W0912 22:29:38.945508 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.945579 kubelet[2658]: E0912 22:29:38.945520 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.946962 kubelet[2658]: E0912 22:29:38.946925 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.946962 kubelet[2658]: W0912 22:29:38.946944 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.946962 kubelet[2658]: E0912 22:29:38.946957 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.947742 kubelet[2658]: E0912 22:29:38.947718 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.947742 kubelet[2658]: W0912 22:29:38.947739 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.947820 kubelet[2658]: E0912 22:29:38.947751 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.948913 kubelet[2658]: E0912 22:29:38.948894 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.948913 kubelet[2658]: W0912 22:29:38.948912 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.948981 kubelet[2658]: E0912 22:29:38.948924 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.949127 kubelet[2658]: E0912 22:29:38.949109 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.949127 kubelet[2658]: W0912 22:29:38.949123 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.949174 kubelet[2658]: E0912 22:29:38.949133 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.949299 kubelet[2658]: E0912 22:29:38.949284 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.949299 kubelet[2658]: W0912 22:29:38.949297 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.949350 kubelet[2658]: E0912 22:29:38.949307 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.949469 kubelet[2658]: E0912 22:29:38.949456 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.949507 kubelet[2658]: W0912 22:29:38.949476 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.949507 kubelet[2658]: E0912 22:29:38.949496 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.949949 kubelet[2658]: E0912 22:29:38.949772 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.949949 kubelet[2658]: W0912 22:29:38.949794 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.949949 kubelet[2658]: E0912 22:29:38.949809 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.950111 kubelet[2658]: E0912 22:29:38.950098 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.950165 kubelet[2658]: W0912 22:29:38.950155 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.950217 kubelet[2658]: E0912 22:29:38.950207 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.950417 kubelet[2658]: E0912 22:29:38.950404 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.950503 kubelet[2658]: W0912 22:29:38.950480 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.950644 kubelet[2658]: E0912 22:29:38.950551 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.950751 kubelet[2658]: E0912 22:29:38.950738 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.950800 kubelet[2658]: W0912 22:29:38.950790 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.950941 kubelet[2658]: E0912 22:29:38.950853 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.951088 kubelet[2658]: E0912 22:29:38.951075 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.951280 kubelet[2658]: W0912 22:29:38.951143 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.951280 kubelet[2658]: E0912 22:29:38.951162 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.951425 kubelet[2658]: E0912 22:29:38.951393 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.951499 kubelet[2658]: W0912 22:29:38.951477 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.951551 kubelet[2658]: E0912 22:29:38.951541 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.951842 kubelet[2658]: E0912 22:29:38.951830 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.951925 kubelet[2658]: W0912 22:29:38.951912 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.951975 kubelet[2658]: E0912 22:29:38.951965 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.952277 kubelet[2658]: E0912 22:29:38.952173 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.952277 kubelet[2658]: W0912 22:29:38.952204 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.952277 kubelet[2658]: E0912 22:29:38.952215 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.952524 kubelet[2658]: E0912 22:29:38.952496 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:38.952612 kubelet[2658]: W0912 22:29:38.952585 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:38.952666 kubelet[2658]: E0912 22:29:38.952655 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:38.974174 containerd[1508]: time="2025-09-12T22:29:38.974131326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-2p8jm,Uid:84e20a2f-f47b-4958-b070-022240ecd62d,Namespace:calico-system,Attempt:0,} returns sandbox id \"a818f98e6a179be1c558e0d14b5057f7648ba68c2982ec3c738a3e67432897bd\""
Sep 12 22:29:39.034015 kubelet[2658]: E0912 22:29:39.033985 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.034015 kubelet[2658]: W0912 22:29:39.034009 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.034233 kubelet[2658]: E0912 22:29:39.034028 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.034233 kubelet[2658]: I0912 22:29:39.034056 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca323b23-e1f4-4eab-9ac9-c732b46b6287-kubelet-dir\") pod \"csi-node-driver-69gts\" (UID: \"ca323b23-e1f4-4eab-9ac9-c732b46b6287\") " pod="calico-system/csi-node-driver-69gts"
Sep 12 22:29:39.034233 kubelet[2658]: E0912 22:29:39.034217 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.034233 kubelet[2658]: W0912 22:29:39.034227 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.034233 kubelet[2658]: E0912 22:29:39.034240 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.034233 kubelet[2658]: I0912 22:29:39.034255 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xmsn\" (UniqueName: \"kubernetes.io/projected/ca323b23-e1f4-4eab-9ac9-c732b46b6287-kube-api-access-4xmsn\") pod \"csi-node-driver-69gts\" (UID: \"ca323b23-e1f4-4eab-9ac9-c732b46b6287\") " pod="calico-system/csi-node-driver-69gts"
Sep 12 22:29:39.034781 kubelet[2658]: E0912 22:29:39.034663 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.034781 kubelet[2658]: W0912 22:29:39.034712 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.034781 kubelet[2658]: E0912 22:29:39.034734 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.035022 kubelet[2658]: E0912 22:29:39.035008 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.035133 kubelet[2658]: W0912 22:29:39.035056 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.035133 kubelet[2658]: E0912 22:29:39.035078 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.035246 kubelet[2658]: E0912 22:29:39.035229 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.035246 kubelet[2658]: W0912 22:29:39.035243 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.035301 kubelet[2658]: E0912 22:29:39.035262 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.035301 kubelet[2658]: I0912 22:29:39.035287 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ca323b23-e1f4-4eab-9ac9-c732b46b6287-registration-dir\") pod \"csi-node-driver-69gts\" (UID: \"ca323b23-e1f4-4eab-9ac9-c732b46b6287\") " pod="calico-system/csi-node-driver-69gts"
Sep 12 22:29:39.035481 kubelet[2658]: E0912 22:29:39.035469 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.035481 kubelet[2658]: W0912 22:29:39.035481 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.035565 kubelet[2658]: E0912 22:29:39.035543 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.035596 kubelet[2658]: I0912 22:29:39.035575 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ca323b23-e1f4-4eab-9ac9-c732b46b6287-socket-dir\") pod \"csi-node-driver-69gts\" (UID: \"ca323b23-e1f4-4eab-9ac9-c732b46b6287\") " pod="calico-system/csi-node-driver-69gts"
Sep 12 22:29:39.035657 kubelet[2658]: E0912 22:29:39.035645 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.035701 kubelet[2658]: W0912 22:29:39.035657 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.035753 kubelet[2658]: E0912 22:29:39.035735 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.035908 kubelet[2658]: E0912 22:29:39.035896 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.035990 kubelet[2658]: W0912 22:29:39.035908 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.035990 kubelet[2658]: E0912 22:29:39.035921 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.036075 kubelet[2658]: E0912 22:29:39.036062 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.036075 kubelet[2658]: W0912 22:29:39.036073 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.036139 kubelet[2658]: E0912 22:29:39.036084 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.036139 kubelet[2658]: I0912 22:29:39.036104 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/ca323b23-e1f4-4eab-9ac9-c732b46b6287-varrun\") pod \"csi-node-driver-69gts\" (UID: \"ca323b23-e1f4-4eab-9ac9-c732b46b6287\") " pod="calico-system/csi-node-driver-69gts"
Sep 12 22:29:39.036253 kubelet[2658]: E0912 22:29:39.036241 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.036253 kubelet[2658]: W0912 22:29:39.036252 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.036306 kubelet[2658]: E0912 22:29:39.036269 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.036407 kubelet[2658]: E0912 22:29:39.036396 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.036407 kubelet[2658]: W0912 22:29:39.036407 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.036456 kubelet[2658]: E0912 22:29:39.036414 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.036606 kubelet[2658]: E0912 22:29:39.036594 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.036606 kubelet[2658]: W0912 22:29:39.036605 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.036648 kubelet[2658]: E0912 22:29:39.036622 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.036776 kubelet[2658]: E0912 22:29:39.036764 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.036776 kubelet[2658]: W0912 22:29:39.036775 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.036828 kubelet[2658]: E0912 22:29:39.036782 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.036953 kubelet[2658]: E0912 22:29:39.036941 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.036953 kubelet[2658]: W0912 22:29:39.036951 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.036994 kubelet[2658]: E0912 22:29:39.036959 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.037118 kubelet[2658]: E0912 22:29:39.037106 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.037118 kubelet[2658]: W0912 22:29:39.037117 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.037165 kubelet[2658]: E0912 22:29:39.037125 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.137478 kubelet[2658]: E0912 22:29:39.137381 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.137478 kubelet[2658]: W0912 22:29:39.137406 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.137478 kubelet[2658]: E0912 22:29:39.137438 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.137905 kubelet[2658]: E0912 22:29:39.137883 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.137905 kubelet[2658]: W0912 22:29:39.137899 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.137979 kubelet[2658]: E0912 22:29:39.137915 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.138191 kubelet[2658]: E0912 22:29:39.138166 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.138314 kubelet[2658]: W0912 22:29:39.138295 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.138350 kubelet[2658]: E0912 22:29:39.138319 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.138650 kubelet[2658]: E0912 22:29:39.138619 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.138650 kubelet[2658]: W0912 22:29:39.138636 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.138767 kubelet[2658]: E0912 22:29:39.138658 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.139531 kubelet[2658]: E0912 22:29:39.138851 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.139531 kubelet[2658]: W0912 22:29:39.138864 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.139531 kubelet[2658]: E0912 22:29:39.138877 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.139531 kubelet[2658]: E0912 22:29:39.139132 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.139531 kubelet[2658]: W0912 22:29:39.139142 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.139531 kubelet[2658]: E0912 22:29:39.139159 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.139531 kubelet[2658]: E0912 22:29:39.139340 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.139531 kubelet[2658]: W0912 22:29:39.139348 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.139531 kubelet[2658]: E0912 22:29:39.139363 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.139772 kubelet[2658]: E0912 22:29:39.139568 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.139772 kubelet[2658]: W0912 22:29:39.139577 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.139772 kubelet[2658]: E0912 22:29:39.139622 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:29:39.139834 kubelet[2658]: E0912 22:29:39.139793 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:29:39.139834 kubelet[2658]: W0912 22:29:39.139801 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:29:39.139871 kubelet[2658]: E0912 22:29:39.139855 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:39.139952 kubelet[2658]: E0912 22:29:39.139923 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.139952 kubelet[2658]: W0912 22:29:39.139933 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.139952 kubelet[2658]: E0912 22:29:39.139950 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:39.140470 kubelet[2658]: E0912 22:29:39.140069 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.140470 kubelet[2658]: W0912 22:29:39.140076 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.140470 kubelet[2658]: E0912 22:29:39.140089 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:39.140470 kubelet[2658]: E0912 22:29:39.140204 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.140470 kubelet[2658]: W0912 22:29:39.140210 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.140470 kubelet[2658]: E0912 22:29:39.140226 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:39.140470 kubelet[2658]: E0912 22:29:39.140425 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.140470 kubelet[2658]: W0912 22:29:39.140464 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.141083 kubelet[2658]: E0912 22:29:39.140482 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:39.141083 kubelet[2658]: E0912 22:29:39.140706 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.141083 kubelet[2658]: W0912 22:29:39.140716 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.141083 kubelet[2658]: E0912 22:29:39.140733 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:39.141246 kubelet[2658]: E0912 22:29:39.141226 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.141246 kubelet[2658]: W0912 22:29:39.141245 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.141294 kubelet[2658]: E0912 22:29:39.141261 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:39.141460 kubelet[2658]: E0912 22:29:39.141427 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.141460 kubelet[2658]: W0912 22:29:39.141442 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.141532 kubelet[2658]: E0912 22:29:39.141478 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:39.141627 kubelet[2658]: E0912 22:29:39.141602 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.141627 kubelet[2658]: W0912 22:29:39.141616 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.141771 kubelet[2658]: E0912 22:29:39.141722 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:39.141892 kubelet[2658]: E0912 22:29:39.141848 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.141892 kubelet[2658]: W0912 22:29:39.141865 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.141892 kubelet[2658]: E0912 22:29:39.141877 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:39.142033 kubelet[2658]: E0912 22:29:39.142018 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.142033 kubelet[2658]: W0912 22:29:39.142030 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.142083 kubelet[2658]: E0912 22:29:39.142039 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:39.142167 kubelet[2658]: E0912 22:29:39.142156 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.142197 kubelet[2658]: W0912 22:29:39.142167 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.142288 kubelet[2658]: E0912 22:29:39.142245 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:39.142358 kubelet[2658]: E0912 22:29:39.142347 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.142358 kubelet[2658]: W0912 22:29:39.142356 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.142404 kubelet[2658]: E0912 22:29:39.142367 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:39.142571 kubelet[2658]: E0912 22:29:39.142558 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.142571 kubelet[2658]: W0912 22:29:39.142569 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.142801 kubelet[2658]: E0912 22:29:39.142767 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:39.142888 kubelet[2658]: E0912 22:29:39.142876 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.142919 kubelet[2658]: W0912 22:29:39.142887 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.142919 kubelet[2658]: E0912 22:29:39.142904 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:39.143096 kubelet[2658]: E0912 22:29:39.143082 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.143096 kubelet[2658]: W0912 22:29:39.143094 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.143176 kubelet[2658]: E0912 22:29:39.143112 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:39.143457 kubelet[2658]: E0912 22:29:39.143428 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.143457 kubelet[2658]: W0912 22:29:39.143442 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.143457 kubelet[2658]: E0912 22:29:39.143452 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:39.154612 kubelet[2658]: E0912 22:29:39.154580 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:39.154612 kubelet[2658]: W0912 22:29:39.154600 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:39.154612 kubelet[2658]: E0912 22:29:39.154613 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:39.717450 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount915176974.mount: Deactivated successfully. Sep 12 22:29:40.383570 containerd[1508]: time="2025-09-12T22:29:40.383514117Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:40.384414 containerd[1508]: time="2025-09-12T22:29:40.384032899Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 12 22:29:40.385268 containerd[1508]: time="2025-09-12T22:29:40.385220267Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:40.387284 containerd[1508]: time="2025-09-12T22:29:40.387214230Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:40.388979 containerd[1508]: time="2025-09-12T22:29:40.388948781Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id 
\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.631235622s" Sep 12 22:29:40.389087 containerd[1508]: time="2025-09-12T22:29:40.389072546Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 12 22:29:40.390767 containerd[1508]: time="2025-09-12T22:29:40.390744895Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 22:29:40.403922 containerd[1508]: time="2025-09-12T22:29:40.403886476Z" level=info msg="CreateContainer within sandbox \"cc168d8805bb3fc2369b4a336ad151d59e544bfe3114badde249b4a92e96bd95\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 22:29:40.409698 containerd[1508]: time="2025-09-12T22:29:40.409529149Z" level=info msg="Container 83d60245fde7c9aa915423e4c6e13f9965796f48a39a733daccfb8a7dbc0ff15: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:29:40.415636 containerd[1508]: time="2025-09-12T22:29:40.415596878Z" level=info msg="CreateContainer within sandbox \"cc168d8805bb3fc2369b4a336ad151d59e544bfe3114badde249b4a92e96bd95\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"83d60245fde7c9aa915423e4c6e13f9965796f48a39a733daccfb8a7dbc0ff15\"" Sep 12 22:29:40.416318 containerd[1508]: time="2025-09-12T22:29:40.416276506Z" level=info msg="StartContainer for \"83d60245fde7c9aa915423e4c6e13f9965796f48a39a733daccfb8a7dbc0ff15\"" Sep 12 22:29:40.417331 containerd[1508]: time="2025-09-12T22:29:40.417299469Z" level=info msg="connecting to shim 83d60245fde7c9aa915423e4c6e13f9965796f48a39a733daccfb8a7dbc0ff15" address="unix:///run/containerd/s/65b60912129086bebe39bb8bbe8c319c52a8c65b5a25e948d91d0d6dceedfbe0" protocol=ttrpc version=3 Sep 12 
22:29:40.441875 systemd[1]: Started cri-containerd-83d60245fde7c9aa915423e4c6e13f9965796f48a39a733daccfb8a7dbc0ff15.scope - libcontainer container 83d60245fde7c9aa915423e4c6e13f9965796f48a39a733daccfb8a7dbc0ff15. Sep 12 22:29:40.488388 containerd[1508]: time="2025-09-12T22:29:40.488346595Z" level=info msg="StartContainer for \"83d60245fde7c9aa915423e4c6e13f9965796f48a39a733daccfb8a7dbc0ff15\" returns successfully" Sep 12 22:29:40.510066 kubelet[2658]: E0912 22:29:40.510014 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-69gts" podUID="ca323b23-e1f4-4eab-9ac9-c732b46b6287" Sep 12 22:29:40.663455 kubelet[2658]: E0912 22:29:40.663355 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.663455 kubelet[2658]: W0912 22:29:40.663383 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.663455 kubelet[2658]: E0912 22:29:40.663404 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:40.663625 kubelet[2658]: E0912 22:29:40.663571 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.663625 kubelet[2658]: W0912 22:29:40.663579 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.663625 kubelet[2658]: E0912 22:29:40.663587 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:40.663753 kubelet[2658]: E0912 22:29:40.663726 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.663753 kubelet[2658]: W0912 22:29:40.663735 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.663753 kubelet[2658]: E0912 22:29:40.663744 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:40.663886 kubelet[2658]: E0912 22:29:40.663875 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.663915 kubelet[2658]: W0912 22:29:40.663887 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.663915 kubelet[2658]: E0912 22:29:40.663896 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:40.664078 kubelet[2658]: E0912 22:29:40.664019 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.664078 kubelet[2658]: W0912 22:29:40.664029 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.664078 kubelet[2658]: E0912 22:29:40.664037 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:40.664353 kubelet[2658]: E0912 22:29:40.664151 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.664353 kubelet[2658]: W0912 22:29:40.664159 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.664353 kubelet[2658]: E0912 22:29:40.664166 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:40.664353 kubelet[2658]: E0912 22:29:40.664292 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.664353 kubelet[2658]: W0912 22:29:40.664299 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.664353 kubelet[2658]: E0912 22:29:40.664307 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:40.664594 kubelet[2658]: E0912 22:29:40.664443 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.664594 kubelet[2658]: W0912 22:29:40.664450 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.664594 kubelet[2658]: E0912 22:29:40.664457 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:40.664594 kubelet[2658]: E0912 22:29:40.664594 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.664766 kubelet[2658]: W0912 22:29:40.664601 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.664766 kubelet[2658]: E0912 22:29:40.664609 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:40.664766 kubelet[2658]: E0912 22:29:40.664732 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.664766 kubelet[2658]: W0912 22:29:40.664739 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.664766 kubelet[2658]: E0912 22:29:40.664746 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:40.664982 kubelet[2658]: E0912 22:29:40.664864 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.664982 kubelet[2658]: W0912 22:29:40.664871 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.664982 kubelet[2658]: E0912 22:29:40.664878 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:40.664982 kubelet[2658]: E0912 22:29:40.664993 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.665207 kubelet[2658]: W0912 22:29:40.665000 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.665207 kubelet[2658]: E0912 22:29:40.665007 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:40.665207 kubelet[2658]: E0912 22:29:40.665146 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.665207 kubelet[2658]: W0912 22:29:40.665153 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.665207 kubelet[2658]: E0912 22:29:40.665162 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:40.665424 kubelet[2658]: E0912 22:29:40.665282 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.665424 kubelet[2658]: W0912 22:29:40.665289 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.665424 kubelet[2658]: E0912 22:29:40.665295 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:40.665424 kubelet[2658]: E0912 22:29:40.665404 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.665424 kubelet[2658]: W0912 22:29:40.665410 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.665424 kubelet[2658]: E0912 22:29:40.665417 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:40.751520 kubelet[2658]: E0912 22:29:40.751474 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.751520 kubelet[2658]: W0912 22:29:40.751504 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.751520 kubelet[2658]: E0912 22:29:40.751524 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:40.752105 kubelet[2658]: E0912 22:29:40.752085 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.752164 kubelet[2658]: W0912 22:29:40.752118 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.752164 kubelet[2658]: E0912 22:29:40.752138 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:40.752565 kubelet[2658]: E0912 22:29:40.752540 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.752565 kubelet[2658]: W0912 22:29:40.752564 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.752644 kubelet[2658]: E0912 22:29:40.752583 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:29:40.753202 kubelet[2658]: E0912 22:29:40.753183 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:29:40.753202 kubelet[2658]: W0912 22:29:40.753197 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:29:40.753202 kubelet[2658]: E0912 22:29:40.753214 2658 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:29:41.321874 containerd[1508]: time="2025-09-12T22:29:41.321826919Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:41.322505 containerd[1508]: time="2025-09-12T22:29:41.322471185Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 12 22:29:41.323410 containerd[1508]: time="2025-09-12T22:29:41.323172692Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:41.325206 containerd[1508]: time="2025-09-12T22:29:41.325177771Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:41.325835 containerd[1508]: time="2025-09-12T22:29:41.325797355Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 934.777809ms" Sep 12 22:29:41.325998 containerd[1508]: time="2025-09-12T22:29:41.325842877Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 12 22:29:41.327514 containerd[1508]: time="2025-09-12T22:29:41.327403498Z" level=info msg="CreateContainer within sandbox \"a818f98e6a179be1c558e0d14b5057f7648ba68c2982ec3c738a3e67432897bd\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 22:29:41.336843 containerd[1508]: time="2025-09-12T22:29:41.335837950Z" level=info msg="Container 1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:29:41.343374 containerd[1508]: time="2025-09-12T22:29:41.343335925Z" level=info msg="CreateContainer within sandbox \"a818f98e6a179be1c558e0d14b5057f7648ba68c2982ec3c738a3e67432897bd\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b\"" Sep 12 22:29:41.344145 containerd[1508]: time="2025-09-12T22:29:41.344115995Z" level=info msg="StartContainer for \"1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b\"" Sep 12 22:29:41.345846 containerd[1508]: time="2025-09-12T22:29:41.345821943Z" level=info msg="connecting to shim 1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b" address="unix:///run/containerd/s/3ca69c1b968f281b37752fa5d417493e81d3d89f08193ab558b1a9f292200a06" protocol=ttrpc version=3 Sep 12 22:29:41.371016 systemd[1]: Started cri-containerd-1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b.scope - libcontainer container 1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b. Sep 12 22:29:41.421738 systemd[1]: cri-containerd-1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b.scope: Deactivated successfully. 
Sep 12 22:29:41.425009 containerd[1508]: time="2025-09-12T22:29:41.424965454Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b\" id:\"1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b\" pid:3396 exited_at:{seconds:1757716181 nanos:424550477}" Sep 12 22:29:41.439765 containerd[1508]: time="2025-09-12T22:29:41.439731234Z" level=info msg="received exit event container_id:\"1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b\" id:\"1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b\" pid:3396 exited_at:{seconds:1757716181 nanos:424550477}" Sep 12 22:29:41.442507 containerd[1508]: time="2025-09-12T22:29:41.442473542Z" level=info msg="StartContainer for \"1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b\" returns successfully" Sep 12 22:29:41.463131 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1376c79f6672d1475060c07349e8b47523de0de4848488a2b0cbff953cf9876b-rootfs.mount: Deactivated successfully. 
Sep 12 22:29:41.595824 kubelet[2658]: I0912 22:29:41.595727 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:29:41.596757 containerd[1508]: time="2025-09-12T22:29:41.596724366Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 22:29:41.611457 kubelet[2658]: I0912 22:29:41.611399 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6cf4d44dcd-ldb57" podStartSLOduration=1.976882979 podStartE2EDuration="3.611379862s" podCreationTimestamp="2025-09-12 22:29:38 +0000 UTC" firstStartedPulling="2025-09-12 22:29:38.755838515 +0000 UTC m=+19.314097256" lastFinishedPulling="2025-09-12 22:29:40.390335398 +0000 UTC m=+20.948594139" observedRunningTime="2025-09-12 22:29:40.603122322 +0000 UTC m=+21.161381063" watchObservedRunningTime="2025-09-12 22:29:41.611379862 +0000 UTC m=+22.169638603" Sep 12 22:29:42.510634 kubelet[2658]: E0912 22:29:42.510582 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-69gts" podUID="ca323b23-e1f4-4eab-9ac9-c732b46b6287" Sep 12 22:29:44.510356 kubelet[2658]: E0912 22:29:44.510312 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-69gts" podUID="ca323b23-e1f4-4eab-9ac9-c732b46b6287" Sep 12 22:29:45.155413 containerd[1508]: time="2025-09-12T22:29:45.154793681Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:45.155413 containerd[1508]: time="2025-09-12T22:29:45.155384461Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 12 22:29:45.156281 containerd[1508]: time="2025-09-12T22:29:45.156258689Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:45.158215 containerd[1508]: time="2025-09-12T22:29:45.158175672Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:45.159105 containerd[1508]: time="2025-09-12T22:29:45.159077222Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.562311055s" Sep 12 22:29:45.159167 containerd[1508]: time="2025-09-12T22:29:45.159108463Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 12 22:29:45.161112 containerd[1508]: time="2025-09-12T22:29:45.161043047Z" level=info msg="CreateContainer within sandbox \"a818f98e6a179be1c558e0d14b5057f7648ba68c2982ec3c738a3e67432897bd\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 22:29:45.169701 containerd[1508]: time="2025-09-12T22:29:45.169643010Z" level=info msg="Container 6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:29:45.170933 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount33027953.mount: Deactivated successfully. 
Sep 12 22:29:45.178580 containerd[1508]: time="2025-09-12T22:29:45.178530103Z" level=info msg="CreateContainer within sandbox \"a818f98e6a179be1c558e0d14b5057f7648ba68c2982ec3c738a3e67432897bd\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099\"" Sep 12 22:29:45.179154 containerd[1508]: time="2025-09-12T22:29:45.179105641Z" level=info msg="StartContainer for \"6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099\"" Sep 12 22:29:45.181087 containerd[1508]: time="2025-09-12T22:29:45.181045345Z" level=info msg="connecting to shim 6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099" address="unix:///run/containerd/s/3ca69c1b968f281b37752fa5d417493e81d3d89f08193ab558b1a9f292200a06" protocol=ttrpc version=3 Sep 12 22:29:45.209842 systemd[1]: Started cri-containerd-6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099.scope - libcontainer container 6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099. Sep 12 22:29:45.239320 containerd[1508]: time="2025-09-12T22:29:45.239275982Z" level=info msg="StartContainer for \"6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099\" returns successfully" Sep 12 22:29:45.789291 systemd[1]: cri-containerd-6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099.scope: Deactivated successfully. Sep 12 22:29:45.789760 systemd[1]: cri-containerd-6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099.scope: Consumed 470ms CPU time, 178.7M memory peak, 2.6M read from disk, 165.8M written to disk. 
Sep 12 22:29:45.791757 containerd[1508]: time="2025-09-12T22:29:45.791715329Z" level=info msg="received exit event container_id:\"6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099\" id:\"6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099\" pid:3456 exited_at:{seconds:1757716185 nanos:791509122}" Sep 12 22:29:45.792900 containerd[1508]: time="2025-09-12T22:29:45.791802612Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099\" id:\"6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099\" pid:3456 exited_at:{seconds:1757716185 nanos:791509122}" Sep 12 22:29:45.809364 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6812fcb48e757da9108baf56dfa166626978c2c4f273af2278ce4a54adf91099-rootfs.mount: Deactivated successfully. Sep 12 22:29:45.827480 kubelet[2658]: I0912 22:29:45.827452 2658 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 12 22:29:45.937970 systemd[1]: Created slice kubepods-burstable-podf8fbdb4d_699f_4d60_8d98_dfc00c427c9f.slice - libcontainer container kubepods-burstable-podf8fbdb4d_699f_4d60_8d98_dfc00c427c9f.slice. Sep 12 22:29:45.946060 systemd[1]: Created slice kubepods-besteffort-podca9d44dd_f762_4c01_95d5_c6ef563e914c.slice - libcontainer container kubepods-besteffort-podca9d44dd_f762_4c01_95d5_c6ef563e914c.slice. Sep 12 22:29:45.953711 systemd[1]: Created slice kubepods-burstable-podf8f746ec_9046_44a0_81cf_c7041d7340b7.slice - libcontainer container kubepods-burstable-podf8f746ec_9046_44a0_81cf_c7041d7340b7.slice. Sep 12 22:29:45.958957 systemd[1]: Created slice kubepods-besteffort-pod279e3805_39cc_475e_8130_1c4b4a1b7121.slice - libcontainer container kubepods-besteffort-pod279e3805_39cc_475e_8130_1c4b4a1b7121.slice. 
Sep 12 22:29:45.963795 systemd[1]: Created slice kubepods-besteffort-pod9f1b6925_b20d_4f01_9ee0_8b249e16ce97.slice - libcontainer container kubepods-besteffort-pod9f1b6925_b20d_4f01_9ee0_8b249e16ce97.slice. Sep 12 22:29:45.977970 systemd[1]: Created slice kubepods-besteffort-pode026fd86_5133_41c6_a136_1b3785615a66.slice - libcontainer container kubepods-besteffort-pode026fd86_5133_41c6_a136_1b3785615a66.slice. Sep 12 22:29:45.983195 systemd[1]: Created slice kubepods-besteffort-pod86cb6c26_0c34_4ae2_b007_31748a17a456.slice - libcontainer container kubepods-besteffort-pod86cb6c26_0c34_4ae2_b007_31748a17a456.slice. Sep 12 22:29:46.092054 kubelet[2658]: I0912 22:29:46.091921 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959pb\" (UniqueName: \"kubernetes.io/projected/ca9d44dd-f762-4c01-95d5-c6ef563e914c-kube-api-access-959pb\") pod \"calico-kube-controllers-76fdc566d5-xfzhc\" (UID: \"ca9d44dd-f762-4c01-95d5-c6ef563e914c\") " pod="calico-system/calico-kube-controllers-76fdc566d5-xfzhc" Sep 12 22:29:46.092054 kubelet[2658]: I0912 22:29:46.091990 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9f1b6925-b20d-4f01-9ee0-8b249e16ce97-whisker-backend-key-pair\") pod \"whisker-858d97f957-kmmqb\" (UID: \"9f1b6925-b20d-4f01-9ee0-8b249e16ce97\") " pod="calico-system/whisker-858d97f957-kmmqb" Sep 12 22:29:46.092054 kubelet[2658]: I0912 22:29:46.092025 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca9d44dd-f762-4c01-95d5-c6ef563e914c-tigera-ca-bundle\") pod \"calico-kube-controllers-76fdc566d5-xfzhc\" (UID: \"ca9d44dd-f762-4c01-95d5-c6ef563e914c\") " pod="calico-system/calico-kube-controllers-76fdc566d5-xfzhc" Sep 12 22:29:46.092206 kubelet[2658]: I0912 22:29:46.092066 2658 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/86cb6c26-0c34-4ae2-b007-31748a17a456-calico-apiserver-certs\") pod \"calico-apiserver-57f645bfdb-m9hn4\" (UID: \"86cb6c26-0c34-4ae2-b007-31748a17a456\") " pod="calico-apiserver/calico-apiserver-57f645bfdb-m9hn4" Sep 12 22:29:46.092206 kubelet[2658]: I0912 22:29:46.092086 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e026fd86-5133-41c6-a136-1b3785615a66-calico-apiserver-certs\") pod \"calico-apiserver-57f645bfdb-zb94s\" (UID: \"e026fd86-5133-41c6-a136-1b3785615a66\") " pod="calico-apiserver/calico-apiserver-57f645bfdb-zb94s" Sep 12 22:29:46.092206 kubelet[2658]: I0912 22:29:46.092110 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f1b6925-b20d-4f01-9ee0-8b249e16ce97-whisker-ca-bundle\") pod \"whisker-858d97f957-kmmqb\" (UID: \"9f1b6925-b20d-4f01-9ee0-8b249e16ce97\") " pod="calico-system/whisker-858d97f957-kmmqb" Sep 12 22:29:46.092206 kubelet[2658]: I0912 22:29:46.092125 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/279e3805-39cc-475e-8130-1c4b4a1b7121-goldmane-ca-bundle\") pod \"goldmane-7988f88666-2d4dt\" (UID: \"279e3805-39cc-475e-8130-1c4b4a1b7121\") " pod="calico-system/goldmane-7988f88666-2d4dt" Sep 12 22:29:46.092206 kubelet[2658]: I0912 22:29:46.092144 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8fbdb4d-699f-4d60-8d98-dfc00c427c9f-config-volume\") pod \"coredns-7c65d6cfc9-vgc9k\" (UID: \"f8fbdb4d-699f-4d60-8d98-dfc00c427c9f\") " 
pod="kube-system/coredns-7c65d6cfc9-vgc9k" Sep 12 22:29:46.092316 kubelet[2658]: I0912 22:29:46.092180 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqcs8\" (UniqueName: \"kubernetes.io/projected/f8fbdb4d-699f-4d60-8d98-dfc00c427c9f-kube-api-access-wqcs8\") pod \"coredns-7c65d6cfc9-vgc9k\" (UID: \"f8fbdb4d-699f-4d60-8d98-dfc00c427c9f\") " pod="kube-system/coredns-7c65d6cfc9-vgc9k" Sep 12 22:29:46.092316 kubelet[2658]: I0912 22:29:46.092233 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7f5\" (UniqueName: \"kubernetes.io/projected/e026fd86-5133-41c6-a136-1b3785615a66-kube-api-access-bd7f5\") pod \"calico-apiserver-57f645bfdb-zb94s\" (UID: \"e026fd86-5133-41c6-a136-1b3785615a66\") " pod="calico-apiserver/calico-apiserver-57f645bfdb-zb94s" Sep 12 22:29:46.092316 kubelet[2658]: I0912 22:29:46.092273 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8f746ec-9046-44a0-81cf-c7041d7340b7-config-volume\") pod \"coredns-7c65d6cfc9-qbgs5\" (UID: \"f8f746ec-9046-44a0-81cf-c7041d7340b7\") " pod="kube-system/coredns-7c65d6cfc9-qbgs5" Sep 12 22:29:46.092316 kubelet[2658]: I0912 22:29:46.092301 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/279e3805-39cc-475e-8130-1c4b4a1b7121-goldmane-key-pair\") pod \"goldmane-7988f88666-2d4dt\" (UID: \"279e3805-39cc-475e-8130-1c4b4a1b7121\") " pod="calico-system/goldmane-7988f88666-2d4dt" Sep 12 22:29:46.092416 kubelet[2658]: I0912 22:29:46.092318 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4gpp\" (UniqueName: \"kubernetes.io/projected/86cb6c26-0c34-4ae2-b007-31748a17a456-kube-api-access-t4gpp\") pod 
\"calico-apiserver-57f645bfdb-m9hn4\" (UID: \"86cb6c26-0c34-4ae2-b007-31748a17a456\") " pod="calico-apiserver/calico-apiserver-57f645bfdb-m9hn4" Sep 12 22:29:46.092416 kubelet[2658]: I0912 22:29:46.092362 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgzj9\" (UniqueName: \"kubernetes.io/projected/279e3805-39cc-475e-8130-1c4b4a1b7121-kube-api-access-mgzj9\") pod \"goldmane-7988f88666-2d4dt\" (UID: \"279e3805-39cc-475e-8130-1c4b4a1b7121\") " pod="calico-system/goldmane-7988f88666-2d4dt" Sep 12 22:29:46.092416 kubelet[2658]: I0912 22:29:46.092379 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpdhf\" (UniqueName: \"kubernetes.io/projected/9f1b6925-b20d-4f01-9ee0-8b249e16ce97-kube-api-access-qpdhf\") pod \"whisker-858d97f957-kmmqb\" (UID: \"9f1b6925-b20d-4f01-9ee0-8b249e16ce97\") " pod="calico-system/whisker-858d97f957-kmmqb" Sep 12 22:29:46.092416 kubelet[2658]: I0912 22:29:46.092396 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7l4h\" (UniqueName: \"kubernetes.io/projected/f8f746ec-9046-44a0-81cf-c7041d7340b7-kube-api-access-p7l4h\") pod \"coredns-7c65d6cfc9-qbgs5\" (UID: \"f8f746ec-9046-44a0-81cf-c7041d7340b7\") " pod="kube-system/coredns-7c65d6cfc9-qbgs5" Sep 12 22:29:46.092416 kubelet[2658]: I0912 22:29:46.092412 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/279e3805-39cc-475e-8130-1c4b4a1b7121-config\") pod \"goldmane-7988f88666-2d4dt\" (UID: \"279e3805-39cc-475e-8130-1c4b4a1b7121\") " pod="calico-system/goldmane-7988f88666-2d4dt" Sep 12 22:29:46.245152 containerd[1508]: time="2025-09-12T22:29:46.245116565Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7c65d6cfc9-vgc9k,Uid:f8fbdb4d-699f-4d60-8d98-dfc00c427c9f,Namespace:kube-system,Attempt:0,}" Sep 12 22:29:46.248700 containerd[1508]: time="2025-09-12T22:29:46.248640836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76fdc566d5-xfzhc,Uid:ca9d44dd-f762-4c01-95d5-c6ef563e914c,Namespace:calico-system,Attempt:0,}" Sep 12 22:29:46.258493 containerd[1508]: time="2025-09-12T22:29:46.258452866Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-qbgs5,Uid:f8f746ec-9046-44a0-81cf-c7041d7340b7,Namespace:kube-system,Attempt:0,}" Sep 12 22:29:46.262149 containerd[1508]: time="2025-09-12T22:29:46.262104541Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-2d4dt,Uid:279e3805-39cc-475e-8130-1c4b4a1b7121,Namespace:calico-system,Attempt:0,}" Sep 12 22:29:46.268436 containerd[1508]: time="2025-09-12T22:29:46.268392259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-858d97f957-kmmqb,Uid:9f1b6925-b20d-4f01-9ee0-8b249e16ce97,Namespace:calico-system,Attempt:0,}" Sep 12 22:29:46.281539 containerd[1508]: time="2025-09-12T22:29:46.281185823Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57f645bfdb-zb94s,Uid:e026fd86-5133-41c6-a136-1b3785615a66,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:29:46.287581 containerd[1508]: time="2025-09-12T22:29:46.287549824Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57f645bfdb-m9hn4,Uid:86cb6c26-0c34-4ae2-b007-31748a17a456,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:29:46.351933 containerd[1508]: time="2025-09-12T22:29:46.351817852Z" level=error msg="Failed to destroy network for sandbox \"1e755a4139e2d6786bd6cbde29e217f95aa3b35fd2be1a2a356ee8b0fcffdd7f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 12 22:29:46.354775 containerd[1508]: time="2025-09-12T22:29:46.354718504Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-vgc9k,Uid:f8fbdb4d-699f-4d60-8d98-dfc00c427c9f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e755a4139e2d6786bd6cbde29e217f95aa3b35fd2be1a2a356ee8b0fcffdd7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.356970 kubelet[2658]: E0912 22:29:46.356914 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e755a4139e2d6786bd6cbde29e217f95aa3b35fd2be1a2a356ee8b0fcffdd7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.357138 kubelet[2658]: E0912 22:29:46.357105 2658 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e755a4139e2d6786bd6cbde29e217f95aa3b35fd2be1a2a356ee8b0fcffdd7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-vgc9k" Sep 12 22:29:46.357241 kubelet[2658]: E0912 22:29:46.357204 2658 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e755a4139e2d6786bd6cbde29e217f95aa3b35fd2be1a2a356ee8b0fcffdd7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-7c65d6cfc9-vgc9k" Sep 12 22:29:46.357333 kubelet[2658]: E0912 22:29:46.357309 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-vgc9k_kube-system(f8fbdb4d-699f-4d60-8d98-dfc00c427c9f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-vgc9k_kube-system(f8fbdb4d-699f-4d60-8d98-dfc00c427c9f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1e755a4139e2d6786bd6cbde29e217f95aa3b35fd2be1a2a356ee8b0fcffdd7f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-vgc9k" podUID="f8fbdb4d-699f-4d60-8d98-dfc00c427c9f" Sep 12 22:29:46.359183 containerd[1508]: time="2025-09-12T22:29:46.359148204Z" level=error msg="Failed to destroy network for sandbox \"2253adaf54112be121b496d150af1e5316ed8339a848a8cecec884e3aa8043c5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.359810 containerd[1508]: time="2025-09-12T22:29:46.359777504Z" level=error msg="Failed to destroy network for sandbox \"cec50af6e0447cccf26dba83e60cf3acbc22f9dbb8b61b377ce9d82c15ef9956\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.360201 containerd[1508]: time="2025-09-12T22:29:46.360169036Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76fdc566d5-xfzhc,Uid:ca9d44dd-f762-4c01-95d5-c6ef563e914c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"2253adaf54112be121b496d150af1e5316ed8339a848a8cecec884e3aa8043c5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.360453 kubelet[2658]: E0912 22:29:46.360416 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2253adaf54112be121b496d150af1e5316ed8339a848a8cecec884e3aa8043c5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.360567 kubelet[2658]: E0912 22:29:46.360551 2658 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2253adaf54112be121b496d150af1e5316ed8339a848a8cecec884e3aa8043c5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76fdc566d5-xfzhc" Sep 12 22:29:46.360693 kubelet[2658]: E0912 22:29:46.360625 2658 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2253adaf54112be121b496d150af1e5316ed8339a848a8cecec884e3aa8043c5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76fdc566d5-xfzhc" Sep 12 22:29:46.360809 kubelet[2658]: E0912 22:29:46.360785 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-76fdc566d5-xfzhc_calico-system(ca9d44dd-f762-4c01-95d5-c6ef563e914c)\" with CreatePodSandboxError: \"Failed to 
create sandbox for pod \\\"calico-kube-controllers-76fdc566d5-xfzhc_calico-system(ca9d44dd-f762-4c01-95d5-c6ef563e914c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2253adaf54112be121b496d150af1e5316ed8339a848a8cecec884e3aa8043c5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-76fdc566d5-xfzhc" podUID="ca9d44dd-f762-4c01-95d5-c6ef563e914c" Sep 12 22:29:46.361379 containerd[1508]: time="2025-09-12T22:29:46.361140507Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-qbgs5,Uid:f8f746ec-9046-44a0-81cf-c7041d7340b7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cec50af6e0447cccf26dba83e60cf3acbc22f9dbb8b61b377ce9d82c15ef9956\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.361655 kubelet[2658]: E0912 22:29:46.361522 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cec50af6e0447cccf26dba83e60cf3acbc22f9dbb8b61b377ce9d82c15ef9956\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.361655 kubelet[2658]: E0912 22:29:46.361555 2658 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cec50af6e0447cccf26dba83e60cf3acbc22f9dbb8b61b377ce9d82c15ef9956\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-qbgs5" Sep 12 22:29:46.361655 kubelet[2658]: E0912 22:29:46.361580 2658 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cec50af6e0447cccf26dba83e60cf3acbc22f9dbb8b61b377ce9d82c15ef9956\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-qbgs5" Sep 12 22:29:46.361802 kubelet[2658]: E0912 22:29:46.361608 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-qbgs5_kube-system(f8f746ec-9046-44a0-81cf-c7041d7340b7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-qbgs5_kube-system(f8f746ec-9046-44a0-81cf-c7041d7340b7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cec50af6e0447cccf26dba83e60cf3acbc22f9dbb8b61b377ce9d82c15ef9956\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-qbgs5" podUID="f8f746ec-9046-44a0-81cf-c7041d7340b7" Sep 12 22:29:46.363782 containerd[1508]: time="2025-09-12T22:29:46.363735429Z" level=error msg="Failed to destroy network for sandbox \"4734a13970b953c4bad57799bf8c7b7eb0f32b56cabefd479a16a2e847609ece\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.364726 containerd[1508]: time="2025-09-12T22:29:46.364696379Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-2d4dt,Uid:279e3805-39cc-475e-8130-1c4b4a1b7121,Namespace:calico-system,Attempt:0,} failed, 
error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4734a13970b953c4bad57799bf8c7b7eb0f32b56cabefd479a16a2e847609ece\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.365024 kubelet[2658]: E0912 22:29:46.364844 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4734a13970b953c4bad57799bf8c7b7eb0f32b56cabefd479a16a2e847609ece\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.365024 kubelet[2658]: E0912 22:29:46.364876 2658 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4734a13970b953c4bad57799bf8c7b7eb0f32b56cabefd479a16a2e847609ece\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-2d4dt" Sep 12 22:29:46.365024 kubelet[2658]: E0912 22:29:46.364895 2658 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4734a13970b953c4bad57799bf8c7b7eb0f32b56cabefd479a16a2e847609ece\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-2d4dt" Sep 12 22:29:46.365105 kubelet[2658]: E0912 22:29:46.364927 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-2d4dt_calico-system(279e3805-39cc-475e-8130-1c4b4a1b7121)\" 
with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-2d4dt_calico-system(279e3805-39cc-475e-8130-1c4b4a1b7121)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4734a13970b953c4bad57799bf8c7b7eb0f32b56cabefd479a16a2e847609ece\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-2d4dt" podUID="279e3805-39cc-475e-8130-1c4b4a1b7121" Sep 12 22:29:46.366359 containerd[1508]: time="2025-09-12T22:29:46.366327910Z" level=error msg="Failed to destroy network for sandbox \"d697e02417b3811dd7b5f1240294fd45b584898819a27d89e0c95b02c6659710\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.367180 containerd[1508]: time="2025-09-12T22:29:46.367152576Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-858d97f957-kmmqb,Uid:9f1b6925-b20d-4f01-9ee0-8b249e16ce97,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d697e02417b3811dd7b5f1240294fd45b584898819a27d89e0c95b02c6659710\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.367339 kubelet[2658]: E0912 22:29:46.367315 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d697e02417b3811dd7b5f1240294fd45b584898819a27d89e0c95b02c6659710\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 
22:29:46.367469 kubelet[2658]: E0912 22:29:46.367426 2658 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d697e02417b3811dd7b5f1240294fd45b584898819a27d89e0c95b02c6659710\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-858d97f957-kmmqb" Sep 12 22:29:46.367469 kubelet[2658]: E0912 22:29:46.367445 2658 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d697e02417b3811dd7b5f1240294fd45b584898819a27d89e0c95b02c6659710\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-858d97f957-kmmqb" Sep 12 22:29:46.367625 kubelet[2658]: E0912 22:29:46.367577 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-858d97f957-kmmqb_calico-system(9f1b6925-b20d-4f01-9ee0-8b249e16ce97)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-858d97f957-kmmqb_calico-system(9f1b6925-b20d-4f01-9ee0-8b249e16ce97)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d697e02417b3811dd7b5f1240294fd45b584898819a27d89e0c95b02c6659710\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-858d97f957-kmmqb" podUID="9f1b6925-b20d-4f01-9ee0-8b249e16ce97" Sep 12 22:29:46.370047 containerd[1508]: time="2025-09-12T22:29:46.370017667Z" level=error msg="Failed to destroy network for sandbox \"db436a008e1e51b4ee044aa2c8c69b361db1e090981d03cdba253dca70ae5463\"" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.371299 containerd[1508]: time="2025-09-12T22:29:46.371271066Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57f645bfdb-zb94s,Uid:e026fd86-5133-41c6-a136-1b3785615a66,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"db436a008e1e51b4ee044aa2c8c69b361db1e090981d03cdba253dca70ae5463\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.371538 kubelet[2658]: E0912 22:29:46.371472 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db436a008e1e51b4ee044aa2c8c69b361db1e090981d03cdba253dca70ae5463\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.371711 kubelet[2658]: E0912 22:29:46.371631 2658 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db436a008e1e51b4ee044aa2c8c69b361db1e090981d03cdba253dca70ae5463\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57f645bfdb-zb94s" Sep 12 22:29:46.371711 kubelet[2658]: E0912 22:29:46.371688 2658 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db436a008e1e51b4ee044aa2c8c69b361db1e090981d03cdba253dca70ae5463\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57f645bfdb-zb94s" Sep 12 22:29:46.371904 kubelet[2658]: E0912 22:29:46.371831 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57f645bfdb-zb94s_calico-apiserver(e026fd86-5133-41c6-a136-1b3785615a66)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57f645bfdb-zb94s_calico-apiserver(e026fd86-5133-41c6-a136-1b3785615a66)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"db436a008e1e51b4ee044aa2c8c69b361db1e090981d03cdba253dca70ae5463\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57f645bfdb-zb94s" podUID="e026fd86-5133-41c6-a136-1b3785615a66" Sep 12 22:29:46.384891 containerd[1508]: time="2025-09-12T22:29:46.384853615Z" level=error msg="Failed to destroy network for sandbox \"2072e637e864b88c180b92ce203c375d656c405e4ac0bd489f48fbe8bf7d7d52\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.385705 containerd[1508]: time="2025-09-12T22:29:46.385663961Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57f645bfdb-m9hn4,Uid:86cb6c26-0c34-4ae2-b007-31748a17a456,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2072e637e864b88c180b92ce203c375d656c405e4ac0bd489f48fbe8bf7d7d52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.385921 kubelet[2658]: E0912 22:29:46.385898 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2072e637e864b88c180b92ce203c375d656c405e4ac0bd489f48fbe8bf7d7d52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.386043 kubelet[2658]: E0912 22:29:46.386027 2658 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2072e637e864b88c180b92ce203c375d656c405e4ac0bd489f48fbe8bf7d7d52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57f645bfdb-m9hn4" Sep 12 22:29:46.386120 kubelet[2658]: E0912 22:29:46.386094 2658 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2072e637e864b88c180b92ce203c375d656c405e4ac0bd489f48fbe8bf7d7d52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57f645bfdb-m9hn4" Sep 12 22:29:46.386198 kubelet[2658]: E0912 22:29:46.386178 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57f645bfdb-m9hn4_calico-apiserver(86cb6c26-0c34-4ae2-b007-31748a17a456)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57f645bfdb-m9hn4_calico-apiserver(86cb6c26-0c34-4ae2-b007-31748a17a456)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"2072e637e864b88c180b92ce203c375d656c405e4ac0bd489f48fbe8bf7d7d52\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57f645bfdb-m9hn4" podUID="86cb6c26-0c34-4ae2-b007-31748a17a456" Sep 12 22:29:46.515084 systemd[1]: Created slice kubepods-besteffort-podca323b23_e1f4_4eab_9ac9_c732b46b6287.slice - libcontainer container kubepods-besteffort-podca323b23_e1f4_4eab_9ac9_c732b46b6287.slice. Sep 12 22:29:46.517256 containerd[1508]: time="2025-09-12T22:29:46.517220153Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-69gts,Uid:ca323b23-e1f4-4eab-9ac9-c732b46b6287,Namespace:calico-system,Attempt:0,}" Sep 12 22:29:46.559412 containerd[1508]: time="2025-09-12T22:29:46.559362123Z" level=error msg="Failed to destroy network for sandbox \"f0b2b1b8bf4565d1c6d64e4d88fb268c828618db9f1c38bf2c45dca6f22b3308\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.560342 containerd[1508]: time="2025-09-12T22:29:46.560298913Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-69gts,Uid:ca323b23-e1f4-4eab-9ac9-c732b46b6287,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0b2b1b8bf4565d1c6d64e4d88fb268c828618db9f1c38bf2c45dca6f22b3308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.560770 kubelet[2658]: E0912 22:29:46.560502 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"f0b2b1b8bf4565d1c6d64e4d88fb268c828618db9f1c38bf2c45dca6f22b3308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:29:46.560770 kubelet[2658]: E0912 22:29:46.560555 2658 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0b2b1b8bf4565d1c6d64e4d88fb268c828618db9f1c38bf2c45dca6f22b3308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-69gts" Sep 12 22:29:46.560770 kubelet[2658]: E0912 22:29:46.560572 2658 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0b2b1b8bf4565d1c6d64e4d88fb268c828618db9f1c38bf2c45dca6f22b3308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-69gts" Sep 12 22:29:46.560872 kubelet[2658]: E0912 22:29:46.560613 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-69gts_calico-system(ca323b23-e1f4-4eab-9ac9-c732b46b6287)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-69gts_calico-system(ca323b23-e1f4-4eab-9ac9-c732b46b6287)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f0b2b1b8bf4565d1c6d64e4d88fb268c828618db9f1c38bf2c45dca6f22b3308\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-69gts" 
podUID="ca323b23-e1f4-4eab-9ac9-c732b46b6287" Sep 12 22:29:46.613560 containerd[1508]: time="2025-09-12T22:29:46.613452391Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 22:29:46.621498 kubelet[2658]: I0912 22:29:46.621450 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:29:49.652277 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2922093312.mount: Deactivated successfully. Sep 12 22:29:49.895287 containerd[1508]: time="2025-09-12T22:29:49.895236001Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:49.908651 containerd[1508]: time="2025-09-12T22:29:49.895783817Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 12 22:29:49.908651 containerd[1508]: time="2025-09-12T22:29:49.896569759Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:49.908886 containerd[1508]: time="2025-09-12T22:29:49.898867383Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 3.285199466s" Sep 12 22:29:49.908886 containerd[1508]: time="2025-09-12T22:29:49.908825861Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 12 22:29:49.909273 containerd[1508]: time="2025-09-12T22:29:49.909250953Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:49.925419 containerd[1508]: time="2025-09-12T22:29:49.925392565Z" level=info msg="CreateContainer within sandbox \"a818f98e6a179be1c558e0d14b5057f7648ba68c2982ec3c738a3e67432897bd\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 22:29:49.937926 containerd[1508]: time="2025-09-12T22:29:49.937520944Z" level=info msg="Container 9f303bb19fa771f4e8b4bff1be2a0300467421f9cc178b582e74d24806ee90bd: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:29:49.948079 containerd[1508]: time="2025-09-12T22:29:49.948037119Z" level=info msg="CreateContainer within sandbox \"a818f98e6a179be1c558e0d14b5057f7648ba68c2982ec3c738a3e67432897bd\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"9f303bb19fa771f4e8b4bff1be2a0300467421f9cc178b582e74d24806ee90bd\"" Sep 12 22:29:49.948681 containerd[1508]: time="2025-09-12T22:29:49.948640776Z" level=info msg="StartContainer for \"9f303bb19fa771f4e8b4bff1be2a0300467421f9cc178b582e74d24806ee90bd\"" Sep 12 22:29:49.950281 containerd[1508]: time="2025-09-12T22:29:49.950249261Z" level=info msg="connecting to shim 9f303bb19fa771f4e8b4bff1be2a0300467421f9cc178b582e74d24806ee90bd" address="unix:///run/containerd/s/3ca69c1b968f281b37752fa5d417493e81d3d89f08193ab558b1a9f292200a06" protocol=ttrpc version=3 Sep 12 22:29:49.975815 systemd[1]: Started cri-containerd-9f303bb19fa771f4e8b4bff1be2a0300467421f9cc178b582e74d24806ee90bd.scope - libcontainer container 9f303bb19fa771f4e8b4bff1be2a0300467421f9cc178b582e74d24806ee90bd. Sep 12 22:29:50.013038 containerd[1508]: time="2025-09-12T22:29:50.012967524Z" level=info msg="StartContainer for \"9f303bb19fa771f4e8b4bff1be2a0300467421f9cc178b582e74d24806ee90bd\" returns successfully" Sep 12 22:29:50.135505 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. 
Sep 12 22:29:50.135603 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 12 22:29:50.418155 kubelet[2658]: I0912 22:29:50.418096 2658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpdhf\" (UniqueName: \"kubernetes.io/projected/9f1b6925-b20d-4f01-9ee0-8b249e16ce97-kube-api-access-qpdhf\") pod \"9f1b6925-b20d-4f01-9ee0-8b249e16ce97\" (UID: \"9f1b6925-b20d-4f01-9ee0-8b249e16ce97\") " Sep 12 22:29:50.418491 kubelet[2658]: I0912 22:29:50.418396 2658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f1b6925-b20d-4f01-9ee0-8b249e16ce97-whisker-ca-bundle\") pod \"9f1b6925-b20d-4f01-9ee0-8b249e16ce97\" (UID: \"9f1b6925-b20d-4f01-9ee0-8b249e16ce97\") " Sep 12 22:29:50.418491 kubelet[2658]: I0912 22:29:50.418421 2658 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9f1b6925-b20d-4f01-9ee0-8b249e16ce97-whisker-backend-key-pair\") pod \"9f1b6925-b20d-4f01-9ee0-8b249e16ce97\" (UID: \"9f1b6925-b20d-4f01-9ee0-8b249e16ce97\") " Sep 12 22:29:50.422551 kubelet[2658]: I0912 22:29:50.422497 2658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1b6925-b20d-4f01-9ee0-8b249e16ce97-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "9f1b6925-b20d-4f01-9ee0-8b249e16ce97" (UID: "9f1b6925-b20d-4f01-9ee0-8b249e16ce97"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 12 22:29:50.422809 kubelet[2658]: I0912 22:29:50.422778 2658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1b6925-b20d-4f01-9ee0-8b249e16ce97-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "9f1b6925-b20d-4f01-9ee0-8b249e16ce97" (UID: "9f1b6925-b20d-4f01-9ee0-8b249e16ce97"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 12 22:29:50.422911 kubelet[2658]: I0912 22:29:50.422893 2658 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f1b6925-b20d-4f01-9ee0-8b249e16ce97-kube-api-access-qpdhf" (OuterVolumeSpecName: "kube-api-access-qpdhf") pod "9f1b6925-b20d-4f01-9ee0-8b249e16ce97" (UID: "9f1b6925-b20d-4f01-9ee0-8b249e16ce97"). InnerVolumeSpecName "kube-api-access-qpdhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 12 22:29:50.518960 kubelet[2658]: I0912 22:29:50.518900 2658 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpdhf\" (UniqueName: \"kubernetes.io/projected/9f1b6925-b20d-4f01-9ee0-8b249e16ce97-kube-api-access-qpdhf\") on node \"localhost\" DevicePath \"\"" Sep 12 22:29:50.518960 kubelet[2658]: I0912 22:29:50.518938 2658 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f1b6925-b20d-4f01-9ee0-8b249e16ce97-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 12 22:29:50.518960 kubelet[2658]: I0912 22:29:50.518948 2658 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9f1b6925-b20d-4f01-9ee0-8b249e16ce97-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 12 22:29:50.653032 systemd[1]: 
var-lib-kubelet-pods-9f1b6925\x2db20d\x2d4f01\x2d9ee0\x2d8b249e16ce97-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqpdhf.mount: Deactivated successfully. Sep 12 22:29:50.653512 systemd[1]: var-lib-kubelet-pods-9f1b6925\x2db20d\x2d4f01\x2d9ee0\x2d8b249e16ce97-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 22:29:50.666491 systemd[1]: Removed slice kubepods-besteffort-pod9f1b6925_b20d_4f01_9ee0_8b249e16ce97.slice - libcontainer container kubepods-besteffort-pod9f1b6925_b20d_4f01_9ee0_8b249e16ce97.slice. Sep 12 22:29:50.680976 kubelet[2658]: I0912 22:29:50.680840 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-2p8jm" podStartSLOduration=1.746141556 podStartE2EDuration="12.680825513s" podCreationTimestamp="2025-09-12 22:29:38 +0000 UTC" firstStartedPulling="2025-09-12 22:29:38.975176053 +0000 UTC m=+19.533434794" lastFinishedPulling="2025-09-12 22:29:49.90986005 +0000 UTC m=+30.468118751" observedRunningTime="2025-09-12 22:29:50.680477864 +0000 UTC m=+31.238736605" watchObservedRunningTime="2025-09-12 22:29:50.680825513 +0000 UTC m=+31.239084254" Sep 12 22:29:50.723378 systemd[1]: Created slice kubepods-besteffort-pod251437d8_6fed_412f_acb3_25fcd172e498.slice - libcontainer container kubepods-besteffort-pod251437d8_6fed_412f_acb3_25fcd172e498.slice. 
Sep 12 22:29:50.822198 kubelet[2658]: I0912 22:29:50.822149 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kmq2\" (UniqueName: \"kubernetes.io/projected/251437d8-6fed-412f-acb3-25fcd172e498-kube-api-access-4kmq2\") pod \"whisker-5888bd5b8d-c5hj9\" (UID: \"251437d8-6fed-412f-acb3-25fcd172e498\") " pod="calico-system/whisker-5888bd5b8d-c5hj9" Sep 12 22:29:50.822332 kubelet[2658]: I0912 22:29:50.822224 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/251437d8-6fed-412f-acb3-25fcd172e498-whisker-backend-key-pair\") pod \"whisker-5888bd5b8d-c5hj9\" (UID: \"251437d8-6fed-412f-acb3-25fcd172e498\") " pod="calico-system/whisker-5888bd5b8d-c5hj9" Sep 12 22:29:50.822332 kubelet[2658]: I0912 22:29:50.822241 2658 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/251437d8-6fed-412f-acb3-25fcd172e498-whisker-ca-bundle\") pod \"whisker-5888bd5b8d-c5hj9\" (UID: \"251437d8-6fed-412f-acb3-25fcd172e498\") " pod="calico-system/whisker-5888bd5b8d-c5hj9" Sep 12 22:29:51.026355 containerd[1508]: time="2025-09-12T22:29:51.026319634Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5888bd5b8d-c5hj9,Uid:251437d8-6fed-412f-acb3-25fcd172e498,Namespace:calico-system,Attempt:0,}" Sep 12 22:29:51.167885 systemd-networkd[1437]: cali10d68664a99: Link UP Sep 12 22:29:51.168910 systemd-networkd[1437]: cali10d68664a99: Gained carrier Sep 12 22:29:51.182727 containerd[1508]: 2025-09-12 22:29:51.047 [INFO][3828] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:29:51.182727 containerd[1508]: 2025-09-12 22:29:51.074 [INFO][3828] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0 whisker-5888bd5b8d- calico-system 251437d8-6fed-412f-acb3-25fcd172e498 905 0 2025-09-12 22:29:50 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5888bd5b8d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-5888bd5b8d-c5hj9 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali10d68664a99 [] [] }} ContainerID="5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" Namespace="calico-system" Pod="whisker-5888bd5b8d-c5hj9" WorkloadEndpoint="localhost-k8s-whisker--5888bd5b8d--c5hj9-" Sep 12 22:29:51.182727 containerd[1508]: 2025-09-12 22:29:51.074 [INFO][3828] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" Namespace="calico-system" Pod="whisker-5888bd5b8d-c5hj9" WorkloadEndpoint="localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0" Sep 12 22:29:51.182727 containerd[1508]: 2025-09-12 22:29:51.126 [INFO][3842] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" HandleID="k8s-pod-network.5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" Workload="localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0" Sep 12 22:29:51.182937 containerd[1508]: 2025-09-12 22:29:51.126 [INFO][3842] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" HandleID="k8s-pod-network.5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" Workload="localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000503980), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-5888bd5b8d-c5hj9", "timestamp":"2025-09-12 22:29:51.126299988 +0000 
UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:29:51.182937 containerd[1508]: 2025-09-12 22:29:51.126 [INFO][3842] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:29:51.182937 containerd[1508]: 2025-09-12 22:29:51.126 [INFO][3842] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:29:51.182937 containerd[1508]: 2025-09-12 22:29:51.126 [INFO][3842] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:29:51.182937 containerd[1508]: 2025-09-12 22:29:51.135 [INFO][3842] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" host="localhost" Sep 12 22:29:51.182937 containerd[1508]: 2025-09-12 22:29:51.141 [INFO][3842] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:29:51.182937 containerd[1508]: 2025-09-12 22:29:51.145 [INFO][3842] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:29:51.182937 containerd[1508]: 2025-09-12 22:29:51.146 [INFO][3842] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:29:51.182937 containerd[1508]: 2025-09-12 22:29:51.148 [INFO][3842] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:29:51.182937 containerd[1508]: 2025-09-12 22:29:51.148 [INFO][3842] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" host="localhost" Sep 12 22:29:51.183134 containerd[1508]: 2025-09-12 22:29:51.149 [INFO][3842] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6 Sep 12 22:29:51.183134 containerd[1508]: 2025-09-12 22:29:51.153 [INFO][3842] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" host="localhost" Sep 12 22:29:51.183134 containerd[1508]: 2025-09-12 22:29:51.157 [INFO][3842] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" host="localhost" Sep 12 22:29:51.183134 containerd[1508]: 2025-09-12 22:29:51.158 [INFO][3842] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" host="localhost" Sep 12 22:29:51.183134 containerd[1508]: 2025-09-12 22:29:51.158 [INFO][3842] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:29:51.183134 containerd[1508]: 2025-09-12 22:29:51.158 [INFO][3842] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" HandleID="k8s-pod-network.5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" Workload="localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0" Sep 12 22:29:51.183277 containerd[1508]: 2025-09-12 22:29:51.160 [INFO][3828] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" Namespace="calico-system" Pod="whisker-5888bd5b8d-c5hj9" WorkloadEndpoint="localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0", GenerateName:"whisker-5888bd5b8d-", Namespace:"calico-system", SelfLink:"", UID:"251437d8-6fed-412f-acb3-25fcd172e498", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5888bd5b8d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-5888bd5b8d-c5hj9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali10d68664a99", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:29:51.183277 containerd[1508]: 2025-09-12 22:29:51.160 [INFO][3828] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" Namespace="calico-system" Pod="whisker-5888bd5b8d-c5hj9" WorkloadEndpoint="localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0" Sep 12 22:29:51.183342 containerd[1508]: 2025-09-12 22:29:51.160 [INFO][3828] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali10d68664a99 ContainerID="5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" Namespace="calico-system" Pod="whisker-5888bd5b8d-c5hj9" WorkloadEndpoint="localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0" Sep 12 22:29:51.183342 containerd[1508]: 2025-09-12 22:29:51.169 [INFO][3828] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" Namespace="calico-system" Pod="whisker-5888bd5b8d-c5hj9" WorkloadEndpoint="localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0" Sep 12 22:29:51.183381 containerd[1508]: 2025-09-12 22:29:51.169 [INFO][3828] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" Namespace="calico-system" Pod="whisker-5888bd5b8d-c5hj9" WorkloadEndpoint="localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0", GenerateName:"whisker-5888bd5b8d-", Namespace:"calico-system", SelfLink:"", UID:"251437d8-6fed-412f-acb3-25fcd172e498", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 50, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5888bd5b8d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6", Pod:"whisker-5888bd5b8d-c5hj9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali10d68664a99", MAC:"56:90:8a:12:0a:af", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:29:51.183426 containerd[1508]: 2025-09-12 22:29:51.180 [INFO][3828] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" Namespace="calico-system" Pod="whisker-5888bd5b8d-c5hj9" WorkloadEndpoint="localhost-k8s-whisker--5888bd5b8d--c5hj9-eth0" Sep 12 22:29:51.219641 containerd[1508]: time="2025-09-12T22:29:51.219137118Z" level=info msg="connecting to shim 5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6" address="unix:///run/containerd/s/c7cf3d0778460a27736ea62f4c8ac9549f457e08d195c5a0eef0bfe0a0fb317b" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:29:51.242806 systemd[1]: Started cri-containerd-5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6.scope - libcontainer container 5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6. 
Sep 12 22:29:51.255407 systemd-resolved[1356]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:29:51.273530 containerd[1508]: time="2025-09-12T22:29:51.273498489Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5888bd5b8d-c5hj9,Uid:251437d8-6fed-412f-acb3-25fcd172e498,Namespace:calico-system,Attempt:0,} returns sandbox id \"5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6\"" Sep 12 22:29:51.275282 containerd[1508]: time="2025-09-12T22:29:51.275099090Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 22:29:51.514690 kubelet[2658]: I0912 22:29:51.514639 2658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f1b6925-b20d-4f01-9ee0-8b249e16ce97" path="/var/lib/kubelet/pods/9f1b6925-b20d-4f01-9ee0-8b249e16ce97/volumes" Sep 12 22:29:51.664297 kubelet[2658]: I0912 22:29:51.664265 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:29:51.798131 systemd-networkd[1437]: vxlan.calico: Link UP Sep 12 22:29:51.798137 systemd-networkd[1437]: vxlan.calico: Gained carrier Sep 12 22:29:52.175574 containerd[1508]: time="2025-09-12T22:29:52.175465217Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:52.176154 containerd[1508]: time="2025-09-12T22:29:52.176117633Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 12 22:29:52.177261 containerd[1508]: time="2025-09-12T22:29:52.176935333Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:52.178978 containerd[1508]: time="2025-09-12T22:29:52.178942544Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:52.180185 containerd[1508]: time="2025-09-12T22:29:52.180139254Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 905.007362ms" Sep 12 22:29:52.180185 containerd[1508]: time="2025-09-12T22:29:52.180179615Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 12 22:29:52.182245 containerd[1508]: time="2025-09-12T22:29:52.182218426Z" level=info msg="CreateContainer within sandbox \"5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 22:29:52.189202 containerd[1508]: time="2025-09-12T22:29:52.188359139Z" level=info msg="Container 703a4f1949b2ea16c41c745f5cb30773b129c6e815c932558063c6d5601796b6: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:29:52.191894 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4175683024.mount: Deactivated successfully. 
Sep 12 22:29:52.199694 containerd[1508]: time="2025-09-12T22:29:52.199634742Z" level=info msg="CreateContainer within sandbox \"5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"703a4f1949b2ea16c41c745f5cb30773b129c6e815c932558063c6d5601796b6\"" Sep 12 22:29:52.205374 containerd[1508]: time="2025-09-12T22:29:52.205342124Z" level=info msg="StartContainer for \"703a4f1949b2ea16c41c745f5cb30773b129c6e815c932558063c6d5601796b6\"" Sep 12 22:29:52.206349 containerd[1508]: time="2025-09-12T22:29:52.206325109Z" level=info msg="connecting to shim 703a4f1949b2ea16c41c745f5cb30773b129c6e815c932558063c6d5601796b6" address="unix:///run/containerd/s/c7cf3d0778460a27736ea62f4c8ac9549f457e08d195c5a0eef0bfe0a0fb317b" protocol=ttrpc version=3 Sep 12 22:29:52.231816 systemd[1]: Started cri-containerd-703a4f1949b2ea16c41c745f5cb30773b129c6e815c932558063c6d5601796b6.scope - libcontainer container 703a4f1949b2ea16c41c745f5cb30773b129c6e815c932558063c6d5601796b6. Sep 12 22:29:52.270015 containerd[1508]: time="2025-09-12T22:29:52.269821339Z" level=info msg="StartContainer for \"703a4f1949b2ea16c41c745f5cb30773b129c6e815c932558063c6d5601796b6\" returns successfully" Sep 12 22:29:52.271576 containerd[1508]: time="2025-09-12T22:29:52.271549902Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 22:29:52.775807 systemd-networkd[1437]: cali10d68664a99: Gained IPv6LL Sep 12 22:29:53.031815 systemd-networkd[1437]: vxlan.calico: Gained IPv6LL Sep 12 22:29:53.455051 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3937423665.mount: Deactivated successfully. 
Sep 12 22:29:53.526783 containerd[1508]: time="2025-09-12T22:29:53.526739990Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:53.527334 containerd[1508]: time="2025-09-12T22:29:53.527305083Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 12 22:29:53.528180 containerd[1508]: time="2025-09-12T22:29:53.528159064Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:53.530154 containerd[1508]: time="2025-09-12T22:29:53.530124231Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:29:53.530774 containerd[1508]: time="2025-09-12T22:29:53.530750487Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.259002579s" Sep 12 22:29:53.530828 containerd[1508]: time="2025-09-12T22:29:53.530779167Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 12 22:29:53.533043 containerd[1508]: time="2025-09-12T22:29:53.533012021Z" level=info msg="CreateContainer within sandbox \"5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 22:29:53.539339 
containerd[1508]: time="2025-09-12T22:29:53.538817162Z" level=info msg="Container 7b1ecf5474bd963bbfc62752c6f0c1b61b3754d9857e9736e6829721c5af162b: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:29:53.550029 containerd[1508]: time="2025-09-12T22:29:53.549996032Z" level=info msg="CreateContainer within sandbox \"5159a0f4bb1196a373eca3f7bdfe06089c91667cc9625b2a7568775ed85a51e6\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"7b1ecf5474bd963bbfc62752c6f0c1b61b3754d9857e9736e6829721c5af162b\"" Sep 12 22:29:53.550637 containerd[1508]: time="2025-09-12T22:29:53.550604126Z" level=info msg="StartContainer for \"7b1ecf5474bd963bbfc62752c6f0c1b61b3754d9857e9736e6829721c5af162b\"" Sep 12 22:29:53.551954 containerd[1508]: time="2025-09-12T22:29:53.551929038Z" level=info msg="connecting to shim 7b1ecf5474bd963bbfc62752c6f0c1b61b3754d9857e9736e6829721c5af162b" address="unix:///run/containerd/s/c7cf3d0778460a27736ea62f4c8ac9549f457e08d195c5a0eef0bfe0a0fb317b" protocol=ttrpc version=3 Sep 12 22:29:53.576824 systemd[1]: Started cri-containerd-7b1ecf5474bd963bbfc62752c6f0c1b61b3754d9857e9736e6829721c5af162b.scope - libcontainer container 7b1ecf5474bd963bbfc62752c6f0c1b61b3754d9857e9736e6829721c5af162b. 
Sep 12 22:29:53.613934 containerd[1508]: time="2025-09-12T22:29:53.613871776Z" level=info msg="StartContainer for \"7b1ecf5474bd963bbfc62752c6f0c1b61b3754d9857e9736e6829721c5af162b\" returns successfully" Sep 12 22:29:53.683409 kubelet[2658]: I0912 22:29:53.683327 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5888bd5b8d-c5hj9" podStartSLOduration=1.426223462 podStartE2EDuration="3.683311054s" podCreationTimestamp="2025-09-12 22:29:50 +0000 UTC" firstStartedPulling="2025-09-12 22:29:51.274625158 +0000 UTC m=+31.832883899" lastFinishedPulling="2025-09-12 22:29:53.53171275 +0000 UTC m=+34.089971491" observedRunningTime="2025-09-12 22:29:53.682537475 +0000 UTC m=+34.240796216" watchObservedRunningTime="2025-09-12 22:29:53.683311054 +0000 UTC m=+34.241569795" Sep 12 22:29:56.096378 kubelet[2658]: I0912 22:29:56.096317 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:29:56.220187 containerd[1508]: time="2025-09-12T22:29:56.220144452Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9f303bb19fa771f4e8b4bff1be2a0300467421f9cc178b582e74d24806ee90bd\" id:\"56d7798037331ff5c80c6e26f5cacf35fc90c3f141d282d2d22840cc00a6a7ec\" pid:4200 exited_at:{seconds:1757716196 nanos:219770483}" Sep 12 22:29:56.295361 containerd[1508]: time="2025-09-12T22:29:56.295318497Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9f303bb19fa771f4e8b4bff1be2a0300467421f9cc178b582e74d24806ee90bd\" id:\"b236a97feba63905cdc4997bc3fa268f548dddfc94f16c6706551b8ce694c83d\" pid:4228 exited_at:{seconds:1757716196 nanos:294999770}" Sep 12 22:29:57.510441 containerd[1508]: time="2025-09-12T22:29:57.510363434Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-2d4dt,Uid:279e3805-39cc-475e-8130-1c4b4a1b7121,Namespace:calico-system,Attempt:0,}" Sep 12 22:29:57.621758 systemd-networkd[1437]: cali5b5c96cdf86: Link UP Sep 12 22:29:57.622003 
systemd-networkd[1437]: cali5b5c96cdf86: Gained carrier Sep 12 22:29:57.638936 containerd[1508]: 2025-09-12 22:29:57.548 [INFO][4250] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--7988f88666--2d4dt-eth0 goldmane-7988f88666- calico-system 279e3805-39cc-475e-8130-1c4b4a1b7121 828 0 2025-09-12 22:29:38 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-7988f88666-2d4dt eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali5b5c96cdf86 [] [] }} ContainerID="580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" Namespace="calico-system" Pod="goldmane-7988f88666-2d4dt" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--2d4dt-" Sep 12 22:29:57.638936 containerd[1508]: 2025-09-12 22:29:57.548 [INFO][4250] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" Namespace="calico-system" Pod="goldmane-7988f88666-2d4dt" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--2d4dt-eth0" Sep 12 22:29:57.638936 containerd[1508]: 2025-09-12 22:29:57.576 [INFO][4264] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" HandleID="k8s-pod-network.580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" Workload="localhost-k8s-goldmane--7988f88666--2d4dt-eth0" Sep 12 22:29:57.639142 containerd[1508]: 2025-09-12 22:29:57.577 [INFO][4264] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" HandleID="k8s-pod-network.580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" 
Workload="localhost-k8s-goldmane--7988f88666--2d4dt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001a1620), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-7988f88666-2d4dt", "timestamp":"2025-09-12 22:29:57.576896206 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:29:57.639142 containerd[1508]: 2025-09-12 22:29:57.577 [INFO][4264] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:29:57.639142 containerd[1508]: 2025-09-12 22:29:57.577 [INFO][4264] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:29:57.639142 containerd[1508]: 2025-09-12 22:29:57.577 [INFO][4264] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:29:57.639142 containerd[1508]: 2025-09-12 22:29:57.586 [INFO][4264] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" host="localhost" Sep 12 22:29:57.639142 containerd[1508]: 2025-09-12 22:29:57.590 [INFO][4264] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:29:57.639142 containerd[1508]: 2025-09-12 22:29:57.594 [INFO][4264] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:29:57.639142 containerd[1508]: 2025-09-12 22:29:57.595 [INFO][4264] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:29:57.639142 containerd[1508]: 2025-09-12 22:29:57.597 [INFO][4264] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:29:57.639142 containerd[1508]: 2025-09-12 22:29:57.597 [INFO][4264] ipam/ipam.go 1220: Attempting to assign 1 addresses from block 
block=192.168.88.128/26 handle="k8s-pod-network.580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" host="localhost" Sep 12 22:29:57.639365 containerd[1508]: 2025-09-12 22:29:57.598 [INFO][4264] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d Sep 12 22:29:57.639365 containerd[1508]: 2025-09-12 22:29:57.611 [INFO][4264] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" host="localhost" Sep 12 22:29:57.639365 containerd[1508]: 2025-09-12 22:29:57.616 [INFO][4264] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" host="localhost" Sep 12 22:29:57.639365 containerd[1508]: 2025-09-12 22:29:57.616 [INFO][4264] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" host="localhost" Sep 12 22:29:57.639365 containerd[1508]: 2025-09-12 22:29:57.616 [INFO][4264] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:29:57.639365 containerd[1508]: 2025-09-12 22:29:57.616 [INFO][4264] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" HandleID="k8s-pod-network.580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" Workload="localhost-k8s-goldmane--7988f88666--2d4dt-eth0" Sep 12 22:29:57.639483 containerd[1508]: 2025-09-12 22:29:57.618 [INFO][4250] cni-plugin/k8s.go 418: Populated endpoint ContainerID="580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" Namespace="calico-system" Pod="goldmane-7988f88666-2d4dt" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--2d4dt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--2d4dt-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"279e3805-39cc-475e-8130-1c4b4a1b7121", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-7988f88666-2d4dt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5b5c96cdf86", 
MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:29:57.639483 containerd[1508]: 2025-09-12 22:29:57.618 [INFO][4250] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" Namespace="calico-system" Pod="goldmane-7988f88666-2d4dt" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--2d4dt-eth0" Sep 12 22:29:57.639554 containerd[1508]: 2025-09-12 22:29:57.618 [INFO][4250] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5b5c96cdf86 ContainerID="580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" Namespace="calico-system" Pod="goldmane-7988f88666-2d4dt" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--2d4dt-eth0" Sep 12 22:29:57.639554 containerd[1508]: 2025-09-12 22:29:57.623 [INFO][4250] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" Namespace="calico-system" Pod="goldmane-7988f88666-2d4dt" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--2d4dt-eth0" Sep 12 22:29:57.639599 containerd[1508]: 2025-09-12 22:29:57.624 [INFO][4250] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" Namespace="calico-system" Pod="goldmane-7988f88666-2d4dt" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--2d4dt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--2d4dt-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"279e3805-39cc-475e-8130-1c4b4a1b7121", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 38, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d", Pod:"goldmane-7988f88666-2d4dt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5b5c96cdf86", MAC:"e2:25:59:13:13:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:29:57.639768 containerd[1508]: 2025-09-12 22:29:57.634 [INFO][4250] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" Namespace="calico-system" Pod="goldmane-7988f88666-2d4dt" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--2d4dt-eth0" Sep 12 22:29:57.658835 containerd[1508]: time="2025-09-12T22:29:57.658795745Z" level=info msg="connecting to shim 580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d" address="unix:///run/containerd/s/91c778902ad250bcfb43f6cddaefaf0f567e642d5060b341151bd1d92547070a" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:29:57.683831 systemd[1]: Started cri-containerd-580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d.scope - libcontainer container 580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d. 
Sep 12 22:29:57.694145 systemd-resolved[1356]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:29:57.725639 containerd[1508]: time="2025-09-12T22:29:57.725589522Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-2d4dt,Uid:279e3805-39cc-475e-8130-1c4b4a1b7121,Namespace:calico-system,Attempt:0,} returns sandbox id \"580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d\"" Sep 12 22:29:57.726940 containerd[1508]: time="2025-09-12T22:29:57.726918350Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 22:29:58.511131 containerd[1508]: time="2025-09-12T22:29:58.510896671Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57f645bfdb-zb94s,Uid:e026fd86-5133-41c6-a136-1b3785615a66,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:29:58.514645 containerd[1508]: time="2025-09-12T22:29:58.511283719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57f645bfdb-m9hn4,Uid:86cb6c26-0c34-4ae2-b007-31748a17a456,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:29:58.700526 systemd-networkd[1437]: calidc6a31a6075: Link UP Sep 12 22:29:58.701214 systemd-networkd[1437]: calidc6a31a6075: Gained carrier Sep 12 22:29:58.716925 containerd[1508]: 2025-09-12 22:29:58.566 [INFO][4330] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0 calico-apiserver-57f645bfdb- calico-apiserver e026fd86-5133-41c6-a136-1b3785615a66 824 0 2025-09-12 22:29:35 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57f645bfdb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-57f645bfdb-zb94s eth0 calico-apiserver [] [] [kns.calico-apiserver 
ksa.calico-apiserver.calico-apiserver] calidc6a31a6075 [] [] }} ContainerID="b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-zb94s" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--zb94s-" Sep 12 22:29:58.716925 containerd[1508]: 2025-09-12 22:29:58.567 [INFO][4330] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-zb94s" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0" Sep 12 22:29:58.716925 containerd[1508]: 2025-09-12 22:29:58.607 [INFO][4358] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" HandleID="k8s-pod-network.b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" Workload="localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0" Sep 12 22:29:58.717197 containerd[1508]: 2025-09-12 22:29:58.608 [INFO][4358] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" HandleID="k8s-pod-network.b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" Workload="localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400035d030), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-57f645bfdb-zb94s", "timestamp":"2025-09-12 22:29:58.607899149 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:29:58.717197 containerd[1508]: 2025-09-12 22:29:58.608 [INFO][4358] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM 
lock. Sep 12 22:29:58.717197 containerd[1508]: 2025-09-12 22:29:58.608 [INFO][4358] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:29:58.717197 containerd[1508]: 2025-09-12 22:29:58.608 [INFO][4358] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:29:58.717197 containerd[1508]: 2025-09-12 22:29:58.641 [INFO][4358] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" host="localhost" Sep 12 22:29:58.717197 containerd[1508]: 2025-09-12 22:29:58.648 [INFO][4358] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:29:58.717197 containerd[1508]: 2025-09-12 22:29:58.656 [INFO][4358] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:29:58.717197 containerd[1508]: 2025-09-12 22:29:58.666 [INFO][4358] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:29:58.717197 containerd[1508]: 2025-09-12 22:29:58.672 [INFO][4358] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:29:58.717197 containerd[1508]: 2025-09-12 22:29:58.672 [INFO][4358] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" host="localhost" Sep 12 22:29:58.717594 containerd[1508]: 2025-09-12 22:29:58.675 [INFO][4358] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4 Sep 12 22:29:58.717594 containerd[1508]: 2025-09-12 22:29:58.683 [INFO][4358] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" host="localhost" Sep 12 22:29:58.717594 containerd[1508]: 2025-09-12 22:29:58.691 [INFO][4358] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" host="localhost" Sep 12 22:29:58.717594 containerd[1508]: 2025-09-12 22:29:58.692 [INFO][4358] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" host="localhost" Sep 12 22:29:58.717594 containerd[1508]: 2025-09-12 22:29:58.692 [INFO][4358] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:29:58.717594 containerd[1508]: 2025-09-12 22:29:58.692 [INFO][4358] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" HandleID="k8s-pod-network.b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" Workload="localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0" Sep 12 22:29:58.717752 containerd[1508]: 2025-09-12 22:29:58.694 [INFO][4330] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-zb94s" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0", GenerateName:"calico-apiserver-57f645bfdb-", Namespace:"calico-apiserver", SelfLink:"", UID:"e026fd86-5133-41c6-a136-1b3785615a66", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"57f645bfdb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-57f645bfdb-zb94s", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidc6a31a6075", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:29:58.717808 containerd[1508]: 2025-09-12 22:29:58.694 [INFO][4330] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-zb94s" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0" Sep 12 22:29:58.717808 containerd[1508]: 2025-09-12 22:29:58.694 [INFO][4330] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidc6a31a6075 ContainerID="b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-zb94s" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0" Sep 12 22:29:58.717808 containerd[1508]: 2025-09-12 22:29:58.701 [INFO][4330] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-zb94s" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0" Sep 12 22:29:58.717873 
containerd[1508]: 2025-09-12 22:29:58.702 [INFO][4330] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-zb94s" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0", GenerateName:"calico-apiserver-57f645bfdb-", Namespace:"calico-apiserver", SelfLink:"", UID:"e026fd86-5133-41c6-a136-1b3785615a66", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57f645bfdb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4", Pod:"calico-apiserver-57f645bfdb-zb94s", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidc6a31a6075", MAC:"b2:cc:c4:9e:9d:d3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:29:58.717921 containerd[1508]: 2025-09-12 
22:29:58.712 [INFO][4330] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-zb94s" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--zb94s-eth0" Sep 12 22:29:58.738812 containerd[1508]: time="2025-09-12T22:29:58.738774725Z" level=info msg="connecting to shim b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4" address="unix:///run/containerd/s/1f362ccf291c757e3f54debe2ae11d84886cf4a3092abf8801ce8cf2036616c5" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:29:58.764935 systemd[1]: Started cri-containerd-b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4.scope - libcontainer container b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4. Sep 12 22:29:58.787018 systemd-resolved[1356]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:29:58.791625 systemd-networkd[1437]: calic360200d3ce: Link UP Sep 12 22:29:58.792440 systemd-networkd[1437]: calic360200d3ce: Gained carrier Sep 12 22:29:58.813095 containerd[1508]: 2025-09-12 22:29:58.589 [INFO][4341] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0 calico-apiserver-57f645bfdb- calico-apiserver 86cb6c26-0c34-4ae2-b007-31748a17a456 826 0 2025-09-12 22:29:35 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57f645bfdb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-57f645bfdb-m9hn4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic360200d3ce [] [] }} 
ContainerID="6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-m9hn4" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-" Sep 12 22:29:58.813095 containerd[1508]: 2025-09-12 22:29:58.589 [INFO][4341] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-m9hn4" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0" Sep 12 22:29:58.813095 containerd[1508]: 2025-09-12 22:29:58.650 [INFO][4366] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" HandleID="k8s-pod-network.6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" Workload="localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0" Sep 12 22:29:58.813469 containerd[1508]: 2025-09-12 22:29:58.651 [INFO][4366] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" HandleID="k8s-pod-network.6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" Workload="localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000136400), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-57f645bfdb-m9hn4", "timestamp":"2025-09-12 22:29:58.650859554 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:29:58.813469 containerd[1508]: 2025-09-12 22:29:58.651 [INFO][4366] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 22:29:58.813469 containerd[1508]: 2025-09-12 22:29:58.692 [INFO][4366] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:29:58.813469 containerd[1508]: 2025-09-12 22:29:58.692 [INFO][4366] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:29:58.813469 containerd[1508]: 2025-09-12 22:29:58.721 [INFO][4366] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" host="localhost" Sep 12 22:29:58.813469 containerd[1508]: 2025-09-12 22:29:58.749 [INFO][4366] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:29:58.813469 containerd[1508]: 2025-09-12 22:29:58.756 [INFO][4366] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:29:58.813469 containerd[1508]: 2025-09-12 22:29:58.759 [INFO][4366] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:29:58.813469 containerd[1508]: 2025-09-12 22:29:58.763 [INFO][4366] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:29:58.813469 containerd[1508]: 2025-09-12 22:29:58.763 [INFO][4366] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" host="localhost" Sep 12 22:29:58.813811 containerd[1508]: 2025-09-12 22:29:58.768 [INFO][4366] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7 Sep 12 22:29:58.813811 containerd[1508]: 2025-09-12 22:29:58.772 [INFO][4366] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" host="localhost" Sep 12 22:29:58.813811 containerd[1508]: 2025-09-12 22:29:58.779 [INFO][4366] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" host="localhost" Sep 12 22:29:58.813811 containerd[1508]: 2025-09-12 22:29:58.779 [INFO][4366] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" host="localhost" Sep 12 22:29:58.813811 containerd[1508]: 2025-09-12 22:29:58.780 [INFO][4366] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:29:58.813811 containerd[1508]: 2025-09-12 22:29:58.780 [INFO][4366] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" HandleID="k8s-pod-network.6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" Workload="localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0" Sep 12 22:29:58.813936 containerd[1508]: 2025-09-12 22:29:58.784 [INFO][4341] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-m9hn4" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0", GenerateName:"calico-apiserver-57f645bfdb-", Namespace:"calico-apiserver", SelfLink:"", UID:"86cb6c26-0c34-4ae2-b007-31748a17a456", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"57f645bfdb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-57f645bfdb-m9hn4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic360200d3ce", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:29:58.814103 containerd[1508]: 2025-09-12 22:29:58.785 [INFO][4341] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-m9hn4" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0" Sep 12 22:29:58.814103 containerd[1508]: 2025-09-12 22:29:58.785 [INFO][4341] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic360200d3ce ContainerID="6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-m9hn4" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0" Sep 12 22:29:58.814103 containerd[1508]: 2025-09-12 22:29:58.794 [INFO][4341] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-m9hn4" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0" Sep 12 22:29:58.814359 
containerd[1508]: 2025-09-12 22:29:58.797 [INFO][4341] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-m9hn4" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0", GenerateName:"calico-apiserver-57f645bfdb-", Namespace:"calico-apiserver", SelfLink:"", UID:"86cb6c26-0c34-4ae2-b007-31748a17a456", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57f645bfdb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7", Pod:"calico-apiserver-57f645bfdb-m9hn4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic360200d3ce", MAC:"0e:31:ab:e3:63:b0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:29:58.814417 containerd[1508]: 2025-09-12 
22:29:58.807 [INFO][4341] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" Namespace="calico-apiserver" Pod="calico-apiserver-57f645bfdb-m9hn4" WorkloadEndpoint="localhost-k8s-calico--apiserver--57f645bfdb--m9hn4-eth0" Sep 12 22:29:58.850843 containerd[1508]: time="2025-09-12T22:29:58.850768872Z" level=info msg="connecting to shim 6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7" address="unix:///run/containerd/s/f7f6610ee7abec4727b879e7ce87509b561e4d82e586e5357ca47894072b5376" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:29:58.852258 containerd[1508]: time="2025-09-12T22:29:58.852076738Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57f645bfdb-zb94s,Uid:e026fd86-5133-41c6-a136-1b3785615a66,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4\"" Sep 12 22:29:58.872853 systemd[1]: Started cri-containerd-6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7.scope - libcontainer container 6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7. Sep 12 22:29:58.886298 systemd-resolved[1356]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:29:58.944683 containerd[1508]: time="2025-09-12T22:29:58.944632925Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57f645bfdb-m9hn4,Uid:86cb6c26-0c34-4ae2-b007-31748a17a456,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7\"" Sep 12 22:29:58.984821 systemd-networkd[1437]: cali5b5c96cdf86: Gained IPv6LL Sep 12 22:29:59.472553 systemd[1]: Started sshd@7-10.0.0.148:22-10.0.0.1:49606.service - OpenSSH per-connection server daemon (10.0.0.1:49606). 
Sep 12 22:29:59.519284 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount387098303.mount: Deactivated successfully. Sep 12 22:29:59.523426 containerd[1508]: time="2025-09-12T22:29:59.523394820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-qbgs5,Uid:f8f746ec-9046-44a0-81cf-c7041d7340b7,Namespace:kube-system,Attempt:0,}" Sep 12 22:29:59.527715 containerd[1508]: time="2025-09-12T22:29:59.526789248Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-69gts,Uid:ca323b23-e1f4-4eab-9ac9-c732b46b6287,Namespace:calico-system,Attempt:0,}" Sep 12 22:29:59.570695 sshd[4497]: Accepted publickey for core from 10.0.0.1 port 49606 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:29:59.571898 sshd-session[4497]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:29:59.584733 systemd-logind[1485]: New session 8 of user core. Sep 12 22:29:59.586928 systemd[1]: Started session-8.scope - Session 8 of User core. 
Sep 12 22:29:59.718590 systemd-networkd[1437]: cali6239ddd30b5: Link UP Sep 12 22:29:59.719355 systemd-networkd[1437]: cali6239ddd30b5: Gained carrier Sep 12 22:29:59.734811 containerd[1508]: 2025-09-12 22:29:59.593 [INFO][4500] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0 coredns-7c65d6cfc9- kube-system f8f746ec-9046-44a0-81cf-c7041d7340b7 827 0 2025-09-12 22:29:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-qbgs5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6239ddd30b5 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qbgs5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--qbgs5-" Sep 12 22:29:59.734811 containerd[1508]: 2025-09-12 22:29:59.594 [INFO][4500] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qbgs5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0" Sep 12 22:29:59.734811 containerd[1508]: 2025-09-12 22:29:59.636 [INFO][4531] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" HandleID="k8s-pod-network.904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" Workload="localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0" Sep 12 22:29:59.734984 containerd[1508]: 2025-09-12 22:29:59.636 [INFO][4531] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" 
HandleID="k8s-pod-network.904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" Workload="localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400034b110), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-qbgs5", "timestamp":"2025-09-12 22:29:59.636532564 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:29:59.734984 containerd[1508]: 2025-09-12 22:29:59.636 [INFO][4531] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:29:59.734984 containerd[1508]: 2025-09-12 22:29:59.636 [INFO][4531] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:29:59.734984 containerd[1508]: 2025-09-12 22:29:59.636 [INFO][4531] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:29:59.734984 containerd[1508]: 2025-09-12 22:29:59.657 [INFO][4531] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" host="localhost" Sep 12 22:29:59.734984 containerd[1508]: 2025-09-12 22:29:59.664 [INFO][4531] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:29:59.734984 containerd[1508]: 2025-09-12 22:29:59.670 [INFO][4531] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:29:59.734984 containerd[1508]: 2025-09-12 22:29:59.673 [INFO][4531] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:29:59.734984 containerd[1508]: 2025-09-12 22:29:59.677 [INFO][4531] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:29:59.734984 containerd[1508]: 2025-09-12 22:29:59.677 
[INFO][4531] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" host="localhost" Sep 12 22:29:59.735340 containerd[1508]: 2025-09-12 22:29:59.682 [INFO][4531] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141 Sep 12 22:29:59.735340 containerd[1508]: 2025-09-12 22:29:59.694 [INFO][4531] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" host="localhost" Sep 12 22:29:59.735340 containerd[1508]: 2025-09-12 22:29:59.705 [INFO][4531] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" host="localhost" Sep 12 22:29:59.735340 containerd[1508]: 2025-09-12 22:29:59.705 [INFO][4531] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" host="localhost" Sep 12 22:29:59.735340 containerd[1508]: 2025-09-12 22:29:59.705 [INFO][4531] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:29:59.735340 containerd[1508]: 2025-09-12 22:29:59.705 [INFO][4531] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" HandleID="k8s-pod-network.904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" Workload="localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0" Sep 12 22:29:59.735456 containerd[1508]: 2025-09-12 22:29:59.714 [INFO][4500] cni-plugin/k8s.go 418: Populated endpoint ContainerID="904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qbgs5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f8f746ec-9046-44a0-81cf-c7041d7340b7", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-qbgs5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6239ddd30b5", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:29:59.735521 containerd[1508]: 2025-09-12 22:29:59.715 [INFO][4500] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qbgs5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0" Sep 12 22:29:59.735521 containerd[1508]: 2025-09-12 22:29:59.715 [INFO][4500] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6239ddd30b5 ContainerID="904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qbgs5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0" Sep 12 22:29:59.735521 containerd[1508]: 2025-09-12 22:29:59.717 [INFO][4500] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qbgs5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0" Sep 12 22:29:59.735593 containerd[1508]: 2025-09-12 22:29:59.717 [INFO][4500] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qbgs5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f8f746ec-9046-44a0-81cf-c7041d7340b7", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141", Pod:"coredns-7c65d6cfc9-qbgs5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6239ddd30b5", MAC:"a2:1d:b8:a0:a3:8e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:29:59.735593 containerd[1508]: 2025-09-12 22:29:59.731 [INFO][4500] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qbgs5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--qbgs5-eth0" Sep 12 22:29:59.781515 containerd[1508]: time="2025-09-12T22:29:59.778991895Z" level=info msg="connecting to shim 904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141" address="unix:///run/containerd/s/aa4a7dd4d5151c58b351790442a0f7d53ff44def905843140d4db381d1a6b271" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:29:59.822860 systemd-networkd[1437]: cali94d65bc04e7: Link UP Sep 12 22:29:59.823495 systemd-networkd[1437]: cali94d65bc04e7: Gained carrier Sep 12 22:29:59.842836 systemd[1]: Started cri-containerd-904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141.scope - libcontainer container 904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141. Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.631 [INFO][4502] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--69gts-eth0 csi-node-driver- calico-system ca323b23-e1f4-4eab-9ac9-c732b46b6287 718 0 2025-09-12 22:29:38 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-69gts eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali94d65bc04e7 [] [] }} ContainerID="193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" Namespace="calico-system" Pod="csi-node-driver-69gts" WorkloadEndpoint="localhost-k8s-csi--node--driver--69gts-" Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.632 [INFO][4502] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" Namespace="calico-system" Pod="csi-node-driver-69gts" WorkloadEndpoint="localhost-k8s-csi--node--driver--69gts-eth0" Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.675 [INFO][4544] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" HandleID="k8s-pod-network.193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" Workload="localhost-k8s-csi--node--driver--69gts-eth0" Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.675 [INFO][4544] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" HandleID="k8s-pod-network.193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" Workload="localhost-k8s-csi--node--driver--69gts-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400012f550), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-69gts", "timestamp":"2025-09-12 22:29:59.675392582 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.675 [INFO][4544] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.706 [INFO][4544] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.706 [INFO][4544] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.759 [INFO][4544] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" host="localhost" Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.766 [INFO][4544] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.772 [INFO][4544] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.782 [INFO][4544] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.789 [INFO][4544] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.789 [INFO][4544] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" host="localhost" Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.793 [INFO][4544] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5 Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.803 [INFO][4544] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" host="localhost" Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.812 [INFO][4544] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" host="localhost" Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.812 [INFO][4544] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" host="localhost" Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.812 [INFO][4544] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:29:59.850523 containerd[1508]: 2025-09-12 22:29:59.812 [INFO][4544] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" HandleID="k8s-pod-network.193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" Workload="localhost-k8s-csi--node--driver--69gts-eth0" Sep 12 22:29:59.852188 containerd[1508]: 2025-09-12 22:29:59.819 [INFO][4502] cni-plugin/k8s.go 418: Populated endpoint ContainerID="193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" Namespace="calico-system" Pod="csi-node-driver-69gts" WorkloadEndpoint="localhost-k8s-csi--node--driver--69gts-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--69gts-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ca323b23-e1f4-4eab-9ac9-c732b46b6287", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-69gts", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali94d65bc04e7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:29:59.852188 containerd[1508]: 2025-09-12 22:29:59.820 [INFO][4502] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" Namespace="calico-system" Pod="csi-node-driver-69gts" WorkloadEndpoint="localhost-k8s-csi--node--driver--69gts-eth0" Sep 12 22:29:59.852188 containerd[1508]: 2025-09-12 22:29:59.820 [INFO][4502] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali94d65bc04e7 ContainerID="193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" Namespace="calico-system" Pod="csi-node-driver-69gts" WorkloadEndpoint="localhost-k8s-csi--node--driver--69gts-eth0" Sep 12 22:29:59.852188 containerd[1508]: 2025-09-12 22:29:59.823 [INFO][4502] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" Namespace="calico-system" Pod="csi-node-driver-69gts" WorkloadEndpoint="localhost-k8s-csi--node--driver--69gts-eth0" Sep 12 22:29:59.852188 containerd[1508]: 2025-09-12 22:29:59.824 [INFO][4502] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" 
Namespace="calico-system" Pod="csi-node-driver-69gts" WorkloadEndpoint="localhost-k8s-csi--node--driver--69gts-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--69gts-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ca323b23-e1f4-4eab-9ac9-c732b46b6287", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5", Pod:"csi-node-driver-69gts", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali94d65bc04e7", MAC:"fe:03:c4:2e:83:94", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:29:59.852188 containerd[1508]: 2025-09-12 22:29:59.846 [INFO][4502] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" Namespace="calico-system" Pod="csi-node-driver-69gts" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--69gts-eth0" Sep 12 22:29:59.917243 systemd-resolved[1356]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:29:59.932697 containerd[1508]: time="2025-09-12T22:29:59.932363564Z" level=info msg="connecting to shim 193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5" address="unix:///run/containerd/s/2de147cda23aa4f5d42aca6029e9b042b671c9db6e809afcab5e0f5ab24447fb" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:29:59.935127 sshd[4529]: Connection closed by 10.0.0.1 port 49606 Sep 12 22:29:59.935844 sshd-session[4497]: pam_unix(sshd:session): session closed for user core Sep 12 22:29:59.941098 systemd[1]: sshd@7-10.0.0.148:22-10.0.0.1:49606.service: Deactivated successfully. Sep 12 22:29:59.942954 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 22:29:59.947425 systemd-logind[1485]: Session 8 logged out. Waiting for processes to exit. Sep 12 22:29:59.950284 systemd-logind[1485]: Removed session 8. Sep 12 22:29:59.965072 systemd[1]: Started cri-containerd-193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5.scope - libcontainer container 193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5. 
Sep 12 22:29:59.975500 containerd[1508]: time="2025-09-12T22:29:59.975404185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-qbgs5,Uid:f8f746ec-9046-44a0-81cf-c7041d7340b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141\"" Sep 12 22:29:59.989726 containerd[1508]: time="2025-09-12T22:29:59.986864615Z" level=info msg="CreateContainer within sandbox \"904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 22:29:59.994028 systemd-resolved[1356]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:30:00.009441 containerd[1508]: time="2025-09-12T22:30:00.009402661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-69gts,Uid:ca323b23-e1f4-4eab-9ac9-c732b46b6287,Namespace:calico-system,Attempt:0,} returns sandbox id \"193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5\"" Sep 12 22:30:00.017966 containerd[1508]: time="2025-09-12T22:30:00.017359616Z" level=info msg="Container 9bbc0547dea8ab44ab9bc936fa8bdd35246357ed62d77aa430b1371f1059f49a: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:30:00.022820 containerd[1508]: time="2025-09-12T22:30:00.022783321Z" level=info msg="CreateContainer within sandbox \"904d75a743ad77dbf8459757b1f16d90dac752f8919426ac0d7edda7eafa5141\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9bbc0547dea8ab44ab9bc936fa8bdd35246357ed62d77aa430b1371f1059f49a\"" Sep 12 22:30:00.025304 containerd[1508]: time="2025-09-12T22:30:00.025278570Z" level=info msg="StartContainer for \"9bbc0547dea8ab44ab9bc936fa8bdd35246357ed62d77aa430b1371f1059f49a\"" Sep 12 22:30:00.026732 containerd[1508]: time="2025-09-12T22:30:00.026666157Z" level=info msg="connecting to shim 9bbc0547dea8ab44ab9bc936fa8bdd35246357ed62d77aa430b1371f1059f49a" 
address="unix:///run/containerd/s/aa4a7dd4d5151c58b351790442a0f7d53ff44def905843140d4db381d1a6b271" protocol=ttrpc version=3 Sep 12 22:30:00.051843 systemd[1]: Started cri-containerd-9bbc0547dea8ab44ab9bc936fa8bdd35246357ed62d77aa430b1371f1059f49a.scope - libcontainer container 9bbc0547dea8ab44ab9bc936fa8bdd35246357ed62d77aa430b1371f1059f49a. Sep 12 22:30:00.060496 containerd[1508]: time="2025-09-12T22:30:00.060444174Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:30:00.061335 containerd[1508]: time="2025-09-12T22:30:00.061295071Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 12 22:30:00.062234 containerd[1508]: time="2025-09-12T22:30:00.062198249Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:30:00.064632 containerd[1508]: time="2025-09-12T22:30:00.064584735Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:30:00.065641 containerd[1508]: time="2025-09-12T22:30:00.065592275Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 2.338643123s" Sep 12 22:30:00.065641 containerd[1508]: time="2025-09-12T22:30:00.065628915Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference 
\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 12 22:30:00.066831 containerd[1508]: time="2025-09-12T22:30:00.066807938Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 22:30:00.068874 containerd[1508]: time="2025-09-12T22:30:00.068764976Z" level=info msg="CreateContainer within sandbox \"580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 22:30:00.076730 containerd[1508]: time="2025-09-12T22:30:00.076695931Z" level=info msg="Container 749073db026802d3dab2cd9e3cb9863e8e74300f946db914aa638e99720e84fb: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:30:00.084062 containerd[1508]: time="2025-09-12T22:30:00.084014673Z" level=info msg="CreateContainer within sandbox \"580e3ffe5d75389bcd93585448e9526da385ff667cb301fd94f7f0b816ef0f8d\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"749073db026802d3dab2cd9e3cb9863e8e74300f946db914aa638e99720e84fb\"" Sep 12 22:30:00.084818 containerd[1508]: time="2025-09-12T22:30:00.084798288Z" level=info msg="StartContainer for \"749073db026802d3dab2cd9e3cb9863e8e74300f946db914aa638e99720e84fb\"" Sep 12 22:30:00.085898 containerd[1508]: time="2025-09-12T22:30:00.085873069Z" level=info msg="connecting to shim 749073db026802d3dab2cd9e3cb9863e8e74300f946db914aa638e99720e84fb" address="unix:///run/containerd/s/91c778902ad250bcfb43f6cddaefaf0f567e642d5060b341151bd1d92547070a" protocol=ttrpc version=3 Sep 12 22:30:00.091655 containerd[1508]: time="2025-09-12T22:30:00.091520259Z" level=info msg="StartContainer for \"9bbc0547dea8ab44ab9bc936fa8bdd35246357ed62d77aa430b1371f1059f49a\" returns successfully" Sep 12 22:30:00.113874 systemd[1]: Started cri-containerd-749073db026802d3dab2cd9e3cb9863e8e74300f946db914aa638e99720e84fb.scope - libcontainer container 749073db026802d3dab2cd9e3cb9863e8e74300f946db914aa638e99720e84fb. 
Sep 12 22:30:00.186881 containerd[1508]: time="2025-09-12T22:30:00.185438647Z" level=info msg="StartContainer for \"749073db026802d3dab2cd9e3cb9863e8e74300f946db914aa638e99720e84fb\" returns successfully" Sep 12 22:30:00.199824 systemd-networkd[1437]: calidc6a31a6075: Gained IPv6LL Sep 12 22:30:00.510820 containerd[1508]: time="2025-09-12T22:30:00.510776579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-vgc9k,Uid:f8fbdb4d-699f-4d60-8d98-dfc00c427c9f,Namespace:kube-system,Attempt:0,}" Sep 12 22:30:00.521947 systemd-networkd[1437]: calic360200d3ce: Gained IPv6LL Sep 12 22:30:00.628901 systemd-networkd[1437]: cali1fedc87b2d2: Link UP Sep 12 22:30:00.629503 systemd-networkd[1437]: cali1fedc87b2d2: Gained carrier Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.563 [INFO][4749] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0 coredns-7c65d6cfc9- kube-system f8fbdb4d-699f-4d60-8d98-dfc00c427c9f 819 0 2025-09-12 22:29:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-vgc9k eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1fedc87b2d2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vgc9k" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--vgc9k-" Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.563 [INFO][4749] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vgc9k" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0" Sep 12 
22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.587 [INFO][4764] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" HandleID="k8s-pod-network.d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" Workload="localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0" Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.587 [INFO][4764] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" HandleID="k8s-pod-network.d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" Workload="localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dd910), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-vgc9k", "timestamp":"2025-09-12 22:30:00.587001422 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.587 [INFO][4764] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.587 [INFO][4764] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.587 [INFO][4764] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.596 [INFO][4764] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" host="localhost" Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.601 [INFO][4764] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.605 [INFO][4764] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.606 [INFO][4764] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.609 [INFO][4764] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.609 [INFO][4764] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" host="localhost" Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.610 [INFO][4764] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5 Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.614 [INFO][4764] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" host="localhost" Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.622 [INFO][4764] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" host="localhost" Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.622 [INFO][4764] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" host="localhost" Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.623 [INFO][4764] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:30:00.643623 containerd[1508]: 2025-09-12 22:30:00.623 [INFO][4764] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" HandleID="k8s-pod-network.d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" Workload="localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0" Sep 12 22:30:00.644357 containerd[1508]: 2025-09-12 22:30:00.625 [INFO][4749] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vgc9k" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f8fbdb4d-699f-4d60-8d98-dfc00c427c9f", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-vgc9k", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1fedc87b2d2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:30:00.644357 containerd[1508]: 2025-09-12 22:30:00.625 [INFO][4749] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vgc9k" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0" Sep 12 22:30:00.644357 containerd[1508]: 2025-09-12 22:30:00.625 [INFO][4749] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1fedc87b2d2 ContainerID="d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vgc9k" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0" Sep 12 22:30:00.644357 containerd[1508]: 2025-09-12 22:30:00.629 [INFO][4749] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vgc9k" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0" Sep 12 22:30:00.644357 containerd[1508]: 2025-09-12 22:30:00.629 [INFO][4749] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vgc9k" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f8fbdb4d-699f-4d60-8d98-dfc00c427c9f", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5", Pod:"coredns-7c65d6cfc9-vgc9k", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1fedc87b2d2", MAC:"0a:fe:5b:a0:18:58", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:30:00.644357 containerd[1508]: 2025-09-12 22:30:00.640 [INFO][4749] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vgc9k" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--vgc9k-eth0" Sep 12 22:30:00.663447 containerd[1508]: time="2025-09-12T22:30:00.663397589Z" level=info msg="connecting to shim d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5" address="unix:///run/containerd/s/febcf74f231f6e84f744027b97c1a8274c5fc2e430d215a323ad289a6940a716" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:30:00.690012 systemd[1]: Started cri-containerd-d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5.scope - libcontainer container d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5. 
Sep 12 22:30:00.722807 systemd-resolved[1356]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:30:00.728286 kubelet[2658]: I0912 22:30:00.728225 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-qbgs5" podStartSLOduration=35.728207491 podStartE2EDuration="35.728207491s" podCreationTimestamp="2025-09-12 22:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:30:00.715072915 +0000 UTC m=+41.273331656" watchObservedRunningTime="2025-09-12 22:30:00.728207491 +0000 UTC m=+41.286466232" Sep 12 22:30:00.765764 containerd[1508]: time="2025-09-12T22:30:00.764281753Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-vgc9k,Uid:f8fbdb4d-699f-4d60-8d98-dfc00c427c9f,Namespace:kube-system,Attempt:0,} returns sandbox id \"d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5\"" Sep 12 22:30:00.769916 containerd[1508]: time="2025-09-12T22:30:00.769830221Z" level=info msg="CreateContainer within sandbox \"d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 22:30:00.781918 containerd[1508]: time="2025-09-12T22:30:00.781875495Z" level=info msg="Container e3558bc202a401e87bca5ae8966b8945ef21d8b95d7700596904bac0183ffd4a: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:30:00.788378 containerd[1508]: time="2025-09-12T22:30:00.788339581Z" level=info msg="CreateContainer within sandbox \"d3f76964f9fdfce2684ecf243511b74f13af7eb6dad55de4a0ca61a5b40d19f5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e3558bc202a401e87bca5ae8966b8945ef21d8b95d7700596904bac0183ffd4a\"" Sep 12 22:30:00.788874 containerd[1508]: time="2025-09-12T22:30:00.788847711Z" level=info msg="StartContainer for 
\"e3558bc202a401e87bca5ae8966b8945ef21d8b95d7700596904bac0183ffd4a\"" Sep 12 22:30:00.789604 containerd[1508]: time="2025-09-12T22:30:00.789565685Z" level=info msg="connecting to shim e3558bc202a401e87bca5ae8966b8945ef21d8b95d7700596904bac0183ffd4a" address="unix:///run/containerd/s/febcf74f231f6e84f744027b97c1a8274c5fc2e430d215a323ad289a6940a716" protocol=ttrpc version=3 Sep 12 22:30:00.813158 systemd[1]: Started cri-containerd-e3558bc202a401e87bca5ae8966b8945ef21d8b95d7700596904bac0183ffd4a.scope - libcontainer container e3558bc202a401e87bca5ae8966b8945ef21d8b95d7700596904bac0183ffd4a. Sep 12 22:30:00.851849 containerd[1508]: time="2025-09-12T22:30:00.851805016Z" level=info msg="StartContainer for \"e3558bc202a401e87bca5ae8966b8945ef21d8b95d7700596904bac0183ffd4a\" returns successfully" Sep 12 22:30:00.910764 containerd[1508]: time="2025-09-12T22:30:00.910717963Z" level=info msg="TaskExit event in podsandbox handler container_id:\"749073db026802d3dab2cd9e3cb9863e8e74300f946db914aa638e99720e84fb\" id:\"c073027ebe7546b0164112260af6a316cb5da2dc5c70615c02cafcce25a2dfbd\" pid:4834 exited_at:{seconds:1757716200 nanos:909901907}" Sep 12 22:30:00.922504 kubelet[2658]: I0912 22:30:00.922432 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-2d4dt" podStartSLOduration=20.58243464 podStartE2EDuration="22.92241471s" podCreationTimestamp="2025-09-12 22:29:38 +0000 UTC" firstStartedPulling="2025-09-12 22:29:57.726726946 +0000 UTC m=+38.284985687" lastFinishedPulling="2025-09-12 22:30:00.066707016 +0000 UTC m=+40.624965757" observedRunningTime="2025-09-12 22:30:00.756504921 +0000 UTC m=+41.314763662" watchObservedRunningTime="2025-09-12 22:30:00.92241471 +0000 UTC m=+41.480673491" Sep 12 22:30:01.095814 systemd-networkd[1437]: cali6239ddd30b5: Gained IPv6LL Sep 12 22:30:01.514245 containerd[1508]: time="2025-09-12T22:30:01.514167283Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-76fdc566d5-xfzhc,Uid:ca9d44dd-f762-4c01-95d5-c6ef563e914c,Namespace:calico-system,Attempt:0,}" Sep 12 22:30:01.676150 systemd-networkd[1437]: calie497eaa66f5: Link UP Sep 12 22:30:01.677242 systemd-networkd[1437]: calie497eaa66f5: Gained carrier Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.577 [INFO][4894] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0 calico-kube-controllers-76fdc566d5- calico-system ca9d44dd-f762-4c01-95d5-c6ef563e914c 830 0 2025-09-12 22:29:39 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:76fdc566d5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-76fdc566d5-xfzhc eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calie497eaa66f5 [] [] }} ContainerID="28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" Namespace="calico-system" Pod="calico-kube-controllers-76fdc566d5-xfzhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-" Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.577 [INFO][4894] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" Namespace="calico-system" Pod="calico-kube-controllers-76fdc566d5-xfzhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0" Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.619 [INFO][4909] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" 
HandleID="k8s-pod-network.28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" Workload="localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0" Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.619 [INFO][4909] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" HandleID="k8s-pod-network.28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" Workload="localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000514e00), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-76fdc566d5-xfzhc", "timestamp":"2025-09-12 22:30:01.619747803 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.620 [INFO][4909] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.621 [INFO][4909] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.621 [INFO][4909] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.636 [INFO][4909] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" host="localhost" Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.642 [INFO][4909] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.647 [INFO][4909] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.649 [INFO][4909] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.653 [INFO][4909] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.653 [INFO][4909] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" host="localhost" Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.655 [INFO][4909] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.661 [INFO][4909] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" host="localhost" Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.668 [INFO][4909] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" host="localhost" Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.668 [INFO][4909] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" host="localhost" Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.668 [INFO][4909] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:30:01.701703 containerd[1508]: 2025-09-12 22:30:01.668 [INFO][4909] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" HandleID="k8s-pod-network.28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" Workload="localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0" Sep 12 22:30:01.703168 containerd[1508]: 2025-09-12 22:30:01.672 [INFO][4894] cni-plugin/k8s.go 418: Populated endpoint ContainerID="28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" Namespace="calico-system" Pod="calico-kube-controllers-76fdc566d5-xfzhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0", GenerateName:"calico-kube-controllers-76fdc566d5-", Namespace:"calico-system", SelfLink:"", UID:"ca9d44dd-f762-4c01-95d5-c6ef563e914c", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"76fdc566d5", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-76fdc566d5-xfzhc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie497eaa66f5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:30:01.703168 containerd[1508]: 2025-09-12 22:30:01.672 [INFO][4894] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" Namespace="calico-system" Pod="calico-kube-controllers-76fdc566d5-xfzhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0" Sep 12 22:30:01.703168 containerd[1508]: 2025-09-12 22:30:01.672 [INFO][4894] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie497eaa66f5 ContainerID="28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" Namespace="calico-system" Pod="calico-kube-controllers-76fdc566d5-xfzhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0" Sep 12 22:30:01.703168 containerd[1508]: 2025-09-12 22:30:01.678 [INFO][4894] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" Namespace="calico-system" Pod="calico-kube-controllers-76fdc566d5-xfzhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0" Sep 12 22:30:01.703168 containerd[1508]: 
2025-09-12 22:30:01.678 [INFO][4894] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" Namespace="calico-system" Pod="calico-kube-controllers-76fdc566d5-xfzhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0", GenerateName:"calico-kube-controllers-76fdc566d5-", Namespace:"calico-system", SelfLink:"", UID:"ca9d44dd-f762-4c01-95d5-c6ef563e914c", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 29, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"76fdc566d5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f", Pod:"calico-kube-controllers-76fdc566d5-xfzhc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie497eaa66f5", MAC:"1a:32:55:f3:48:5a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:30:01.703168 
containerd[1508]: 2025-09-12 22:30:01.697 [INFO][4894] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" Namespace="calico-system" Pod="calico-kube-controllers-76fdc566d5-xfzhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76fdc566d5--xfzhc-eth0" Sep 12 22:30:01.730844 containerd[1508]: time="2025-09-12T22:30:01.730796467Z" level=info msg="connecting to shim 28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f" address="unix:///run/containerd/s/257f494e54e8717a2d174b2157f30a041f9cfa2a05c58c4b564acf1564461ff1" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:30:01.755894 kubelet[2658]: I0912 22:30:01.755183 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-vgc9k" podStartSLOduration=36.755163729 podStartE2EDuration="36.755163729s" podCreationTimestamp="2025-09-12 22:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:30:01.754241071 +0000 UTC m=+42.312499812" watchObservedRunningTime="2025-09-12 22:30:01.755163729 +0000 UTC m=+42.313422470" Sep 12 22:30:01.776087 systemd[1]: Started cri-containerd-28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f.scope - libcontainer container 28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f. 
Sep 12 22:30:01.795188 systemd-resolved[1356]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:30:01.799926 systemd-networkd[1437]: cali94d65bc04e7: Gained IPv6LL Sep 12 22:30:01.846348 containerd[1508]: time="2025-09-12T22:30:01.846239775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76fdc566d5-xfzhc,Uid:ca9d44dd-f762-4c01-95d5-c6ef563e914c,Namespace:calico-system,Attempt:0,} returns sandbox id \"28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f\"" Sep 12 22:30:02.342812 containerd[1508]: time="2025-09-12T22:30:02.342771217Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:30:02.344225 containerd[1508]: time="2025-09-12T22:30:02.344197564Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 12 22:30:02.345006 containerd[1508]: time="2025-09-12T22:30:02.344957658Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:30:02.348327 containerd[1508]: time="2025-09-12T22:30:02.347292981Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:30:02.348628 containerd[1508]: time="2025-09-12T22:30:02.348579405Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 2.281649704s" Sep 12 
22:30:02.348628 containerd[1508]: time="2025-09-12T22:30:02.348613045Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 22:30:02.351833 containerd[1508]: time="2025-09-12T22:30:02.351809624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 22:30:02.361522 containerd[1508]: time="2025-09-12T22:30:02.361490563Z" level=info msg="CreateContainer within sandbox \"b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 22:30:02.370352 containerd[1508]: time="2025-09-12T22:30:02.369402829Z" level=info msg="Container b96351a1a614a72aac19b9611fb00474638de4069f2d5a0d56da47d4e437dcd1: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:30:02.382773 containerd[1508]: time="2025-09-12T22:30:02.382725715Z" level=info msg="CreateContainer within sandbox \"b1e412a66e664fa516f725d107d93d28673fdeb57a405bf9668de31bb2098cb4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b96351a1a614a72aac19b9611fb00474638de4069f2d5a0d56da47d4e437dcd1\"" Sep 12 22:30:02.388686 containerd[1508]: time="2025-09-12T22:30:02.388650584Z" level=info msg="StartContainer for \"b96351a1a614a72aac19b9611fb00474638de4069f2d5a0d56da47d4e437dcd1\"" Sep 12 22:30:02.389657 containerd[1508]: time="2025-09-12T22:30:02.389625842Z" level=info msg="connecting to shim b96351a1a614a72aac19b9611fb00474638de4069f2d5a0d56da47d4e437dcd1" address="unix:///run/containerd/s/1f362ccf291c757e3f54debe2ae11d84886cf4a3092abf8801ce8cf2036616c5" protocol=ttrpc version=3 Sep 12 22:30:02.408882 systemd[1]: Started cri-containerd-b96351a1a614a72aac19b9611fb00474638de4069f2d5a0d56da47d4e437dcd1.scope - libcontainer container b96351a1a614a72aac19b9611fb00474638de4069f2d5a0d56da47d4e437dcd1. 
Sep 12 22:30:02.439858 systemd-networkd[1437]: cali1fedc87b2d2: Gained IPv6LL Sep 12 22:30:02.509708 containerd[1508]: time="2025-09-12T22:30:02.509608298Z" level=info msg="StartContainer for \"b96351a1a614a72aac19b9611fb00474638de4069f2d5a0d56da47d4e437dcd1\" returns successfully" Sep 12 22:30:02.624780 containerd[1508]: time="2025-09-12T22:30:02.624178693Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:30:02.626015 containerd[1508]: time="2025-09-12T22:30:02.625985527Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 22:30:02.635578 containerd[1508]: time="2025-09-12T22:30:02.635540983Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 283.698198ms" Sep 12 22:30:02.638734 containerd[1508]: time="2025-09-12T22:30:02.638704482Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 22:30:02.641267 containerd[1508]: time="2025-09-12T22:30:02.641239728Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 22:30:02.643879 containerd[1508]: time="2025-09-12T22:30:02.643826816Z" level=info msg="CreateContainer within sandbox \"6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 22:30:02.660852 containerd[1508]: time="2025-09-12T22:30:02.660807370Z" level=info msg="Container 607e4235316910acaf18e6bed36475aa0533baf24e0e01acd8b19ba3083ba17f: CDI devices from 
CRI Config.CDIDevices: []" Sep 12 22:30:02.668242 containerd[1508]: time="2025-09-12T22:30:02.668134065Z" level=info msg="CreateContainer within sandbox \"6eaaed62d371e290f8c73c52dab6608a1519ab2d38e4aab6c8e949172b9aaba7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"607e4235316910acaf18e6bed36475aa0533baf24e0e01acd8b19ba3083ba17f\"" Sep 12 22:30:02.669785 containerd[1508]: time="2025-09-12T22:30:02.668978001Z" level=info msg="StartContainer for \"607e4235316910acaf18e6bed36475aa0533baf24e0e01acd8b19ba3083ba17f\"" Sep 12 22:30:02.670242 containerd[1508]: time="2025-09-12T22:30:02.670212783Z" level=info msg="connecting to shim 607e4235316910acaf18e6bed36475aa0533baf24e0e01acd8b19ba3083ba17f" address="unix:///run/containerd/s/f7f6610ee7abec4727b879e7ce87509b561e4d82e586e5357ca47894072b5376" protocol=ttrpc version=3 Sep 12 22:30:02.695841 systemd[1]: Started cri-containerd-607e4235316910acaf18e6bed36475aa0533baf24e0e01acd8b19ba3083ba17f.scope - libcontainer container 607e4235316910acaf18e6bed36475aa0533baf24e0e01acd8b19ba3083ba17f. 
Sep 12 22:30:02.746196 containerd[1508]: time="2025-09-12T22:30:02.745883021Z" level=info msg="StartContainer for \"607e4235316910acaf18e6bed36475aa0533baf24e0e01acd8b19ba3083ba17f\" returns successfully" Sep 12 22:30:02.774661 kubelet[2658]: I0912 22:30:02.774532 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-57f645bfdb-zb94s" podStartSLOduration=24.289346403 podStartE2EDuration="27.774517509s" podCreationTimestamp="2025-09-12 22:29:35 +0000 UTC" firstStartedPulling="2025-09-12 22:29:58.866373273 +0000 UTC m=+39.424631974" lastFinishedPulling="2025-09-12 22:30:02.351544379 +0000 UTC m=+42.909803080" observedRunningTime="2025-09-12 22:30:02.761860116 +0000 UTC m=+43.320118857" watchObservedRunningTime="2025-09-12 22:30:02.774517509 +0000 UTC m=+43.332776250" Sep 12 22:30:03.080409 systemd-networkd[1437]: calie497eaa66f5: Gained IPv6LL Sep 12 22:30:03.754708 kubelet[2658]: I0912 22:30:03.753992 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:30:04.287568 containerd[1508]: time="2025-09-12T22:30:04.287377459Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:30:04.290386 containerd[1508]: time="2025-09-12T22:30:04.290082907Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 12 22:30:04.292987 containerd[1508]: time="2025-09-12T22:30:04.291910579Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:30:04.296699 containerd[1508]: time="2025-09-12T22:30:04.296509140Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 
22:30:04.297068 containerd[1508]: time="2025-09-12T22:30:04.297043669Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.655542576s" Sep 12 22:30:04.297119 containerd[1508]: time="2025-09-12T22:30:04.297075390Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 12 22:30:04.300433 containerd[1508]: time="2025-09-12T22:30:04.300367847Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 22:30:04.301646 containerd[1508]: time="2025-09-12T22:30:04.301606909Z" level=info msg="CreateContainer within sandbox \"193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 22:30:04.312875 containerd[1508]: time="2025-09-12T22:30:04.312836227Z" level=info msg="Container 27cefac08ae6cdbffdc5c045cd38ec1c76c7aead6ad1352bd69b6399f54e3dfa: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:30:04.344900 containerd[1508]: time="2025-09-12T22:30:04.344830229Z" level=info msg="CreateContainer within sandbox \"193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"27cefac08ae6cdbffdc5c045cd38ec1c76c7aead6ad1352bd69b6399f54e3dfa\"" Sep 12 22:30:04.346815 containerd[1508]: time="2025-09-12T22:30:04.346188333Z" level=info msg="StartContainer for \"27cefac08ae6cdbffdc5c045cd38ec1c76c7aead6ad1352bd69b6399f54e3dfa\"" Sep 12 22:30:04.348991 containerd[1508]: time="2025-09-12T22:30:04.348804259Z" level=info msg="connecting to shim 
27cefac08ae6cdbffdc5c045cd38ec1c76c7aead6ad1352bd69b6399f54e3dfa" address="unix:///run/containerd/s/2de147cda23aa4f5d42aca6029e9b042b671c9db6e809afcab5e0f5ab24447fb" protocol=ttrpc version=3 Sep 12 22:30:04.382861 systemd[1]: Started cri-containerd-27cefac08ae6cdbffdc5c045cd38ec1c76c7aead6ad1352bd69b6399f54e3dfa.scope - libcontainer container 27cefac08ae6cdbffdc5c045cd38ec1c76c7aead6ad1352bd69b6399f54e3dfa. Sep 12 22:30:04.504404 containerd[1508]: time="2025-09-12T22:30:04.504357915Z" level=info msg="StartContainer for \"27cefac08ae6cdbffdc5c045cd38ec1c76c7aead6ad1352bd69b6399f54e3dfa\" returns successfully" Sep 12 22:30:04.758050 kubelet[2658]: I0912 22:30:04.757695 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:30:04.948025 systemd[1]: Started sshd@8-10.0.0.148:22-10.0.0.1:33494.service - OpenSSH per-connection server daemon (10.0.0.1:33494). Sep 12 22:30:05.025136 sshd[5094]: Accepted publickey for core from 10.0.0.1 port 33494 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:30:05.026997 sshd-session[5094]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:30:05.032742 systemd-logind[1485]: New session 9 of user core. Sep 12 22:30:05.037846 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 12 22:30:05.287496 sshd[5097]: Connection closed by 10.0.0.1 port 33494 Sep 12 22:30:05.287363 sshd-session[5094]: pam_unix(sshd:session): session closed for user core Sep 12 22:30:05.293604 systemd[1]: sshd@8-10.0.0.148:22-10.0.0.1:33494.service: Deactivated successfully. Sep 12 22:30:05.295348 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 22:30:05.297842 systemd-logind[1485]: Session 9 logged out. Waiting for processes to exit. Sep 12 22:30:05.299456 systemd-logind[1485]: Removed session 9. 
Sep 12 22:30:06.414073 containerd[1508]: time="2025-09-12T22:30:06.414016504Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:30:06.414833 containerd[1508]: time="2025-09-12T22:30:06.414799437Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957"
Sep 12 22:30:06.415716 containerd[1508]: time="2025-09-12T22:30:06.415667892Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:30:06.417740 containerd[1508]: time="2025-09-12T22:30:06.417707406Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:30:06.418522 containerd[1508]: time="2025-09-12T22:30:06.418483899Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 2.118082491s"
Sep 12 22:30:06.418522 containerd[1508]: time="2025-09-12T22:30:06.418517300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\""
Sep 12 22:30:06.419666 containerd[1508]: time="2025-09-12T22:30:06.419475396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 12 22:30:06.430008 containerd[1508]: time="2025-09-12T22:30:06.429951692Z" level=info msg="CreateContainer within sandbox \"28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Sep 12 22:30:06.441288 containerd[1508]: time="2025-09-12T22:30:06.441244202Z" level=info msg="Container 19e03a4bd343cd0b3fb0bdad84c6b0855c29b2dbb26e3a427c4e46884cf97c5b: CDI devices from CRI Config.CDIDevices: []"
Sep 12 22:30:06.456269 containerd[1508]: time="2025-09-12T22:30:06.456224774Z" level=info msg="CreateContainer within sandbox \"28101259e35f7af7d95bc0446414b44826191075ee68b3bff6e1ce75171f350f\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"19e03a4bd343cd0b3fb0bdad84c6b0855c29b2dbb26e3a427c4e46884cf97c5b\""
Sep 12 22:30:06.456942 containerd[1508]: time="2025-09-12T22:30:06.456878345Z" level=info msg="StartContainer for \"19e03a4bd343cd0b3fb0bdad84c6b0855c29b2dbb26e3a427c4e46884cf97c5b\""
Sep 12 22:30:06.457951 containerd[1508]: time="2025-09-12T22:30:06.457926842Z" level=info msg="connecting to shim 19e03a4bd343cd0b3fb0bdad84c6b0855c29b2dbb26e3a427c4e46884cf97c5b" address="unix:///run/containerd/s/257f494e54e8717a2d174b2157f30a041f9cfa2a05c58c4b564acf1564461ff1" protocol=ttrpc version=3
Sep 12 22:30:06.478845 systemd[1]: Started cri-containerd-19e03a4bd343cd0b3fb0bdad84c6b0855c29b2dbb26e3a427c4e46884cf97c5b.scope - libcontainer container 19e03a4bd343cd0b3fb0bdad84c6b0855c29b2dbb26e3a427c4e46884cf97c5b.
Sep 12 22:30:06.525111 containerd[1508]: time="2025-09-12T22:30:06.524665485Z" level=info msg="StartContainer for \"19e03a4bd343cd0b3fb0bdad84c6b0855c29b2dbb26e3a427c4e46884cf97c5b\" returns successfully"
Sep 12 22:30:06.787783 kubelet[2658]: I0912 22:30:06.787651 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-57f645bfdb-m9hn4" podStartSLOduration=28.099150508 podStartE2EDuration="31.787634547s" podCreationTimestamp="2025-09-12 22:29:35 +0000 UTC" firstStartedPulling="2025-09-12 22:29:58.951079938 +0000 UTC m=+39.509338639" lastFinishedPulling="2025-09-12 22:30:02.639563937 +0000 UTC m=+43.197822678" observedRunningTime="2025-09-12 22:30:03.767827463 +0000 UTC m=+44.326086204" watchObservedRunningTime="2025-09-12 22:30:06.787634547 +0000 UTC m=+47.345893288"
Sep 12 22:30:06.788965 kubelet[2658]: I0912 22:30:06.788884 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-76fdc566d5-xfzhc" podStartSLOduration=23.217331459 podStartE2EDuration="27.788873768s" podCreationTimestamp="2025-09-12 22:29:39 +0000 UTC" firstStartedPulling="2025-09-12 22:30:01.847781644 +0000 UTC m=+42.406040385" lastFinishedPulling="2025-09-12 22:30:06.419323953 +0000 UTC m=+46.977582694" observedRunningTime="2025-09-12 22:30:06.786392286 +0000 UTC m=+47.344651027" watchObservedRunningTime="2025-09-12 22:30:06.788873768 +0000 UTC m=+47.347132509"
Sep 12 22:30:07.777166 kubelet[2658]: I0912 22:30:07.777124 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 22:30:07.951408 containerd[1508]: time="2025-09-12T22:30:07.951086019Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:30:07.951864 containerd[1508]: time="2025-09-12T22:30:07.951833151Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208"
Sep 12 22:30:07.952512 containerd[1508]: time="2025-09-12T22:30:07.952481562Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:30:07.954540 containerd[1508]: time="2025-09-12T22:30:07.954500235Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:30:07.955103 containerd[1508]: time="2025-09-12T22:30:07.955070325Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.535565089s"
Sep 12 22:30:07.955161 containerd[1508]: time="2025-09-12T22:30:07.955103685Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\""
Sep 12 22:30:07.957083 containerd[1508]: time="2025-09-12T22:30:07.957061038Z" level=info msg="CreateContainer within sandbox \"193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 12 22:30:07.963690 containerd[1508]: time="2025-09-12T22:30:07.962861453Z" level=info msg="Container 7fc36ce8d72c10ca667c93ad74266f95f4d155cd5a215ed5ac2a944b0bfd403f: CDI devices from CRI Config.CDIDevices: []"
Sep 12 22:30:07.980666 containerd[1508]: time="2025-09-12T22:30:07.980638346Z" level=info msg="CreateContainer within sandbox \"193b1baf32839605c92a0be24969662d62c29040ecae25d54eadc7d8fc14d8e5\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"7fc36ce8d72c10ca667c93ad74266f95f4d155cd5a215ed5ac2a944b0bfd403f\""
Sep 12 22:30:07.981162 containerd[1508]: time="2025-09-12T22:30:07.981134874Z" level=info msg="StartContainer for \"7fc36ce8d72c10ca667c93ad74266f95f4d155cd5a215ed5ac2a944b0bfd403f\""
Sep 12 22:30:07.982523 containerd[1508]: time="2025-09-12T22:30:07.982501496Z" level=info msg="connecting to shim 7fc36ce8d72c10ca667c93ad74266f95f4d155cd5a215ed5ac2a944b0bfd403f" address="unix:///run/containerd/s/2de147cda23aa4f5d42aca6029e9b042b671c9db6e809afcab5e0f5ab24447fb" protocol=ttrpc version=3
Sep 12 22:30:08.014877 systemd[1]: Started cri-containerd-7fc36ce8d72c10ca667c93ad74266f95f4d155cd5a215ed5ac2a944b0bfd403f.scope - libcontainer container 7fc36ce8d72c10ca667c93ad74266f95f4d155cd5a215ed5ac2a944b0bfd403f.
Sep 12 22:30:08.126999 containerd[1508]: time="2025-09-12T22:30:08.126837472Z" level=info msg="StartContainer for \"7fc36ce8d72c10ca667c93ad74266f95f4d155cd5a215ed5ac2a944b0bfd403f\" returns successfully"
Sep 12 22:30:08.608506 kubelet[2658]: I0912 22:30:08.608448 2658 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 12 22:30:08.608506 kubelet[2658]: I0912 22:30:08.608493 2658 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 12 22:30:08.804504 kubelet[2658]: I0912 22:30:08.804385 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-69gts" podStartSLOduration=22.858894908 podStartE2EDuration="30.804368686s" podCreationTimestamp="2025-09-12 22:29:38 +0000 UTC" firstStartedPulling="2025-09-12 22:30:00.010441641 +0000 UTC m=+40.568700382" lastFinishedPulling="2025-09-12 22:30:07.955915419 +0000 UTC m=+48.514174160" observedRunningTime="2025-09-12 22:30:08.802947543 +0000 UTC m=+49.361206284" watchObservedRunningTime="2025-09-12 22:30:08.804368686 +0000 UTC m=+49.362627427"
Sep 12 22:30:09.947420 kubelet[2658]: I0912 22:30:09.947321 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 22:30:10.019743 containerd[1508]: time="2025-09-12T22:30:10.019696021Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19e03a4bd343cd0b3fb0bdad84c6b0855c29b2dbb26e3a427c4e46884cf97c5b\" id:\"8628bc920ddf7dbfc8949f63f21d3ca1b262c879ea20c2d290073172aa80eb57\" pid:5217 exited_at:{seconds:1757716210 nanos:12016902}"
Sep 12 22:30:10.093490 containerd[1508]: time="2025-09-12T22:30:10.093115162Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19e03a4bd343cd0b3fb0bdad84c6b0855c29b2dbb26e3a427c4e46884cf97c5b\" id:\"53fb3feddd7bac2d5b4f478382ee1556f39836345fe8d01219e5b6d680b95d2d\" pid:5239 exited_at:{seconds:1757716210 nanos:92869718}"
Sep 12 22:30:10.299250 systemd[1]: Started sshd@9-10.0.0.148:22-10.0.0.1:37568.service - OpenSSH per-connection server daemon (10.0.0.1:37568).
Sep 12 22:30:10.361239 sshd[5251]: Accepted publickey for core from 10.0.0.1 port 37568 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:10.363014 sshd-session[5251]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:10.367024 systemd-logind[1485]: New session 10 of user core.
Sep 12 22:30:10.379876 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 12 22:30:10.615023 sshd[5254]: Connection closed by 10.0.0.1 port 37568
Sep 12 22:30:10.615899 sshd-session[5251]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:10.626999 systemd[1]: sshd@9-10.0.0.148:22-10.0.0.1:37568.service: Deactivated successfully.
Sep 12 22:30:10.628773 systemd[1]: session-10.scope: Deactivated successfully.
Sep 12 22:30:10.629548 systemd-logind[1485]: Session 10 logged out. Waiting for processes to exit.
Sep 12 22:30:10.631630 systemd[1]: Started sshd@10-10.0.0.148:22-10.0.0.1:37584.service - OpenSSH per-connection server daemon (10.0.0.1:37584).
Sep 12 22:30:10.633160 systemd-logind[1485]: Removed session 10.
Sep 12 22:30:10.691944 sshd[5268]: Accepted publickey for core from 10.0.0.1 port 37584 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:10.693837 sshd-session[5268]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:10.698160 systemd-logind[1485]: New session 11 of user core.
Sep 12 22:30:10.704837 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 12 22:30:10.941382 sshd[5271]: Connection closed by 10.0.0.1 port 37584
Sep 12 22:30:10.941468 sshd-session[5268]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:10.954095 systemd[1]: sshd@10-10.0.0.148:22-10.0.0.1:37584.service: Deactivated successfully.
Sep 12 22:30:10.957369 systemd[1]: session-11.scope: Deactivated successfully.
Sep 12 22:30:10.959798 systemd-logind[1485]: Session 11 logged out. Waiting for processes to exit.
Sep 12 22:30:10.965852 systemd[1]: Started sshd@11-10.0.0.148:22-10.0.0.1:37600.service - OpenSSH per-connection server daemon (10.0.0.1:37600).
Sep 12 22:30:10.967773 systemd-logind[1485]: Removed session 11.
Sep 12 22:30:11.026123 sshd[5282]: Accepted publickey for core from 10.0.0.1 port 37600 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:11.027766 sshd-session[5282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:11.033291 systemd-logind[1485]: New session 12 of user core.
Sep 12 22:30:11.041964 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 12 22:30:11.220997 sshd[5285]: Connection closed by 10.0.0.1 port 37600
Sep 12 22:30:11.221258 sshd-session[5282]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:11.225573 systemd-logind[1485]: Session 12 logged out. Waiting for processes to exit.
Sep 12 22:30:11.226595 systemd[1]: sshd@11-10.0.0.148:22-10.0.0.1:37600.service: Deactivated successfully.
Sep 12 22:30:11.228397 systemd[1]: session-12.scope: Deactivated successfully.
Sep 12 22:30:11.230180 systemd-logind[1485]: Removed session 12.
Sep 12 22:30:16.234916 systemd[1]: Started sshd@12-10.0.0.148:22-10.0.0.1:37612.service - OpenSSH per-connection server daemon (10.0.0.1:37612).
Sep 12 22:30:16.281298 sshd[5311]: Accepted publickey for core from 10.0.0.1 port 37612 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:16.282644 sshd-session[5311]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:16.286770 systemd-logind[1485]: New session 13 of user core.
Sep 12 22:30:16.294824 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 12 22:30:16.443723 sshd[5314]: Connection closed by 10.0.0.1 port 37612
Sep 12 22:30:16.445840 sshd-session[5311]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:16.449612 systemd[1]: sshd@12-10.0.0.148:22-10.0.0.1:37612.service: Deactivated successfully.
Sep 12 22:30:16.451994 systemd[1]: session-13.scope: Deactivated successfully.
Sep 12 22:30:16.456689 systemd-logind[1485]: Session 13 logged out. Waiting for processes to exit.
Sep 12 22:30:16.457809 systemd-logind[1485]: Removed session 13.
Sep 12 22:30:17.699980 containerd[1508]: time="2025-09-12T22:30:17.699818777Z" level=info msg="TaskExit event in podsandbox handler container_id:\"749073db026802d3dab2cd9e3cb9863e8e74300f946db914aa638e99720e84fb\" id:\"b668fefbf0bf5167ee729a49d5aa1e14704794de44e9af937348a9e3e8c02583\" pid:5338 exited_at:{seconds:1757716217 nanos:699527413}"
Sep 12 22:30:21.455408 systemd[1]: Started sshd@13-10.0.0.148:22-10.0.0.1:38746.service - OpenSSH per-connection server daemon (10.0.0.1:38746).
Sep 12 22:30:21.529375 sshd[5355]: Accepted publickey for core from 10.0.0.1 port 38746 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:21.531049 sshd-session[5355]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:21.537354 systemd-logind[1485]: New session 14 of user core.
Sep 12 22:30:21.541831 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 12 22:30:21.795246 sshd[5358]: Connection closed by 10.0.0.1 port 38746
Sep 12 22:30:21.795841 sshd-session[5355]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:21.801229 systemd-logind[1485]: Session 14 logged out. Waiting for processes to exit.
Sep 12 22:30:21.801366 systemd[1]: sshd@13-10.0.0.148:22-10.0.0.1:38746.service: Deactivated successfully.
Sep 12 22:30:21.803191 systemd[1]: session-14.scope: Deactivated successfully.
Sep 12 22:30:21.806195 systemd-logind[1485]: Removed session 14.
Sep 12 22:30:22.459253 kubelet[2658]: I0912 22:30:22.459199 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 22:30:26.206494 containerd[1508]: time="2025-09-12T22:30:26.206452854Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9f303bb19fa771f4e8b4bff1be2a0300467421f9cc178b582e74d24806ee90bd\" id:\"33b2f484dabca317646034ea03ebafd1232488e4260a242cda9c4139f5ec3e5a\" pid:5388 exited_at:{seconds:1757716226 nanos:205955408}"
Sep 12 22:30:26.810005 systemd[1]: Started sshd@14-10.0.0.148:22-10.0.0.1:38752.service - OpenSSH per-connection server daemon (10.0.0.1:38752).
Sep 12 22:30:26.876599 sshd[5404]: Accepted publickey for core from 10.0.0.1 port 38752 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:26.877894 sshd-session[5404]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:26.883604 systemd-logind[1485]: New session 15 of user core.
Sep 12 22:30:26.894832 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 12 22:30:27.046918 sshd[5407]: Connection closed by 10.0.0.1 port 38752
Sep 12 22:30:27.047542 sshd-session[5404]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:27.051340 systemd[1]: sshd@14-10.0.0.148:22-10.0.0.1:38752.service: Deactivated successfully.
Sep 12 22:30:27.053179 systemd[1]: session-15.scope: Deactivated successfully.
Sep 12 22:30:27.053865 systemd-logind[1485]: Session 15 logged out. Waiting for processes to exit.
Sep 12 22:30:27.055192 systemd-logind[1485]: Removed session 15.
Sep 12 22:30:28.593939 kubelet[2658]: I0912 22:30:28.593732 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 22:30:32.058620 systemd[1]: Started sshd@15-10.0.0.148:22-10.0.0.1:35516.service - OpenSSH per-connection server daemon (10.0.0.1:35516).
Sep 12 22:30:32.126107 sshd[5431]: Accepted publickey for core from 10.0.0.1 port 35516 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:32.127837 sshd-session[5431]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:32.131827 systemd-logind[1485]: New session 16 of user core.
Sep 12 22:30:32.144836 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 12 22:30:32.329811 sshd[5434]: Connection closed by 10.0.0.1 port 35516
Sep 12 22:30:32.330261 sshd-session[5431]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:32.342231 systemd[1]: sshd@15-10.0.0.148:22-10.0.0.1:35516.service: Deactivated successfully.
Sep 12 22:30:32.345055 systemd[1]: session-16.scope: Deactivated successfully.
Sep 12 22:30:32.346439 systemd-logind[1485]: Session 16 logged out. Waiting for processes to exit.
Sep 12 22:30:32.348661 systemd[1]: Started sshd@16-10.0.0.148:22-10.0.0.1:35526.service - OpenSSH per-connection server daemon (10.0.0.1:35526).
Sep 12 22:30:32.349361 systemd-logind[1485]: Removed session 16.
Sep 12 22:30:32.409204 sshd[5448]: Accepted publickey for core from 10.0.0.1 port 35526 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:32.410293 sshd-session[5448]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:32.413955 systemd-logind[1485]: New session 17 of user core.
Sep 12 22:30:32.423850 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 12 22:30:32.635099 sshd[5451]: Connection closed by 10.0.0.1 port 35526
Sep 12 22:30:32.635430 sshd-session[5448]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:32.643263 systemd[1]: sshd@16-10.0.0.148:22-10.0.0.1:35526.service: Deactivated successfully.
Sep 12 22:30:32.645497 systemd[1]: session-17.scope: Deactivated successfully.
Sep 12 22:30:32.646239 systemd-logind[1485]: Session 17 logged out. Waiting for processes to exit.
Sep 12 22:30:32.648442 systemd[1]: Started sshd@17-10.0.0.148:22-10.0.0.1:35536.service - OpenSSH per-connection server daemon (10.0.0.1:35536).
Sep 12 22:30:32.649280 systemd-logind[1485]: Removed session 17.
Sep 12 22:30:32.728798 sshd[5462]: Accepted publickey for core from 10.0.0.1 port 35536 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:32.730216 sshd-session[5462]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:32.734358 systemd-logind[1485]: New session 18 of user core.
Sep 12 22:30:32.739809 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 12 22:30:34.476761 sshd[5465]: Connection closed by 10.0.0.1 port 35536
Sep 12 22:30:34.477289 sshd-session[5462]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:34.486945 systemd[1]: sshd@17-10.0.0.148:22-10.0.0.1:35536.service: Deactivated successfully.
Sep 12 22:30:34.489096 systemd[1]: session-18.scope: Deactivated successfully.
Sep 12 22:30:34.489487 systemd[1]: session-18.scope: Consumed 536ms CPU time, 74.9M memory peak.
Sep 12 22:30:34.490082 systemd-logind[1485]: Session 18 logged out. Waiting for processes to exit.
Sep 12 22:30:34.497956 systemd[1]: Started sshd@18-10.0.0.148:22-10.0.0.1:35540.service - OpenSSH per-connection server daemon (10.0.0.1:35540).
Sep 12 22:30:34.498511 systemd-logind[1485]: Removed session 18.
Sep 12 22:30:34.562737 sshd[5482]: Accepted publickey for core from 10.0.0.1 port 35540 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:34.563996 sshd-session[5482]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:34.569689 systemd-logind[1485]: New session 19 of user core.
Sep 12 22:30:34.576844 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 12 22:30:34.893016 sshd[5487]: Connection closed by 10.0.0.1 port 35540
Sep 12 22:30:34.893857 sshd-session[5482]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:34.907866 systemd[1]: sshd@18-10.0.0.148:22-10.0.0.1:35540.service: Deactivated successfully.
Sep 12 22:30:34.911697 systemd[1]: session-19.scope: Deactivated successfully.
Sep 12 22:30:34.913526 systemd-logind[1485]: Session 19 logged out. Waiting for processes to exit.
Sep 12 22:30:34.916954 systemd[1]: Started sshd@19-10.0.0.148:22-10.0.0.1:35554.service - OpenSSH per-connection server daemon (10.0.0.1:35554).
Sep 12 22:30:34.918396 systemd-logind[1485]: Removed session 19.
Sep 12 22:30:34.997803 sshd[5498]: Accepted publickey for core from 10.0.0.1 port 35554 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:34.999103 sshd-session[5498]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:35.006183 systemd-logind[1485]: New session 20 of user core.
Sep 12 22:30:35.015881 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 12 22:30:35.200183 sshd[5501]: Connection closed by 10.0.0.1 port 35554
Sep 12 22:30:35.200932 sshd-session[5498]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:35.205393 systemd[1]: sshd@19-10.0.0.148:22-10.0.0.1:35554.service: Deactivated successfully.
Sep 12 22:30:35.207136 systemd[1]: session-20.scope: Deactivated successfully.
Sep 12 22:30:35.209497 systemd-logind[1485]: Session 20 logged out. Waiting for processes to exit.
Sep 12 22:30:35.210793 systemd-logind[1485]: Removed session 20.
Sep 12 22:30:39.979533 containerd[1508]: time="2025-09-12T22:30:39.979351533Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19e03a4bd343cd0b3fb0bdad84c6b0855c29b2dbb26e3a427c4e46884cf97c5b\" id:\"395c70aad4e2bdcdd37ffedd2ee099bc72ea568f060fd2e6cfc725819b3ca733\" pid:5526 exited_at:{seconds:1757716239 nanos:979108410}"
Sep 12 22:30:40.214866 systemd[1]: Started sshd@20-10.0.0.148:22-10.0.0.1:34488.service - OpenSSH per-connection server daemon (10.0.0.1:34488).
Sep 12 22:30:40.267770 sshd[5540]: Accepted publickey for core from 10.0.0.1 port 34488 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:40.268798 sshd-session[5540]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:40.272437 systemd-logind[1485]: New session 21 of user core.
Sep 12 22:30:40.283808 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 12 22:30:40.402148 sshd[5543]: Connection closed by 10.0.0.1 port 34488
Sep 12 22:30:40.402487 sshd-session[5540]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:40.406209 systemd[1]: sshd@20-10.0.0.148:22-10.0.0.1:34488.service: Deactivated successfully.
Sep 12 22:30:40.408608 systemd[1]: session-21.scope: Deactivated successfully.
Sep 12 22:30:40.409878 systemd-logind[1485]: Session 21 logged out. Waiting for processes to exit.
Sep 12 22:30:40.411614 systemd-logind[1485]: Removed session 21.
Sep 12 22:30:45.413758 systemd[1]: Started sshd@21-10.0.0.148:22-10.0.0.1:34502.service - OpenSSH per-connection server daemon (10.0.0.1:34502).
Sep 12 22:30:45.472068 sshd[5559]: Accepted publickey for core from 10.0.0.1 port 34502 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:45.473544 sshd-session[5559]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:45.482436 systemd-logind[1485]: New session 22 of user core.
Sep 12 22:30:45.487834 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 12 22:30:45.615987 sshd[5562]: Connection closed by 10.0.0.1 port 34502
Sep 12 22:30:45.616311 sshd-session[5559]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:45.619597 systemd[1]: sshd@21-10.0.0.148:22-10.0.0.1:34502.service: Deactivated successfully.
Sep 12 22:30:45.621305 systemd[1]: session-22.scope: Deactivated successfully.
Sep 12 22:30:45.622101 systemd-logind[1485]: Session 22 logged out. Waiting for processes to exit.
Sep 12 22:30:45.623185 systemd-logind[1485]: Removed session 22.
Sep 12 22:30:47.626382 containerd[1508]: time="2025-09-12T22:30:47.626292259Z" level=info msg="TaskExit event in podsandbox handler container_id:\"749073db026802d3dab2cd9e3cb9863e8e74300f946db914aa638e99720e84fb\" id:\"9df562c49648c50da43b1c5019166c6835a445070ff03cba814bb5eb31e9bb49\" pid:5587 exited_at:{seconds:1757716247 nanos:625958136}"
Sep 12 22:30:50.630917 systemd[1]: Started sshd@22-10.0.0.148:22-10.0.0.1:32832.service - OpenSSH per-connection server daemon (10.0.0.1:32832).
Sep 12 22:30:50.692482 sshd[5599]: Accepted publickey for core from 10.0.0.1 port 32832 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:50.694007 sshd-session[5599]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:50.698574 systemd-logind[1485]: New session 23 of user core.
Sep 12 22:30:50.710941 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 12 22:30:50.829809 sshd[5602]: Connection closed by 10.0.0.1 port 32832
Sep 12 22:30:50.830321 sshd-session[5599]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:50.833720 systemd[1]: sshd@22-10.0.0.148:22-10.0.0.1:32832.service: Deactivated successfully.
Sep 12 22:30:50.835454 systemd[1]: session-23.scope: Deactivated successfully.
Sep 12 22:30:50.836252 systemd-logind[1485]: Session 23 logged out. Waiting for processes to exit.
Sep 12 22:30:50.837845 systemd-logind[1485]: Removed session 23.
Sep 12 22:30:53.831241 containerd[1508]: time="2025-09-12T22:30:53.831197740Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19e03a4bd343cd0b3fb0bdad84c6b0855c29b2dbb26e3a427c4e46884cf97c5b\" id:\"c7ce80aa6f8a280043a969ca163978e89a1fb4bd63858b700b6d63170d61ac3e\" pid:5627 exited_at:{seconds:1757716253 nanos:830614174}"
Sep 12 22:30:54.935962 containerd[1508]: time="2025-09-12T22:30:54.935916117Z" level=info msg="TaskExit event in podsandbox handler container_id:\"749073db026802d3dab2cd9e3cb9863e8e74300f946db914aa638e99720e84fb\" id:\"751ec93474c74a25ebbd295afb583daab631bd8286ab348ea2b1b81486910cdd\" pid:5650 exited_at:{seconds:1757716254 nanos:934363060}"
Sep 12 22:30:55.852893 systemd[1]: Started sshd@23-10.0.0.148:22-10.0.0.1:32848.service - OpenSSH per-connection server daemon (10.0.0.1:32848).
Sep 12 22:30:55.898046 sshd[5664]: Accepted publickey for core from 10.0.0.1 port 32848 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI
Sep 12 22:30:55.899062 sshd-session[5664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:30:55.902739 systemd-logind[1485]: New session 24 of user core.
Sep 12 22:30:55.917835 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 12 22:30:56.040505 sshd[5667]: Connection closed by 10.0.0.1 port 32848
Sep 12 22:30:56.040846 sshd-session[5664]: pam_unix(sshd:session): session closed for user core
Sep 12 22:30:56.044751 systemd[1]: sshd@23-10.0.0.148:22-10.0.0.1:32848.service: Deactivated successfully.
Sep 12 22:30:56.046948 systemd[1]: session-24.scope: Deactivated successfully.
Sep 12 22:30:56.048236 systemd-logind[1485]: Session 24 logged out. Waiting for processes to exit.
Sep 12 22:30:56.049595 systemd-logind[1485]: Removed session 24.
Sep 12 22:30:56.164423 containerd[1508]: time="2025-09-12T22:30:56.163979882Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9f303bb19fa771f4e8b4bff1be2a0300467421f9cc178b582e74d24806ee90bd\" id:\"c5702848d8217f73718db64da2959a1975cc788d430d95990f6186e837fc583a\" pid:5691 exited_at:{seconds:1757716256 nanos:163710199}"