Sep 12 22:10:38.749562 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 12 22:10:38.749585 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Sep 12 20:38:46 -00 2025
Sep 12 22:10:38.749595 kernel: KASLR enabled
Sep 12 22:10:38.749600 kernel: efi: EFI v2.7 by EDK II
Sep 12 22:10:38.749606 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb228018 ACPI 2.0=0xdb9b8018 RNG=0xdb9b8a18 MEMRESERVE=0xdb21fd18
Sep 12 22:10:38.749611 kernel: random: crng init done
Sep 12 22:10:38.749618 kernel: Kernel is locked down from EFI Secure Boot; see man kernel_lockdown.7
Sep 12 22:10:38.749624 kernel: secureboot: Secure boot enabled
Sep 12 22:10:38.749630 kernel: ACPI: Early table checksum verification disabled
Sep 12 22:10:38.749637 kernel: ACPI: RSDP 0x00000000DB9B8018 000024 (v02 BOCHS )
Sep 12 22:10:38.749643 kernel: ACPI: XSDT 0x00000000DB9B8F18 000064 (v01 BOCHS BXPC 00000001 01000013)
Sep 12 22:10:38.749649 kernel: ACPI: FACP 0x00000000DB9B8B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:10:38.749655 kernel: ACPI: DSDT 0x00000000DB904018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:10:38.749661 kernel: ACPI: APIC 0x00000000DB9B8C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:10:38.749669 kernel: ACPI: PPTT 0x00000000DB9B8098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:10:38.749676 kernel: ACPI: GTDT 0x00000000DB9B8818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:10:38.749683 kernel: ACPI: MCFG 0x00000000DB9B8A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:10:38.749689 kernel: ACPI: SPCR 0x00000000DB9B8918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:10:38.749695 kernel: ACPI: DBG2 0x00000000DB9B8998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:10:38.749710 kernel: ACPI: IORT 0x00000000DB9B8198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:10:38.749726 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600
Sep 12 22:10:38.749738 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 12 22:10:38.749744 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff]
Sep 12 22:10:38.749750 kernel: NODE_DATA(0) allocated [mem 0xdc737a00-0xdc73efff]
Sep 12 22:10:38.749756 kernel: Zone ranges:
Sep 12 22:10:38.749764 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff]
Sep 12 22:10:38.749770 kernel: DMA32 empty
Sep 12 22:10:38.749776 kernel: Normal empty
Sep 12 22:10:38.749782 kernel: Device empty
Sep 12 22:10:38.749788 kernel: Movable zone start for each node
Sep 12 22:10:38.749794 kernel: Early memory node ranges
Sep 12 22:10:38.749801 kernel: node 0: [mem 0x0000000040000000-0x00000000dbb4ffff]
Sep 12 22:10:38.749807 kernel: node 0: [mem 0x00000000dbb50000-0x00000000dbe7ffff]
Sep 12 22:10:38.749813 kernel: node 0: [mem 0x00000000dbe80000-0x00000000dbe9ffff]
Sep 12 22:10:38.749821 kernel: node 0: [mem 0x00000000dbea0000-0x00000000dbedffff]
Sep 12 22:10:38.749830 kernel: node 0: [mem 0x00000000dbee0000-0x00000000dbf1ffff]
Sep 12 22:10:38.749836 kernel: node 0: [mem 0x00000000dbf20000-0x00000000dbf6ffff]
Sep 12 22:10:38.749844 kernel: node 0: [mem 0x00000000dbf70000-0x00000000dcbfffff]
Sep 12 22:10:38.749850 kernel: node 0: [mem 0x00000000dcc00000-0x00000000dcfdffff]
Sep 12 22:10:38.749857 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff]
Sep 12 22:10:38.749870 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff]
Sep 12 22:10:38.749881 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges
Sep 12 22:10:38.749889 kernel: cma: Reserved 16 MiB at 0x00000000d7a00000 on node -1
Sep 12 22:10:38.749898 kernel: psci: probing for conduit method from ACPI.
Sep 12 22:10:38.749906 kernel: psci: PSCIv1.1 detected in firmware.
Sep 12 22:10:38.749990 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 12 22:10:38.750000 kernel: psci: Trusted OS migration not required
Sep 12 22:10:38.750006 kernel: psci: SMC Calling Convention v1.1
Sep 12 22:10:38.750013 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 12 22:10:38.750019 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 12 22:10:38.750026 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 12 22:10:38.750033 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Sep 12 22:10:38.750040 kernel: Detected PIPT I-cache on CPU0
Sep 12 22:10:38.750049 kernel: CPU features: detected: GIC system register CPU interface
Sep 12 22:10:38.750055 kernel: CPU features: detected: Spectre-v4
Sep 12 22:10:38.750062 kernel: CPU features: detected: Spectre-BHB
Sep 12 22:10:38.750068 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 12 22:10:38.750075 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 12 22:10:38.750082 kernel: CPU features: detected: ARM erratum 1418040
Sep 12 22:10:38.750089 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 12 22:10:38.750095 kernel: alternatives: applying boot alternatives
Sep 12 22:10:38.750103 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=319fa5fb212e5dd8bf766d2f9f0bbb61d6aa6c81f2813f4b5b49defba0af2b2f
Sep 12 22:10:38.750110 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 22:10:38.750117 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 12 22:10:38.750125 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 22:10:38.750132 kernel: Fallback order for Node 0: 0
Sep 12 22:10:38.750139 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072
Sep 12 22:10:38.750145 kernel: Policy zone: DMA
Sep 12 22:10:38.750152 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 22:10:38.750158 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB
Sep 12 22:10:38.750165 kernel: software IO TLB: area num 4.
Sep 12 22:10:38.750172 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB
Sep 12 22:10:38.750178 kernel: software IO TLB: mapped [mem 0x00000000db504000-0x00000000db904000] (4MB)
Sep 12 22:10:38.750185 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 12 22:10:38.750192 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 22:10:38.750199 kernel: rcu: RCU event tracing is enabled.
Sep 12 22:10:38.750216 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 12 22:10:38.750226 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 22:10:38.750233 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 22:10:38.750247 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
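The kernel command line logged above is what wires up Flatcar's verity-protected /usr: mount.usr points at /dev/mapper/usr, verity.usr selects the backing partition by PARTUUID, and verity.usrhash carries the expected dm-verity root hash that verity-setup checks later in this boot. As a hedged illustration only (not part of the captured log), the same values can be cross-checked from a booted system with standard tools:

    # Command line as seen by the running kernel (should match the entry logged above)
    cat /proc/cmdline
    # dm-verity mapping backing /usr; the root hash in the table should equal verity.usrhash
    sudo veritysetup status usr
    sudo dmsetup table usr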
Sep 12 22:10:38.750254 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Sep 12 22:10:38.750261 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 12 22:10:38.750268 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 12 22:10:38.750275 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Sep 12 22:10:38.750281 kernel: GICv3: 256 SPIs implemented Sep 12 22:10:38.750287 kernel: GICv3: 0 Extended SPIs implemented Sep 12 22:10:38.750294 kernel: Root IRQ handler: gic_handle_irq Sep 12 22:10:38.750301 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Sep 12 22:10:38.750308 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Sep 12 22:10:38.750315 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Sep 12 22:10:38.750321 kernel: ITS [mem 0x08080000-0x0809ffff] Sep 12 22:10:38.750328 kernel: ITS@0x0000000008080000: allocated 8192 Devices @40110000 (indirect, esz 8, psz 64K, shr 1) Sep 12 22:10:38.750335 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @40120000 (flat, esz 8, psz 64K, shr 1) Sep 12 22:10:38.750349 kernel: GICv3: using LPI property table @0x0000000040130000 Sep 12 22:10:38.750356 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040140000 Sep 12 22:10:38.750362 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 12 22:10:38.750369 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 12 22:10:38.750376 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Sep 12 22:10:38.750383 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Sep 12 22:10:38.750391 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Sep 12 22:10:38.750397 kernel: arm-pv: using stolen time PV Sep 12 22:10:38.750405 kernel: Console: colour dummy device 80x25 Sep 12 22:10:38.750412 kernel: ACPI: Core revision 20240827 Sep 12 22:10:38.750419 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Sep 12 22:10:38.750425 kernel: pid_max: default: 32768 minimum: 301 Sep 12 22:10:38.750432 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 12 22:10:38.750439 kernel: landlock: Up and running. Sep 12 22:10:38.750445 kernel: SELinux: Initializing. Sep 12 22:10:38.750454 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 12 22:10:38.750461 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 12 22:10:38.750468 kernel: rcu: Hierarchical SRCU implementation. Sep 12 22:10:38.750475 kernel: rcu: Max phase no-delay instances is 400. Sep 12 22:10:38.750482 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Sep 12 22:10:38.750488 kernel: Remapping and enabling EFI services. Sep 12 22:10:38.750495 kernel: smp: Bringing up secondary CPUs ... 
Sep 12 22:10:38.750501 kernel: Detected PIPT I-cache on CPU1 Sep 12 22:10:38.750508 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Sep 12 22:10:38.750516 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040150000 Sep 12 22:10:38.750528 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 12 22:10:38.750534 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Sep 12 22:10:38.750543 kernel: Detected PIPT I-cache on CPU2 Sep 12 22:10:38.750550 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Sep 12 22:10:38.750557 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040160000 Sep 12 22:10:38.750564 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 12 22:10:38.750570 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Sep 12 22:10:38.750578 kernel: Detected PIPT I-cache on CPU3 Sep 12 22:10:38.750586 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Sep 12 22:10:38.750593 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040170000 Sep 12 22:10:38.750600 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 12 22:10:38.750606 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Sep 12 22:10:38.750614 kernel: smp: Brought up 1 node, 4 CPUs Sep 12 22:10:38.750621 kernel: SMP: Total of 4 processors activated. Sep 12 22:10:38.750627 kernel: CPU: All CPU(s) started at EL1 Sep 12 22:10:38.750634 kernel: CPU features: detected: 32-bit EL0 Support Sep 12 22:10:38.750641 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Sep 12 22:10:38.750650 kernel: CPU features: detected: Common not Private translations Sep 12 22:10:38.750657 kernel: CPU features: detected: CRC32 instructions Sep 12 22:10:38.750664 kernel: CPU features: detected: Enhanced Virtualization Traps Sep 12 22:10:38.750670 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Sep 12 22:10:38.750680 kernel: CPU features: detected: LSE atomic instructions Sep 12 22:10:38.750689 kernel: CPU features: detected: Privileged Access Never Sep 12 22:10:38.750695 kernel: CPU features: detected: RAS Extension Support Sep 12 22:10:38.750708 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Sep 12 22:10:38.750715 kernel: alternatives: applying system-wide alternatives Sep 12 22:10:38.750724 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3 Sep 12 22:10:38.750731 kernel: Memory: 2422372K/2572288K available (11136K kernel code, 2440K rwdata, 9068K rodata, 38976K init, 1038K bss, 127580K reserved, 16384K cma-reserved) Sep 12 22:10:38.750739 kernel: devtmpfs: initialized Sep 12 22:10:38.750746 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 12 22:10:38.750753 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Sep 12 22:10:38.750760 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Sep 12 22:10:38.750766 kernel: 0 pages in range for non-PLT usage Sep 12 22:10:38.750773 kernel: 508560 pages in range for PLT usage Sep 12 22:10:38.750780 kernel: pinctrl core: initialized pinctrl subsystem Sep 12 22:10:38.750788 kernel: SMBIOS 3.0.0 present. 
Sep 12 22:10:38.750796 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022
Sep 12 22:10:38.750803 kernel: DMI: Memory slots populated: 1/1
Sep 12 22:10:38.750810 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 22:10:38.750817 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 12 22:10:38.750824 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 12 22:10:38.750831 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 12 22:10:38.750838 kernel: audit: initializing netlink subsys (disabled)
Sep 12 22:10:38.750845 kernel: audit: type=2000 audit(0.024:1): state=initialized audit_enabled=0 res=1
Sep 12 22:10:38.750853 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 22:10:38.750860 kernel: cpuidle: using governor menu
Sep 12 22:10:38.750867 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 12 22:10:38.750874 kernel: ASID allocator initialised with 32768 entries
Sep 12 22:10:38.750885 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 22:10:38.750892 kernel: Serial: AMBA PL011 UART driver
Sep 12 22:10:38.750900 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 22:10:38.750907 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 22:10:38.750922 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 12 22:10:38.750931 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 12 22:10:38.750947 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 22:10:38.750955 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 22:10:38.750962 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 12 22:10:38.750969 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 12 22:10:38.750976 kernel: ACPI: Added _OSI(Module Device)
Sep 12 22:10:38.750983 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 22:10:38.750989 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 22:10:38.750996 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 22:10:38.751006 kernel: ACPI: Interpreter enabled
Sep 12 22:10:38.751018 kernel: ACPI: Using GIC for interrupt routing
Sep 12 22:10:38.751025 kernel: ACPI: MCFG table detected, 1 entries
Sep 12 22:10:38.751032 kernel: ACPI: CPU0 has been hot-added
Sep 12 22:10:38.751039 kernel: ACPI: CPU1 has been hot-added
Sep 12 22:10:38.751047 kernel: ACPI: CPU2 has been hot-added
Sep 12 22:10:38.751054 kernel: ACPI: CPU3 has been hot-added
Sep 12 22:10:38.751061 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 12 22:10:38.751068 kernel: printk: legacy console [ttyAMA0] enabled
Sep 12 22:10:38.751076 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 12 22:10:38.751224 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 22:10:38.751295 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 12 22:10:38.751359 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 12 22:10:38.751442 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 12 22:10:38.751501 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 12 22:10:38.751510 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 12 22:10:38.751519 kernel: PCI host bridge to bus 0000:00
Sep 12 22:10:38.751594 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 12 22:10:38.751649 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 12 22:10:38.751714 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 12 22:10:38.751771 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 12 22:10:38.751854 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Sep 12 22:10:38.751957 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 12 22:10:38.752035 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f]
Sep 12 22:10:38.752097 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]
Sep 12 22:10:38.752177 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 12 22:10:38.752243 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Sep 12 22:10:38.752311 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned
Sep 12 22:10:38.752377 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned
Sep 12 22:10:38.752456 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 12 22:10:38.752513 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 12 22:10:38.752569 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 12 22:10:38.752578 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 12 22:10:38.752586 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 12 22:10:38.752594 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 12 22:10:38.752601 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 12 22:10:38.752608 kernel: iommu: Default domain type: Translated
Sep 12 22:10:38.752617 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 12 22:10:38.752624 kernel: efivars: Registered efivars operations
Sep 12 22:10:38.752631 kernel: vgaarb: loaded
Sep 12 22:10:38.752638 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 12 22:10:38.752645 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 22:10:38.752652 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 22:10:38.752659 kernel: pnp: PnP ACPI init
Sep 12 22:10:38.752737 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Sep 12 22:10:38.752748 kernel: pnp: PnP ACPI: found 1 devices
Sep 12 22:10:38.752757 kernel: NET: Registered PF_INET protocol family
Sep 12 22:10:38.752764 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 22:10:38.752771 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 12 22:10:38.752778 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 22:10:38.752785 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 12 22:10:38.752792 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 12 22:10:38.752799 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 12 22:10:38.752806 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 22:10:38.752813 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 22:10:38.752822 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 22:10:38.752829 kernel: PCI: CLS 0 bytes, default 64
Sep 12 22:10:38.752836 kernel: kvm [1]: HYP mode not available
Sep 12 22:10:38.752843 kernel: Initialise system trusted keyrings
Sep 12 22:10:38.752849 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 12 22:10:38.752856 kernel: Key type asymmetric registered
Sep 12 22:10:38.752863 kernel: Asymmetric key parser 'x509' registered
Sep 12 22:10:38.752870 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 12 22:10:38.752876 kernel: io scheduler mq-deadline registered
Sep 12 22:10:38.752885 kernel: io scheduler kyber registered
Sep 12 22:10:38.752892 kernel: io scheduler bfq registered
Sep 12 22:10:38.752899 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Sep 12 22:10:38.752906 kernel: ACPI: button: Power Button [PWRB]
Sep 12 22:10:38.752921 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Sep 12 22:10:38.753007 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007)
Sep 12 22:10:38.753016 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 12 22:10:38.753023 kernel: thunder_xcv, ver 1.0
Sep 12 22:10:38.753030 kernel: thunder_bgx, ver 1.0
Sep 12 22:10:38.753040 kernel: nicpf, ver 1.0
Sep 12 22:10:38.753047 kernel: nicvf, ver 1.0
Sep 12 22:10:38.753119 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 12 22:10:38.753184 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-12T22:10:38 UTC (1757715038)
Sep 12 22:10:38.753194 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 12 22:10:38.753201 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 12 22:10:38.753212 kernel: watchdog: NMI not fully supported
Sep 12 22:10:38.753220 kernel: watchdog: Hard watchdog permanently disabled
Sep 12 22:10:38.753229 kernel: NET: Registered PF_INET6 protocol family
Sep 12 22:10:38.753237 kernel: Segment Routing with IPv6
Sep 12 22:10:38.753244 kernel: In-situ OAM (IOAM) with IPv6
Sep 12 22:10:38.753251 kernel: NET: Registered PF_PACKET protocol family
Sep 12 22:10:38.753258 kernel: Key type dns_resolver registered
Sep 12 22:10:38.753265 kernel: registered taskstats version 1
Sep 12 22:10:38.753272 kernel: Loading compiled-in X.509 certificates
Sep 12 22:10:38.753280 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: 2d7730e6d35b3fbd1c590cd72a2500b2380c020e'
Sep 12 22:10:38.753287 kernel: Demotion targets for Node 0: null
Sep 12 22:10:38.753296 kernel: Key type .fscrypt registered
Sep 12 22:10:38.753303 kernel: Key type fscrypt-provisioning registered
Sep 12 22:10:38.753310 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 12 22:10:38.753317 kernel: ima: Allocated hash algorithm: sha1
Sep 12 22:10:38.753324 kernel: ima: No architecture policies found
Sep 12 22:10:38.753331 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 12 22:10:38.753338 kernel: clk: Disabling unused clocks
Sep 12 22:10:38.753349 kernel: PM: genpd: Disabling unused power domains
Sep 12 22:10:38.753356 kernel: Warning: unable to open an initial console.
Sep 12 22:10:38.753366 kernel: Freeing unused kernel memory: 38976K
Sep 12 22:10:38.753373 kernel: Run /init as init process
Sep 12 22:10:38.753380 kernel: with arguments:
Sep 12 22:10:38.753386 kernel: /init
Sep 12 22:10:38.753393 kernel: with environment:
Sep 12 22:10:38.753405 kernel: HOME=/
Sep 12 22:10:38.753412 kernel: TERM=linux
Sep 12 22:10:38.753419 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 12 22:10:38.753427 systemd[1]: Successfully made /usr/ read-only.
Sep 12 22:10:38.753438 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 22:10:38.753446 systemd[1]: Detected virtualization kvm. Sep 12 22:10:38.753454 systemd[1]: Detected architecture arm64. Sep 12 22:10:38.753461 systemd[1]: Running in initrd. Sep 12 22:10:38.753468 systemd[1]: No hostname configured, using default hostname. Sep 12 22:10:38.753476 systemd[1]: Hostname set to . Sep 12 22:10:38.753483 systemd[1]: Initializing machine ID from VM UUID. Sep 12 22:10:38.753492 systemd[1]: Queued start job for default target initrd.target. Sep 12 22:10:38.753499 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:10:38.753507 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:10:38.753515 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 22:10:38.753523 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 22:10:38.753531 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 22:10:38.753539 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 22:10:38.753549 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 22:10:38.753556 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 22:10:38.753564 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:10:38.753571 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:10:38.753579 systemd[1]: Reached target paths.target - Path Units. Sep 12 22:10:38.753586 systemd[1]: Reached target slices.target - Slice Units. Sep 12 22:10:38.753593 systemd[1]: Reached target swap.target - Swaps. Sep 12 22:10:38.753601 systemd[1]: Reached target timers.target - Timer Units. Sep 12 22:10:38.753610 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 22:10:38.753617 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 22:10:38.753625 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 22:10:38.753632 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 12 22:10:38.753640 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 22:10:38.753647 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 22:10:38.753655 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:10:38.753662 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 22:10:38.753670 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 22:10:38.753678 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 22:10:38.753686 systemd[1]: Finished network-cleanup.service - Network Cleanup. 
Sep 12 22:10:38.753694 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 22:10:38.753708 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 22:10:38.753715 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 22:10:38.753723 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 22:10:38.753730 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:10:38.753738 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 22:10:38.753747 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:10:38.753755 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 22:10:38.753763 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 22:10:38.753787 systemd-journald[245]: Collecting audit messages is disabled. Sep 12 22:10:38.753808 systemd-journald[245]: Journal started Sep 12 22:10:38.753826 systemd-journald[245]: Runtime Journal (/run/log/journal/b23b047d629d48999a6b6d45175643bc) is 6M, max 48.5M, 42.4M free. Sep 12 22:10:38.759025 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 12 22:10:38.759060 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:10:38.745257 systemd-modules-load[246]: Inserted module 'overlay' Sep 12 22:10:38.763225 systemd-modules-load[246]: Inserted module 'br_netfilter' Sep 12 22:10:38.764660 kernel: Bridge firewalling registered Sep 12 22:10:38.764678 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 22:10:38.765802 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 22:10:38.769795 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 22:10:38.772431 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 22:10:38.784476 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 22:10:38.785634 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 22:10:38.788861 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 22:10:38.796357 systemd-tmpfiles[268]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 22:10:38.796584 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:10:38.800116 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:10:38.805013 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 22:10:38.805932 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 22:10:38.812029 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:10:38.815037 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Sep 12 22:10:38.836637 dracut-cmdline[292]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=319fa5fb212e5dd8bf766d2f9f0bbb61d6aa6c81f2813f4b5b49defba0af2b2f Sep 12 22:10:38.839240 systemd-resolved[287]: Positive Trust Anchors: Sep 12 22:10:38.839251 systemd-resolved[287]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 22:10:38.839281 systemd-resolved[287]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 22:10:38.844296 systemd-resolved[287]: Defaulting to hostname 'linux'. Sep 12 22:10:38.845486 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 22:10:38.852226 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:10:38.917946 kernel: SCSI subsystem initialized Sep 12 22:10:38.922941 kernel: Loading iSCSI transport class v2.0-870. Sep 12 22:10:38.929940 kernel: iscsi: registered transport (tcp) Sep 12 22:10:38.942950 kernel: iscsi: registered transport (qla4xxx) Sep 12 22:10:38.943009 kernel: QLogic iSCSI HBA Driver Sep 12 22:10:38.961077 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 22:10:38.979483 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:10:38.981059 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 22:10:39.032183 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 22:10:39.034415 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 12 22:10:39.098951 kernel: raid6: neonx8 gen() 15725 MB/s Sep 12 22:10:39.115940 kernel: raid6: neonx4 gen() 15777 MB/s Sep 12 22:10:39.132936 kernel: raid6: neonx2 gen() 13243 MB/s Sep 12 22:10:39.149937 kernel: raid6: neonx1 gen() 10429 MB/s Sep 12 22:10:39.166934 kernel: raid6: int64x8 gen() 6896 MB/s Sep 12 22:10:39.183973 kernel: raid6: int64x4 gen() 7146 MB/s Sep 12 22:10:39.200941 kernel: raid6: int64x2 gen() 6092 MB/s Sep 12 22:10:39.217953 kernel: raid6: int64x1 gen() 5047 MB/s Sep 12 22:10:39.218000 kernel: raid6: using algorithm neonx4 gen() 15777 MB/s Sep 12 22:10:39.234969 kernel: raid6: .... xor() 12315 MB/s, rmw enabled Sep 12 22:10:39.235020 kernel: raid6: using neon recovery algorithm Sep 12 22:10:39.240292 kernel: xor: measuring software checksum speed Sep 12 22:10:39.240326 kernel: 8regs : 20934 MB/sec Sep 12 22:10:39.240933 kernel: 32regs : 21676 MB/sec Sep 12 22:10:39.241935 kernel: arm64_neon : 25098 MB/sec Sep 12 22:10:39.241948 kernel: xor: using function: arm64_neon (25098 MB/sec) Sep 12 22:10:39.294937 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 22:10:39.301297 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. 
Sep 12 22:10:39.303652 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:10:39.329847 systemd-udevd[500]: Using default interface naming scheme 'v255'. Sep 12 22:10:39.333954 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:10:39.335626 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 22:10:39.359997 dracut-pre-trigger[507]: rd.md=0: removing MD RAID activation Sep 12 22:10:39.385372 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 22:10:39.387674 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 22:10:39.440167 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:10:39.445482 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 22:10:39.493973 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues Sep 12 22:10:39.502979 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 12 22:10:39.506571 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:10:39.506693 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:10:39.523681 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 12 22:10:39.523715 kernel: GPT:9289727 != 19775487 Sep 12 22:10:39.523726 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 12 22:10:39.523737 kernel: GPT:9289727 != 19775487 Sep 12 22:10:39.523748 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 12 22:10:39.523757 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 22:10:39.518044 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:10:39.523578 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:10:39.558362 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 12 22:10:39.559617 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 22:10:39.562957 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:10:39.569537 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 12 22:10:39.570621 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 12 22:10:39.579750 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 12 22:10:39.586971 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 12 22:10:39.587877 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 22:10:39.589748 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:10:39.591490 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 22:10:39.593653 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 22:10:39.595252 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 22:10:39.613946 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 22:10:39.614301 disk-uuid[594]: Primary Header is updated. Sep 12 22:10:39.614301 disk-uuid[594]: Secondary Entries is updated. 
Sep 12 22:10:39.614301 disk-uuid[594]: Secondary Header is updated. Sep 12 22:10:39.617468 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 22:10:40.629607 disk-uuid[600]: The operation has completed successfully. Sep 12 22:10:40.630671 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 22:10:40.653859 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 22:10:40.653995 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 22:10:40.681765 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 22:10:40.707130 sh[614]: Success Sep 12 22:10:40.720334 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 22:10:40.720379 kernel: device-mapper: uevent: version 1.0.3 Sep 12 22:10:40.720391 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 22:10:40.728941 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Sep 12 22:10:40.753638 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 22:10:40.756235 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 22:10:40.772149 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 22:10:40.777046 kernel: BTRFS: device fsid 254e43f1-b609-42b8-bcc5-437252095415 devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (626) Sep 12 22:10:40.777075 kernel: BTRFS info (device dm-0): first mount of filesystem 254e43f1-b609-42b8-bcc5-437252095415 Sep 12 22:10:40.777093 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 12 22:10:40.781935 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 22:10:40.781978 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 22:10:40.782509 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 22:10:40.783550 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 22:10:40.784616 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 22:10:40.785339 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 12 22:10:40.788007 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 22:10:40.809221 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (659) Sep 12 22:10:40.809267 kernel: BTRFS info (device vda6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:10:40.809278 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 22:10:40.812936 kernel: BTRFS info (device vda6): turning on async discard Sep 12 22:10:40.812974 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 22:10:40.817945 kernel: BTRFS info (device vda6): last unmount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:10:40.819969 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 22:10:40.821577 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 22:10:40.887801 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 22:10:40.890707 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
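Earlier in this stretch of the log the kernel warned that the backup GPT header is not at the end of the disk ("GPT: Use GNU Parted to correct GPT errors"), which is expected on first boot: the image's partition table was built for a smaller disk than the ~10 GB virtual disk it now sits on, and the disk-uuid run shown above rewrites the primary and secondary headers automatically. Purely as an illustrative sketch (nothing this boot needed to do by hand), the manual equivalent with common partitioning tools would be roughly:

    # Move the secondary GPT header and entries to the true end of the disk (sgdisk, from gdisk)
    sudo sgdisk --move-second-header /dev/vda
    # Or inspect with parted, which offers to fix a misplaced backup header interactively
    sudo parted /dev/vda print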
Sep 12 22:10:40.923945 ignition[708]: Ignition 2.22.0 Sep 12 22:10:40.923958 ignition[708]: Stage: fetch-offline Sep 12 22:10:40.923987 ignition[708]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:10:40.923995 ignition[708]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 22:10:40.924074 ignition[708]: parsed url from cmdline: "" Sep 12 22:10:40.924077 ignition[708]: no config URL provided Sep 12 22:10:40.924082 ignition[708]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 22:10:40.924088 ignition[708]: no config at "/usr/lib/ignition/user.ign" Sep 12 22:10:40.928614 systemd-networkd[804]: lo: Link UP Sep 12 22:10:40.924107 ignition[708]: op(1): [started] loading QEMU firmware config module Sep 12 22:10:40.928618 systemd-networkd[804]: lo: Gained carrier Sep 12 22:10:40.924112 ignition[708]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 12 22:10:40.929379 systemd-networkd[804]: Enumeration completed Sep 12 22:10:40.929529 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 22:10:40.931036 systemd[1]: Reached target network.target - Network. Sep 12 22:10:40.931909 systemd-networkd[804]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:10:40.931926 systemd-networkd[804]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 22:10:40.938614 ignition[708]: op(1): [finished] loading QEMU firmware config module Sep 12 22:10:40.932378 systemd-networkd[804]: eth0: Link UP Sep 12 22:10:40.932712 systemd-networkd[804]: eth0: Gained carrier Sep 12 22:10:40.932725 systemd-networkd[804]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:10:40.955964 systemd-networkd[804]: eth0: DHCPv4 address 10.0.0.61/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 12 22:10:40.987519 ignition[708]: parsing config with SHA512: 5bd0d0011565ff6160d2e802bc31883cdad8c1cf8b013b852d40425e0845210b76119b442245bfdd8324982e91fc286d490b012f714dc90421975ad85ce978f3 Sep 12 22:10:40.993309 unknown[708]: fetched base config from "system" Sep 12 22:10:40.993749 ignition[708]: fetch-offline: fetch-offline passed Sep 12 22:10:40.993320 unknown[708]: fetched user config from "qemu" Sep 12 22:10:40.993814 ignition[708]: Ignition finished successfully Sep 12 22:10:40.996319 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 22:10:40.998664 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 12 22:10:40.999535 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 22:10:41.033671 ignition[813]: Ignition 2.22.0 Sep 12 22:10:41.033708 ignition[813]: Stage: kargs Sep 12 22:10:41.033842 ignition[813]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:10:41.033851 ignition[813]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 22:10:41.034628 ignition[813]: kargs: kargs passed Sep 12 22:10:41.034674 ignition[813]: Ignition finished successfully Sep 12 22:10:41.037873 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 22:10:41.039868 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
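At this point systemd-networkd brings eth0 up by matching the stock catch-all unit /usr/lib/systemd/network/zz-default.network and then acquires 10.0.0.61/16 from the DHCP server at 10.0.0.1. As a rough, non-authoritative sketch of what such a catch-all unit contains (the verbatim Flatcar file may differ) and how the lease can be inspected on a running system:

    # Show link state, the matching .network file, and the DHCP lease
    networkctl status eth0
    cat /usr/lib/systemd/network/zz-default.network
    # A catch-all DHCP unit of this kind is approximately:
    #   [Match]
    #   Name=*
    #   [Network]
    #   DHCP=yes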
Sep 12 22:10:41.071702 ignition[821]: Ignition 2.22.0 Sep 12 22:10:41.071717 ignition[821]: Stage: disks Sep 12 22:10:41.071854 ignition[821]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:10:41.071863 ignition[821]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 22:10:41.072673 ignition[821]: disks: disks passed Sep 12 22:10:41.074558 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 22:10:41.072733 ignition[821]: Ignition finished successfully Sep 12 22:10:41.075714 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 22:10:41.076858 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 22:10:41.078663 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 22:10:41.080076 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 22:10:41.081826 systemd[1]: Reached target basic.target - Basic System. Sep 12 22:10:41.084628 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 22:10:41.111318 systemd-fsck[831]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 12 22:10:41.115579 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 22:10:41.117683 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 22:10:41.178943 kernel: EXT4-fs (vda9): mounted filesystem a7b592ec-3c41-4dc2-88a7-056c1f18b418 r/w with ordered data mode. Quota mode: none. Sep 12 22:10:41.179174 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 22:10:41.180423 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 22:10:41.182805 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 22:10:41.184614 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 22:10:41.185581 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 12 22:10:41.185623 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 22:10:41.185647 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 22:10:41.199859 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 22:10:41.202581 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 22:10:41.207349 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (839) Sep 12 22:10:41.207373 kernel: BTRFS info (device vda6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:10:41.207383 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 22:10:41.209171 kernel: BTRFS info (device vda6): turning on async discard Sep 12 22:10:41.209204 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 22:10:41.211108 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 22:10:41.241320 initrd-setup-root[863]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 22:10:41.245957 initrd-setup-root[870]: cut: /sysroot/etc/group: No such file or directory Sep 12 22:10:41.250195 initrd-setup-root[877]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 22:10:41.253962 initrd-setup-root[884]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 22:10:41.331291 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. 
Sep 12 22:10:41.333719 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 12 22:10:41.335579 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 12 22:10:41.353001 kernel: BTRFS info (device vda6): last unmount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0
Sep 12 22:10:41.377294 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 12 22:10:41.393626 ignition[952]: INFO : Ignition 2.22.0
Sep 12 22:10:41.393626 ignition[952]: INFO : Stage: mount
Sep 12 22:10:41.395163 ignition[952]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 22:10:41.395163 ignition[952]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 12 22:10:41.395163 ignition[952]: INFO : mount: mount passed
Sep 12 22:10:41.395163 ignition[952]: INFO : Ignition finished successfully
Sep 12 22:10:41.396476 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 12 22:10:41.400395 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 12 22:10:41.776115 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 12 22:10:41.777881 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 22:10:41.796597 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (965)
Sep 12 22:10:41.796645 kernel: BTRFS info (device vda6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0
Sep 12 22:10:41.796656 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 22:10:41.799946 kernel: BTRFS info (device vda6): turning on async discard
Sep 12 22:10:41.799980 kernel: BTRFS info (device vda6): enabling free space tree
Sep 12 22:10:41.801346 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 22:10:41.833427 ignition[982]: INFO : Ignition 2.22.0
Sep 12 22:10:41.833427 ignition[982]: INFO : Stage: files
Sep 12 22:10:41.835145 ignition[982]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 22:10:41.835145 ignition[982]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 12 22:10:41.835145 ignition[982]: DEBUG : files: compiled without relabeling support, skipping
Sep 12 22:10:41.838481 ignition[982]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 12 22:10:41.838481 ignition[982]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 12 22:10:41.838481 ignition[982]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 12 22:10:41.838481 ignition[982]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 12 22:10:41.838481 ignition[982]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 12 22:10:41.837613 unknown[982]: wrote ssh authorized keys file for user: core
Sep 12 22:10:41.846125 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 12 22:10:41.846125 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1
Sep 12 22:10:41.953212 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 12 22:10:42.078101 systemd-networkd[804]: eth0: Gained IPv6LL
Sep 12 22:10:42.233657 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 12 22:10:42.235478 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 12 22:10:42.235478 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 12 22:10:42.235478 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 22:10:42.235478 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 22:10:42.235478 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 22:10:42.235478 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 22:10:42.235478 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 22:10:42.235478 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 22:10:42.248621 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 22:10:42.248621 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 22:10:42.248621 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 12 22:10:42.248621 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 12 22:10:42.248621 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 12 22:10:42.248621 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1
Sep 12 22:10:42.589669 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 12 22:10:43.022486 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 12 22:10:43.022486 ignition[982]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 12 22:10:43.025975 ignition[982]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 22:10:43.027647 ignition[982]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 22:10:43.027647 ignition[982]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 12 22:10:43.027647 ignition[982]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 12 22:10:43.027647 ignition[982]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 12 22:10:43.027647 ignition[982]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 12 22:10:43.027647 ignition[982]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 12 22:10:43.027647 ignition[982]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Sep 12 22:10:43.039397 ignition[982]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 12 22:10:43.042351 ignition[982]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 12 22:10:43.043659 ignition[982]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 12 22:10:43.043659 ignition[982]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Sep 12 22:10:43.043659 ignition[982]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Sep 12 22:10:43.043659 ignition[982]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 22:10:43.043659 ignition[982]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 22:10:43.043659 ignition[982]: INFO : files: files passed
Sep 12 22:10:43.043659 ignition[982]: INFO : Ignition finished successfully
Sep 12 22:10:43.045323 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 12 22:10:43.049813 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 12 22:10:43.052443 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 12 22:10:43.071782 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 12 22:10:43.071875 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 12 22:10:43.076066 initrd-setup-root-after-ignition[1011]: grep: /sysroot/oem/oem-release: No such file or directory
Sep 12 22:10:43.077551 initrd-setup-root-after-ignition[1014]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 22:10:43.079016 initrd-setup-root-after-ignition[1014]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 22:10:43.080483 initrd-setup-root-after-ignition[1018]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 22:10:43.082611 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 22:10:43.084067 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 12 22:10:43.088061 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 12 22:10:43.121834 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 12 22:10:43.121985 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 12 22:10:43.123833 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 12 22:10:43.125469 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 12 22:10:43.126308 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 12 22:10:43.127049 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 12 22:10:43.149057 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 22:10:43.151131 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 22:10:43.173764 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:10:43.174974 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:10:43.176636 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 22:10:43.178012 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 22:10:43.178134 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 22:10:43.180171 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 22:10:43.181717 systemd[1]: Stopped target basic.target - Basic System. Sep 12 22:10:43.182979 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 22:10:43.184379 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 22:10:43.185831 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 22:10:43.187412 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 22:10:43.188882 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 22:10:43.190405 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 22:10:43.191932 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 22:10:43.193524 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 22:10:43.194946 systemd[1]: Stopped target swap.target - Swaps. Sep 12 22:10:43.196183 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 22:10:43.196393 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 22:10:43.199323 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:10:43.200742 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:10:43.203096 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 22:10:43.204661 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:10:43.205733 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 22:10:43.205849 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 22:10:43.208206 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 22:10:43.208372 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 22:10:43.210070 systemd[1]: Stopped target paths.target - Path Units. Sep 12 22:10:43.211605 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 22:10:43.211755 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:10:43.213583 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 22:10:43.214979 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 22:10:43.216601 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 22:10:43.216699 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 22:10:43.218625 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 22:10:43.218714 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 22:10:43.220186 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. 
Sep 12 22:10:43.220303 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 22:10:43.221912 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 22:10:43.222029 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 22:10:43.224335 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 22:10:43.226322 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 22:10:43.228134 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 22:10:43.228275 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:10:43.230188 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 22:10:43.230294 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 22:10:43.235121 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 22:10:43.239056 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 22:10:43.247578 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 22:10:43.255356 ignition[1038]: INFO : Ignition 2.22.0 Sep 12 22:10:43.255356 ignition[1038]: INFO : Stage: umount Sep 12 22:10:43.257854 ignition[1038]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:10:43.257854 ignition[1038]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 22:10:43.257854 ignition[1038]: INFO : umount: umount passed Sep 12 22:10:43.257854 ignition[1038]: INFO : Ignition finished successfully Sep 12 22:10:43.259026 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 22:10:43.259126 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 22:10:43.260960 systemd[1]: Stopped target network.target - Network. Sep 12 22:10:43.262267 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 22:10:43.262331 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 22:10:43.263815 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 22:10:43.263862 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 22:10:43.265416 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 22:10:43.265464 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 22:10:43.266884 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 22:10:43.266958 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 22:10:43.268700 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 22:10:43.270174 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 22:10:43.278441 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 22:10:43.278542 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 22:10:43.282354 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 22:10:43.282576 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 22:10:43.282654 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 22:10:43.285837 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 22:10:43.286698 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 22:10:43.289068 systemd[1]: systemd-networkd.socket: Deactivated successfully. 
Sep 12 22:10:43.289107 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 22:10:43.292039 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 22:10:43.293563 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 22:10:43.293630 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 22:10:43.295568 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 22:10:43.295616 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:10:43.298492 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 22:10:43.298534 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 22:10:43.300437 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 22:10:43.300484 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:10:43.303647 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:10:43.307862 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 22:10:43.307937 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 22:10:43.308232 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 22:10:43.308316 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 22:10:43.310890 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 22:10:43.310983 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 22:10:43.317115 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 22:10:43.318057 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:10:43.319709 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 22:10:43.319748 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 22:10:43.321761 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 22:10:43.321800 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:10:43.323283 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 22:10:43.323324 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 22:10:43.325431 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 22:10:43.325477 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 22:10:43.327518 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 22:10:43.327563 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 22:10:43.330372 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 22:10:43.331755 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 22:10:43.331811 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:10:43.334420 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 22:10:43.334465 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:10:43.336970 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. 
Sep 12 22:10:43.337013 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 22:10:43.339512 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 22:10:43.339555 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:10:43.341212 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:10:43.341251 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:10:43.344627 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 12 22:10:43.344680 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 12 22:10:43.344726 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 12 22:10:43.344757 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 22:10:43.345005 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 22:10:43.346956 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 22:10:43.350010 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 22:10:43.350113 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 22:10:43.352023 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 22:10:43.353858 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 22:10:43.370369 systemd[1]: Switching root. Sep 12 22:10:43.405141 systemd-journald[245]: Journal stopped Sep 12 22:10:44.142438 systemd-journald[245]: Received SIGTERM from PID 1 (systemd). Sep 12 22:10:44.142486 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 22:10:44.142498 kernel: SELinux: policy capability open_perms=1 Sep 12 22:10:44.142511 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 22:10:44.142525 kernel: SELinux: policy capability always_check_network=0 Sep 12 22:10:44.142537 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 22:10:44.142550 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 22:10:44.142561 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 22:10:44.142574 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 22:10:44.142583 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 22:10:44.142598 kernel: audit: type=1403 audit(1757715043.577:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 22:10:44.142608 systemd[1]: Successfully loaded SELinux policy in 54.484ms. Sep 12 22:10:44.142625 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.298ms. Sep 12 22:10:44.142636 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 22:10:44.142647 systemd[1]: Detected virtualization kvm. Sep 12 22:10:44.142657 systemd[1]: Detected architecture arm64. Sep 12 22:10:44.142667 systemd[1]: Detected first boot. Sep 12 22:10:44.142676 systemd[1]: Initializing machine ID from VM UUID. Sep 12 22:10:44.142698 zram_generator::config[1083]: No configuration found. 
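Editor's note: everything up to "Switching root" ran from the initramfs; at the pivot the journal is stopped and restarted by the new PID 1, and these early messages survive only if the runtime journal is later flushed to persistent storage. To pull the same window back out of the journal after boot, a minimal sketch using standard journalctl flags (the unit filter is just an example):

    # Re-read the current boot's journal, e.g. to inspect the Ignition and
    # initrd teardown messages shown above. Requires journalctl on the host.
    import subprocess

    out = subprocess.run(
        ["journalctl", "-b", "-o", "short-precise", "-u", "ignition-files.service"],
        capture_output=True, text=True, check=True,
    )
    print(out.stdout)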
Sep 12 22:10:44.142711 kernel: NET: Registered PF_VSOCK protocol family Sep 12 22:10:44.142720 systemd[1]: Populated /etc with preset unit settings. Sep 12 22:10:44.142735 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 22:10:44.142745 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 22:10:44.142755 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 22:10:44.142765 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 22:10:44.142775 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 22:10:44.142785 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 22:10:44.142796 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 22:10:44.142806 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 22:10:44.142817 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 22:10:44.142827 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 22:10:44.142837 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 22:10:44.142847 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 22:10:44.142857 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:10:44.142867 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:10:44.142878 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 12 22:10:44.142889 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 22:10:44.142899 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 22:10:44.142931 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 22:10:44.142943 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Sep 12 22:10:44.142953 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:10:44.142963 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:10:44.142973 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 22:10:44.142983 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 22:10:44.142995 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 22:10:44.143005 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 22:10:44.143015 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:10:44.143025 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 22:10:44.143035 systemd[1]: Reached target slices.target - Slice Units. Sep 12 22:10:44.143044 systemd[1]: Reached target swap.target - Swaps. Sep 12 22:10:44.143055 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 22:10:44.143064 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 22:10:44.143074 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 22:10:44.143086 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Sep 12 22:10:44.143096 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 22:10:44.143107 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:10:44.143116 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 22:10:44.143126 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 22:10:44.143136 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 22:10:44.143146 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 22:10:44.143156 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 22:10:44.143166 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 22:10:44.143177 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 22:10:44.143188 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 22:10:44.143198 systemd[1]: Reached target machines.target - Containers. Sep 12 22:10:44.143208 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 12 22:10:44.143219 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 22:10:44.143229 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 22:10:44.143239 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 22:10:44.143249 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 22:10:44.143260 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 22:10:44.143270 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 22:10:44.143281 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 22:10:44.143290 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 22:10:44.143301 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 22:10:44.143312 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 22:10:44.143322 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 22:10:44.143332 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 22:10:44.143342 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 22:10:44.143354 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:10:44.143365 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 22:10:44.143375 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 22:10:44.143385 kernel: fuse: init (API version 7.41) Sep 12 22:10:44.143395 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 22:10:44.143405 kernel: loop: module loaded Sep 12 22:10:44.143414 kernel: ACPI: bus type drm_connector registered Sep 12 22:10:44.143426 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... 
Sep 12 22:10:44.143436 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 12 22:10:44.143447 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 22:10:44.143458 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 22:10:44.143468 systemd[1]: Stopped verity-setup.service. Sep 12 22:10:44.143478 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 22:10:44.143488 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 22:10:44.143500 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 22:10:44.143511 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 22:10:44.143522 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 22:10:44.143532 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 22:10:44.143542 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 22:10:44.143574 systemd-journald[1155]: Collecting audit messages is disabled. Sep 12 22:10:44.143600 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:10:44.143611 systemd-journald[1155]: Journal started Sep 12 22:10:44.143632 systemd-journald[1155]: Runtime Journal (/run/log/journal/b23b047d629d48999a6b6d45175643bc) is 6M, max 48.5M, 42.4M free. Sep 12 22:10:43.932183 systemd[1]: Queued start job for default target multi-user.target. Sep 12 22:10:43.954876 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 12 22:10:43.955242 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 22:10:44.146498 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 22:10:44.147303 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 22:10:44.147561 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 22:10:44.148869 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 22:10:44.149193 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 22:10:44.150387 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 22:10:44.150618 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 22:10:44.151903 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 22:10:44.152072 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 22:10:44.153336 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 22:10:44.153571 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 12 22:10:44.154809 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 22:10:44.155204 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 22:10:44.156455 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 22:10:44.157717 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:10:44.159320 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 22:10:44.160739 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 12 22:10:44.172279 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 22:10:44.174330 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... 
Sep 12 22:10:44.176137 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 12 22:10:44.177004 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 22:10:44.177034 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 22:10:44.178661 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 12 22:10:44.186713 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 22:10:44.187721 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 22:10:44.188772 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 22:10:44.190687 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 12 22:10:44.191852 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 22:10:44.194565 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 22:10:44.195891 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 22:10:44.199054 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 22:10:44.199511 systemd-journald[1155]: Time spent on flushing to /var/log/journal/b23b047d629d48999a6b6d45175643bc is 12.653ms for 886 entries. Sep 12 22:10:44.199511 systemd-journald[1155]: System Journal (/var/log/journal/b23b047d629d48999a6b6d45175643bc) is 8M, max 195.6M, 187.6M free. Sep 12 22:10:44.220549 systemd-journald[1155]: Received client request to flush runtime journal. Sep 12 22:10:44.202942 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 22:10:44.204957 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 22:10:44.207630 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:10:44.210248 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 22:10:44.211370 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 12 22:10:44.216003 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 22:10:44.219253 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 22:10:44.223142 kernel: loop0: detected capacity change from 0 to 100632 Sep 12 22:10:44.224086 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 22:10:44.225571 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 22:10:44.237012 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 22:10:44.247570 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:10:44.248094 systemd-tmpfiles[1200]: ACLs are not supported, ignoring. Sep 12 22:10:44.248114 systemd-tmpfiles[1200]: ACLs are not supported, ignoring. Sep 12 22:10:44.252231 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
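Editor's note: journald reports spending 12.653 ms flushing 886 entries to the persistent journal, which works out to roughly 14 microseconds per entry. The arithmetic, spelled out:

    # Per-entry cost of the journal flush reported above.
    flush_ms, entries = 12.653, 886
    print(flush_ms / entries * 1000)   # ~14.3 microseconds per entry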
Sep 12 22:10:44.256983 kernel: loop1: detected capacity change from 0 to 119368 Sep 12 22:10:44.257126 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 22:10:44.262660 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 12 22:10:44.279142 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 22:10:44.281948 kernel: loop2: detected capacity change from 0 to 207008 Sep 12 22:10:44.282754 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 22:10:44.300317 systemd-tmpfiles[1223]: ACLs are not supported, ignoring. Sep 12 22:10:44.300339 systemd-tmpfiles[1223]: ACLs are not supported, ignoring. Sep 12 22:10:44.303590 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:10:44.313006 kernel: loop3: detected capacity change from 0 to 100632 Sep 12 22:10:44.319959 kernel: loop4: detected capacity change from 0 to 119368 Sep 12 22:10:44.326949 kernel: loop5: detected capacity change from 0 to 207008 Sep 12 22:10:44.329581 (sd-merge)[1228]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Sep 12 22:10:44.330011 (sd-merge)[1228]: Merged extensions into '/usr'. Sep 12 22:10:44.333654 systemd[1]: Reload requested from client PID 1199 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 22:10:44.333770 systemd[1]: Reloading... Sep 12 22:10:44.393943 zram_generator::config[1260]: No configuration found. Sep 12 22:10:44.466973 ldconfig[1194]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 22:10:44.531531 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 22:10:44.531843 systemd[1]: Reloading finished in 197 ms. Sep 12 22:10:44.550479 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 22:10:44.551673 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 22:10:44.563097 systemd[1]: Starting ensure-sysext.service... Sep 12 22:10:44.564675 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 22:10:44.573786 systemd[1]: Reload requested from client PID 1288 ('systemctl') (unit ensure-sysext.service)... Sep 12 22:10:44.573804 systemd[1]: Reloading... Sep 12 22:10:44.577561 systemd-tmpfiles[1289]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 22:10:44.577599 systemd-tmpfiles[1289]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 22:10:44.577837 systemd-tmpfiles[1289]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 22:10:44.578042 systemd-tmpfiles[1289]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 22:10:44.578642 systemd-tmpfiles[1289]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 22:10:44.578856 systemd-tmpfiles[1289]: ACLs are not supported, ignoring. Sep 12 22:10:44.578904 systemd-tmpfiles[1289]: ACLs are not supported, ignoring. Sep 12 22:10:44.581712 systemd-tmpfiles[1289]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 22:10:44.581724 systemd-tmpfiles[1289]: Skipping /boot Sep 12 22:10:44.587597 systemd-tmpfiles[1289]: Detected autofs mount point /boot during canonicalization of boot. 
Sep 12 22:10:44.587614 systemd-tmpfiles[1289]: Skipping /boot Sep 12 22:10:44.617951 zram_generator::config[1320]: No configuration found. Sep 12 22:10:44.744495 systemd[1]: Reloading finished in 170 ms. Sep 12 22:10:44.769394 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 22:10:44.775956 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:10:44.786025 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 22:10:44.788401 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 22:10:44.790378 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 22:10:44.794072 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 22:10:44.798145 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:10:44.800092 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 22:10:44.805585 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 22:10:44.814763 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 22:10:44.818253 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 22:10:44.821162 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 22:10:44.822473 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 22:10:44.822588 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:10:44.824891 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 22:10:44.830509 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 22:10:44.831119 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 22:10:44.834990 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 22:10:44.835166 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 22:10:44.836594 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 22:10:44.836757 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 22:10:44.838375 augenrules[1381]: No rules Sep 12 22:10:44.839829 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 22:10:44.840188 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 22:10:44.842891 systemd-udevd[1362]: Using default interface naming scheme 'v255'. Sep 12 22:10:44.843807 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 22:10:44.845167 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 22:10:44.847150 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 22:10:44.850173 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 22:10:44.850997 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
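Editor's note: a few entries back, (sd-merge) reported merging the containerd-flatcar, docker-flatcar and kubernetes extension images into /usr before systemd reloaded; the /etc/extensions/kubernetes.raw link written by Ignition earlier is exactly what feeds that merge. A purely illustrative sketch for listing what systemd-sysext would pick up on a machine laid out like this one:

    # List extension images visible under /etc/extensions on this host.
    # Paths are the ones seen in the log above; illustrative only.
    from pathlib import Path

    for entry in sorted(Path("/etc/extensions").glob("*.raw")):
        target = entry.resolve() if entry.is_symlink() else entry
        print(f"{entry} -> {target}")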
Sep 12 22:10:44.851106 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:10:44.861185 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 22:10:44.863771 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 22:10:44.865897 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:10:44.867775 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 22:10:44.870941 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 22:10:44.872337 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 22:10:44.872840 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 22:10:44.874628 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 22:10:44.874806 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 22:10:44.876739 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 22:10:44.876892 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 22:10:44.878174 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 22:10:44.896198 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 22:10:44.897499 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 22:10:44.898539 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 22:10:44.900727 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 22:10:44.917788 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 22:10:44.922592 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 22:10:44.923507 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 22:10:44.923625 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 22:10:44.927126 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 22:10:44.927900 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 22:10:44.929155 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 22:10:44.930366 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 22:10:44.930523 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 22:10:44.931824 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 22:10:44.931984 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 22:10:44.938664 systemd[1]: Finished ensure-sysext.service. Sep 12 22:10:44.943402 augenrules[1425]: /sbin/augenrules: No change Sep 12 22:10:44.944754 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Sep 12 22:10:44.944927 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 22:10:44.949529 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Sep 12 22:10:44.951408 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 22:10:44.952065 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 22:10:44.955785 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 22:10:44.955850 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 22:10:44.959449 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 12 22:10:44.962895 augenrules[1464]: No rules Sep 12 22:10:44.968344 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 22:10:44.971217 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 22:10:44.989822 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 12 22:10:45.004056 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 22:10:45.025960 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 22:10:45.059539 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 12 22:10:45.061041 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 22:10:45.063026 systemd-resolved[1355]: Positive Trust Anchors: Sep 12 22:10:45.063046 systemd-resolved[1355]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 22:10:45.063078 systemd-resolved[1355]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 22:10:45.066194 systemd-networkd[1442]: lo: Link UP Sep 12 22:10:45.066201 systemd-networkd[1442]: lo: Gained carrier Sep 12 22:10:45.066980 systemd-networkd[1442]: Enumeration completed Sep 12 22:10:45.067069 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 22:10:45.067371 systemd-networkd[1442]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:10:45.067375 systemd-networkd[1442]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 22:10:45.068509 systemd-networkd[1442]: eth0: Link UP Sep 12 22:10:45.068617 systemd-networkd[1442]: eth0: Gained carrier Sep 12 22:10:45.068634 systemd-networkd[1442]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:10:45.070049 systemd-resolved[1355]: Defaulting to hostname 'linux'. Sep 12 22:10:45.071051 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... 
Sep 12 22:10:45.073050 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 22:10:45.075045 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 22:10:45.076153 systemd[1]: Reached target network.target - Network. Sep 12 22:10:45.076829 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:10:45.077773 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 22:10:45.078902 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 22:10:45.079812 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 22:10:45.081152 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 22:10:45.083115 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 22:10:45.084257 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 22:10:45.085526 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 22:10:45.085556 systemd[1]: Reached target paths.target - Path Units. Sep 12 22:10:45.086494 systemd[1]: Reached target timers.target - Timer Units. Sep 12 22:10:45.088100 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 22:10:45.089978 systemd-networkd[1442]: eth0: DHCPv4 address 10.0.0.61/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 12 22:10:45.091117 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 22:10:45.093520 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 12 22:10:45.094722 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 12 22:10:45.095763 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 12 22:10:45.098410 systemd-timesyncd[1465]: Network configuration changed, trying to establish connection. Sep 12 22:10:45.099249 systemd-timesyncd[1465]: Contacted time server 10.0.0.1:123 (10.0.0.1). Sep 12 22:10:45.099384 systemd-timesyncd[1465]: Initial clock synchronization to Fri 2025-09-12 22:10:45.354889 UTC. Sep 12 22:10:45.099510 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 22:10:45.101063 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 22:10:45.102981 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 22:10:45.104551 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 22:10:45.106206 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 22:10:45.107418 systemd[1]: Reached target basic.target - Basic System. Sep 12 22:10:45.108248 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 22:10:45.108278 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 22:10:45.111041 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 22:10:45.112736 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 22:10:45.115137 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... 
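Editor's note: systemd-networkd brought eth0 up with the DHCPv4 lease 10.0.0.61/16 via gateway 10.0.0.1, and timesyncd then reached an NTP server on that same address. Python's standard ipaddress module makes it easy to sanity-check what that /16 covers:

    # Work out the network implied by the DHCPv4 lease logged above.
    import ipaddress

    iface = ipaddress.ip_interface("10.0.0.61/16")
    print(iface.network)                                       # 10.0.0.0/16
    print(iface.network.num_addresses)                         # 65536
    print(ipaddress.ip_address("10.0.0.1") in iface.network)   # True: gateway is on-link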
Sep 12 22:10:45.121044 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 22:10:45.123085 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 22:10:45.123864 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 22:10:45.125101 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 22:10:45.127451 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 22:10:45.129898 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 22:10:45.132283 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 22:10:45.135470 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 22:10:45.137357 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 22:10:45.137775 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 22:10:45.143564 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 22:10:45.145755 jq[1502]: false Sep 12 22:10:45.147816 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 22:10:45.150028 extend-filesystems[1503]: Found /dev/vda6 Sep 12 22:10:45.151316 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 22:10:45.152813 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 22:10:45.153026 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 22:10:45.156311 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 22:10:45.156490 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 22:10:45.158938 jq[1513]: true Sep 12 22:10:45.162961 extend-filesystems[1503]: Found /dev/vda9 Sep 12 22:10:45.166952 extend-filesystems[1503]: Checking size of /dev/vda9 Sep 12 22:10:45.175845 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:10:45.183929 update_engine[1512]: I20250912 22:10:45.183132 1512 main.cc:92] Flatcar Update Engine starting Sep 12 22:10:45.184152 tar[1516]: linux-arm64/LICENSE Sep 12 22:10:45.184283 tar[1516]: linux-arm64/helm Sep 12 22:10:45.185151 extend-filesystems[1503]: Resized partition /dev/vda9 Sep 12 22:10:45.189212 extend-filesystems[1540]: resize2fs 1.47.3 (8-Jul-2025) Sep 12 22:10:45.192840 jq[1525]: true Sep 12 22:10:45.196416 dbus-daemon[1497]: [system] SELinux support is enabled Sep 12 22:10:45.197067 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 22:10:45.198192 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 12 22:10:45.202113 update_engine[1512]: I20250912 22:10:45.202055 1512 update_check_scheduler.cc:74] Next update check in 7m57s Sep 12 22:10:45.202695 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 22:10:45.202726 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Sep 12 22:10:45.204467 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 22:10:45.204493 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 22:10:45.207281 systemd[1]: Started update-engine.service - Update Engine. Sep 12 22:10:45.214044 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 22:10:45.221763 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 22:10:45.222036 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 22:10:45.224246 (ntainerd)[1538]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 22:10:45.224549 systemd-logind[1508]: Watching system buttons on /dev/input/event0 (Power Button) Sep 12 22:10:45.224906 systemd-logind[1508]: New seat seat0. Sep 12 22:10:45.230894 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 22:10:45.241957 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 12 22:10:45.250396 extend-filesystems[1540]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 12 22:10:45.250396 extend-filesystems[1540]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 12 22:10:45.250396 extend-filesystems[1540]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 12 22:10:45.257121 extend-filesystems[1503]: Resized filesystem in /dev/vda9 Sep 12 22:10:45.255221 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 22:10:45.257486 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 22:10:45.260961 bash[1564]: Updated "/home/core/.ssh/authorized_keys" Sep 12 22:10:45.288616 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:10:45.291671 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 22:10:45.311775 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
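Editor's note: the extend-filesystems step grew the root filesystem on /dev/vda9 from 553472 to 1864699 blocks of 4 KiB, i.e. from roughly 2.1 GiB to about 7.1 GiB. The arithmetic behind those figures:

    # Convert the block counts from the resize2fs/EXT4 messages above into GiB.
    BLOCK = 4096                     # 4 KiB blocks, as reported by the kernel
    before, after = 553_472, 1_864_699

    print(before * BLOCK / 2**30)    # ~2.11 GiB before the resize
    print(after * BLOCK / 2**30)     # ~7.11 GiB after the resize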
Sep 12 22:10:45.330155 locksmithd[1544]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 22:10:45.394819 containerd[1538]: time="2025-09-12T22:10:45Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 22:10:45.398259 containerd[1538]: time="2025-09-12T22:10:45.398216640Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 22:10:45.412043 containerd[1538]: time="2025-09-12T22:10:45.411990960Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.44µs" Sep 12 22:10:45.412110 containerd[1538]: time="2025-09-12T22:10:45.412036720Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 22:10:45.412152 containerd[1538]: time="2025-09-12T22:10:45.412110720Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 22:10:45.412351 containerd[1538]: time="2025-09-12T22:10:45.412324880Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 22:10:45.412410 containerd[1538]: time="2025-09-12T22:10:45.412351560Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 22:10:45.412443 containerd[1538]: time="2025-09-12T22:10:45.412429000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 22:10:45.412554 containerd[1538]: time="2025-09-12T22:10:45.412531200Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 22:10:45.412554 containerd[1538]: time="2025-09-12T22:10:45.412552360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 22:10:45.412976 containerd[1538]: time="2025-09-12T22:10:45.412947520Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 22:10:45.413003 containerd[1538]: time="2025-09-12T22:10:45.412975920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 22:10:45.413003 containerd[1538]: time="2025-09-12T22:10:45.412989080Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 22:10:45.413003 containerd[1538]: time="2025-09-12T22:10:45.412996960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 22:10:45.413357 containerd[1538]: time="2025-09-12T22:10:45.413289840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 22:10:45.413683 containerd[1538]: time="2025-09-12T22:10:45.413653040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 22:10:45.413719 containerd[1538]: time="2025-09-12T22:10:45.413702800Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 22:10:45.413780 containerd[1538]: time="2025-09-12T22:10:45.413762480Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 22:10:45.413831 containerd[1538]: time="2025-09-12T22:10:45.413815880Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 22:10:45.414314 containerd[1538]: time="2025-09-12T22:10:45.414284840Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 22:10:45.414482 containerd[1538]: time="2025-09-12T22:10:45.414417280Z" level=info msg="metadata content store policy set" policy=shared Sep 12 22:10:45.423496 containerd[1538]: time="2025-09-12T22:10:45.423461800Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 22:10:45.423624 containerd[1538]: time="2025-09-12T22:10:45.423602400Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 22:10:45.423788 containerd[1538]: time="2025-09-12T22:10:45.423767120Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 22:10:45.424172 containerd[1538]: time="2025-09-12T22:10:45.424145080Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 22:10:45.424204 containerd[1538]: time="2025-09-12T22:10:45.424189040Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 22:10:45.424249 containerd[1538]: time="2025-09-12T22:10:45.424231720Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 12 22:10:45.424272 containerd[1538]: time="2025-09-12T22:10:45.424254520Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 22:10:45.424300 containerd[1538]: time="2025-09-12T22:10:45.424272560Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 22:10:45.424300 containerd[1538]: time="2025-09-12T22:10:45.424290000Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 22:10:45.424333 containerd[1538]: time="2025-09-12T22:10:45.424302120Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 22:10:45.424333 containerd[1538]: time="2025-09-12T22:10:45.424316160Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 22:10:45.424364 containerd[1538]: time="2025-09-12T22:10:45.424332680Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.425901600Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426141080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426159440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 
22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426171160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426181840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426192760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426204400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426215120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426226920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426237720Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426248320Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426429520Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426443400Z" level=info msg="Start snapshots syncer" Sep 12 22:10:45.426919 containerd[1538]: time="2025-09-12T22:10:45.426469560Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 22:10:45.427147 containerd[1538]: time="2025-09-12T22:10:45.426668880Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 22:10:45.427147 containerd[1538]: time="2025-09-12T22:10:45.426723840Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.426796520Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.426944440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.426968120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.426977880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.426987600Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.426999320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.427011320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.427022240Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.427046160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 22:10:45.427244 containerd[1538]: 
time="2025-09-12T22:10:45.427057600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.427067880Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.427096800Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.427109760Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 22:10:45.427244 containerd[1538]: time="2025-09-12T22:10:45.427118000Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 22:10:45.427458 containerd[1538]: time="2025-09-12T22:10:45.427128800Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 22:10:45.427458 containerd[1538]: time="2025-09-12T22:10:45.427137560Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 22:10:45.427458 containerd[1538]: time="2025-09-12T22:10:45.427147160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 22:10:45.427458 containerd[1538]: time="2025-09-12T22:10:45.427157440Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 22:10:45.427458 containerd[1538]: time="2025-09-12T22:10:45.427233120Z" level=info msg="runtime interface created" Sep 12 22:10:45.427458 containerd[1538]: time="2025-09-12T22:10:45.427239840Z" level=info msg="created NRI interface" Sep 12 22:10:45.427458 containerd[1538]: time="2025-09-12T22:10:45.427248680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 22:10:45.427458 containerd[1538]: time="2025-09-12T22:10:45.427259480Z" level=info msg="Connect containerd service" Sep 12 22:10:45.427458 containerd[1538]: time="2025-09-12T22:10:45.427283720Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 22:10:45.428069 containerd[1538]: time="2025-09-12T22:10:45.428042440Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 22:10:45.495514 containerd[1538]: time="2025-09-12T22:10:45.495412360Z" level=info msg="Start subscribing containerd event" Sep 12 22:10:45.495652 containerd[1538]: time="2025-09-12T22:10:45.495602600Z" level=info msg="Start recovering state" Sep 12 22:10:45.495778 containerd[1538]: time="2025-09-12T22:10:45.495751640Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 22:10:45.495824 containerd[1538]: time="2025-09-12T22:10:45.495806400Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Sep 12 22:10:45.495845 containerd[1538]: time="2025-09-12T22:10:45.495821000Z" level=info msg="Start event monitor" Sep 12 22:10:45.495845 containerd[1538]: time="2025-09-12T22:10:45.495839640Z" level=info msg="Start cni network conf syncer for default" Sep 12 22:10:45.495878 containerd[1538]: time="2025-09-12T22:10:45.495847000Z" level=info msg="Start streaming server" Sep 12 22:10:45.495878 containerd[1538]: time="2025-09-12T22:10:45.495860200Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 22:10:45.495909 containerd[1538]: time="2025-09-12T22:10:45.495878960Z" level=info msg="runtime interface starting up..." Sep 12 22:10:45.495909 containerd[1538]: time="2025-09-12T22:10:45.495887040Z" level=info msg="starting plugins..." Sep 12 22:10:45.495909 containerd[1538]: time="2025-09-12T22:10:45.495901360Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 22:10:45.496924 containerd[1538]: time="2025-09-12T22:10:45.496180920Z" level=info msg="containerd successfully booted in 0.101713s" Sep 12 22:10:45.496288 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 22:10:45.503272 tar[1516]: linux-arm64/README.md Sep 12 22:10:45.521315 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 22:10:45.620226 sshd_keygen[1546]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 22:10:45.639961 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 22:10:45.642414 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 22:10:45.660367 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 22:10:45.660612 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 22:10:45.663015 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 22:10:45.695188 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 22:10:45.697584 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 22:10:45.699495 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Sep 12 22:10:45.700577 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 22:10:46.111952 systemd-networkd[1442]: eth0: Gained IPv6LL Sep 12 22:10:46.115069 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 22:10:46.116506 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 22:10:46.119190 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 12 22:10:46.122203 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:10:46.138990 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 22:10:46.163143 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 22:10:46.164651 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 12 22:10:46.164855 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 12 22:10:46.166818 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 22:10:46.692182 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:10:46.693480 systemd[1]: Reached target multi-user.target - Multi-User System. 
Sep 12 22:10:46.697276 (kubelet)[1640]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 22:10:46.698793 systemd[1]: Startup finished in 2.009s (kernel) + 4.970s (initrd) + 3.176s (userspace) = 10.156s. Sep 12 22:10:47.056347 kubelet[1640]: E0912 22:10:47.056231 1640 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 22:10:47.058902 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 22:10:47.059067 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 22:10:47.059389 systemd[1]: kubelet.service: Consumed 742ms CPU time, 256.8M memory peak. Sep 12 22:10:51.189060 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 22:10:51.191544 systemd[1]: Started sshd@0-10.0.0.61:22-10.0.0.1:54128.service - OpenSSH per-connection server daemon (10.0.0.1:54128). Sep 12 22:10:51.302975 sshd[1653]: Accepted publickey for core from 10.0.0.1 port 54128 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:10:51.304710 sshd-session[1653]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:10:51.313547 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 22:10:51.316092 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 22:10:51.324370 systemd-logind[1508]: New session 1 of user core. Sep 12 22:10:51.336767 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 22:10:51.343326 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 22:10:51.371234 (systemd)[1658]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 22:10:51.373989 systemd-logind[1508]: New session c1 of user core. Sep 12 22:10:51.487616 systemd[1658]: Queued start job for default target default.target. Sep 12 22:10:51.493905 systemd[1658]: Created slice app.slice - User Application Slice. Sep 12 22:10:51.493966 systemd[1658]: Reached target paths.target - Paths. Sep 12 22:10:51.494008 systemd[1658]: Reached target timers.target - Timers. Sep 12 22:10:51.495309 systemd[1658]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 22:10:51.506051 systemd[1658]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 22:10:51.506373 systemd[1658]: Reached target sockets.target - Sockets. Sep 12 22:10:51.506510 systemd[1658]: Reached target basic.target - Basic System. Sep 12 22:10:51.506615 systemd[1658]: Reached target default.target - Main User Target. Sep 12 22:10:51.506631 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 22:10:51.506758 systemd[1658]: Startup finished in 125ms. Sep 12 22:10:51.513595 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 22:10:51.587175 systemd[1]: Started sshd@1-10.0.0.61:22-10.0.0.1:54144.service - OpenSSH per-connection server daemon (10.0.0.1:54144). 
Sep 12 22:10:51.655299 sshd[1669]: Accepted publickey for core from 10.0.0.1 port 54144 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:10:51.657035 sshd-session[1669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:10:51.661769 systemd-logind[1508]: New session 2 of user core. Sep 12 22:10:51.682126 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 22:10:51.736571 sshd[1672]: Connection closed by 10.0.0.1 port 54144 Sep 12 22:10:51.736968 sshd-session[1669]: pam_unix(sshd:session): session closed for user core Sep 12 22:10:51.748204 systemd[1]: sshd@1-10.0.0.61:22-10.0.0.1:54144.service: Deactivated successfully. Sep 12 22:10:51.749991 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 22:10:51.752176 systemd-logind[1508]: Session 2 logged out. Waiting for processes to exit. Sep 12 22:10:51.754572 systemd[1]: Started sshd@2-10.0.0.61:22-10.0.0.1:54146.service - OpenSSH per-connection server daemon (10.0.0.1:54146). Sep 12 22:10:51.755252 systemd-logind[1508]: Removed session 2. Sep 12 22:10:51.806857 sshd[1678]: Accepted publickey for core from 10.0.0.1 port 54146 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:10:51.808133 sshd-session[1678]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:10:51.812847 systemd-logind[1508]: New session 3 of user core. Sep 12 22:10:51.822072 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 22:10:51.872398 sshd[1681]: Connection closed by 10.0.0.1 port 54146 Sep 12 22:10:51.873059 sshd-session[1678]: pam_unix(sshd:session): session closed for user core Sep 12 22:10:51.881788 systemd[1]: sshd@2-10.0.0.61:22-10.0.0.1:54146.service: Deactivated successfully. Sep 12 22:10:51.883173 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 22:10:51.884455 systemd-logind[1508]: Session 3 logged out. Waiting for processes to exit. Sep 12 22:10:51.886555 systemd[1]: Started sshd@3-10.0.0.61:22-10.0.0.1:54156.service - OpenSSH per-connection server daemon (10.0.0.1:54156). Sep 12 22:10:51.887037 systemd-logind[1508]: Removed session 3. Sep 12 22:10:51.946390 sshd[1687]: Accepted publickey for core from 10.0.0.1 port 54156 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:10:51.947691 sshd-session[1687]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:10:51.952272 systemd-logind[1508]: New session 4 of user core. Sep 12 22:10:51.960186 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 22:10:52.014045 sshd[1690]: Connection closed by 10.0.0.1 port 54156 Sep 12 22:10:52.014378 sshd-session[1687]: pam_unix(sshd:session): session closed for user core Sep 12 22:10:52.025013 systemd[1]: sshd@3-10.0.0.61:22-10.0.0.1:54156.service: Deactivated successfully. Sep 12 22:10:52.027434 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 22:10:52.028393 systemd-logind[1508]: Session 4 logged out. Waiting for processes to exit. Sep 12 22:10:52.031308 systemd[1]: Started sshd@4-10.0.0.61:22-10.0.0.1:54170.service - OpenSSH per-connection server daemon (10.0.0.1:54170). Sep 12 22:10:52.032245 systemd-logind[1508]: Removed session 4. 
Sep 12 22:10:52.091753 sshd[1696]: Accepted publickey for core from 10.0.0.1 port 54170 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:10:52.093460 sshd-session[1696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:10:52.113353 systemd-logind[1508]: New session 5 of user core. Sep 12 22:10:52.121110 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 22:10:52.178979 sudo[1700]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 22:10:52.179562 sudo[1700]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 22:10:52.196750 sudo[1700]: pam_unix(sudo:session): session closed for user root Sep 12 22:10:52.198415 sshd[1699]: Connection closed by 10.0.0.1 port 54170 Sep 12 22:10:52.198823 sshd-session[1696]: pam_unix(sshd:session): session closed for user core Sep 12 22:10:52.213184 systemd[1]: sshd@4-10.0.0.61:22-10.0.0.1:54170.service: Deactivated successfully. Sep 12 22:10:52.214707 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 22:10:52.216106 systemd-logind[1508]: Session 5 logged out. Waiting for processes to exit. Sep 12 22:10:52.218022 systemd[1]: Started sshd@5-10.0.0.61:22-10.0.0.1:54174.service - OpenSSH per-connection server daemon (10.0.0.1:54174). Sep 12 22:10:52.219157 systemd-logind[1508]: Removed session 5. Sep 12 22:10:52.278376 sshd[1706]: Accepted publickey for core from 10.0.0.1 port 54174 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:10:52.279580 sshd-session[1706]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:10:52.283419 systemd-logind[1508]: New session 6 of user core. Sep 12 22:10:52.291088 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 22:10:52.342331 sudo[1711]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 22:10:52.342604 sudo[1711]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 22:10:52.418353 sudo[1711]: pam_unix(sudo:session): session closed for user root Sep 12 22:10:52.424405 sudo[1710]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 12 22:10:52.425014 sudo[1710]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 22:10:52.434287 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 22:10:52.478262 augenrules[1733]: No rules Sep 12 22:10:52.479633 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 22:10:52.479889 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 22:10:52.480793 sudo[1710]: pam_unix(sudo:session): session closed for user root Sep 12 22:10:52.482066 sshd[1709]: Connection closed by 10.0.0.1 port 54174 Sep 12 22:10:52.482491 sshd-session[1706]: pam_unix(sshd:session): session closed for user core Sep 12 22:10:52.494788 systemd[1]: sshd@5-10.0.0.61:22-10.0.0.1:54174.service: Deactivated successfully. Sep 12 22:10:52.497096 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 22:10:52.497871 systemd-logind[1508]: Session 6 logged out. Waiting for processes to exit. Sep 12 22:10:52.501218 systemd[1]: Started sshd@6-10.0.0.61:22-10.0.0.1:54190.service - OpenSSH per-connection server daemon (10.0.0.1:54190). Sep 12 22:10:52.502252 systemd-logind[1508]: Removed session 6. 
Sep 12 22:10:52.559332 sshd[1742]: Accepted publickey for core from 10.0.0.1 port 54190 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:10:52.560870 sshd-session[1742]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:10:52.564994 systemd-logind[1508]: New session 7 of user core. Sep 12 22:10:52.577150 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 22:10:52.629413 sudo[1746]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 22:10:52.629690 sudo[1746]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 22:10:52.919595 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 22:10:52.937304 (dockerd)[1766]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 22:10:53.140714 dockerd[1766]: time="2025-09-12T22:10:53.140650465Z" level=info msg="Starting up" Sep 12 22:10:53.142022 dockerd[1766]: time="2025-09-12T22:10:53.141915831Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 12 22:10:53.152074 dockerd[1766]: time="2025-09-12T22:10:53.152041996Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 12 22:10:53.183264 dockerd[1766]: time="2025-09-12T22:10:53.183151756Z" level=info msg="Loading containers: start." Sep 12 22:10:53.190956 kernel: Initializing XFRM netlink socket Sep 12 22:10:53.399026 systemd-networkd[1442]: docker0: Link UP Sep 12 22:10:53.403069 dockerd[1766]: time="2025-09-12T22:10:53.403013546Z" level=info msg="Loading containers: done." Sep 12 22:10:53.416725 dockerd[1766]: time="2025-09-12T22:10:53.416669143Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 22:10:53.416871 dockerd[1766]: time="2025-09-12T22:10:53.416755088Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 12 22:10:53.416871 dockerd[1766]: time="2025-09-12T22:10:53.416832811Z" level=info msg="Initializing buildkit" Sep 12 22:10:53.439124 dockerd[1766]: time="2025-09-12T22:10:53.438843741Z" level=info msg="Completed buildkit initialization" Sep 12 22:10:53.443766 dockerd[1766]: time="2025-09-12T22:10:53.443640806Z" level=info msg="Daemon has completed initialization" Sep 12 22:10:53.444260 dockerd[1766]: time="2025-09-12T22:10:53.443756035Z" level=info msg="API listen on /run/docker.sock" Sep 12 22:10:53.443914 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 22:10:54.175695 containerd[1538]: time="2025-09-12T22:10:54.175653089Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\"" Sep 12 22:10:54.814610 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1612873097.mount: Deactivated successfully. 
Sep 12 22:10:55.735334 containerd[1538]: time="2025-09-12T22:10:55.735285291Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:10:55.736245 containerd[1538]: time="2025-09-12T22:10:55.736204601Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=26363687" Sep 12 22:10:55.736956 containerd[1538]: time="2025-09-12T22:10:55.736923484Z" level=info msg="ImageCreate event name:\"sha256:02ea53851f07db91ed471dab1ab11541f5c294802371cd8f0cfd423cd5c71002\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:10:55.741229 containerd[1538]: time="2025-09-12T22:10:55.741176324Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:10:55.742514 containerd[1538]: time="2025-09-12T22:10:55.742469998Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:02ea53851f07db91ed471dab1ab11541f5c294802371cd8f0cfd423cd5c71002\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"26360284\" in 1.566773043s" Sep 12 22:10:55.742514 containerd[1538]: time="2025-09-12T22:10:55.742510827Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:02ea53851f07db91ed471dab1ab11541f5c294802371cd8f0cfd423cd5c71002\"" Sep 12 22:10:55.743451 containerd[1538]: time="2025-09-12T22:10:55.743424604Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\"" Sep 12 22:10:56.805953 containerd[1538]: time="2025-09-12T22:10:56.805895012Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:10:56.806766 containerd[1538]: time="2025-09-12T22:10:56.806729612Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=22531202" Sep 12 22:10:56.807884 containerd[1538]: time="2025-09-12T22:10:56.807438483Z" level=info msg="ImageCreate event name:\"sha256:f0bcbad5082c944520b370596a2384affda710b9d7daf84e8a48352699af8e4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:10:56.810370 containerd[1538]: time="2025-09-12T22:10:56.810344273Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:10:56.811715 containerd[1538]: time="2025-09-12T22:10:56.811679285Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:f0bcbad5082c944520b370596a2384affda710b9d7daf84e8a48352699af8e4b\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"24099975\" in 1.068224482s" Sep 12 22:10:56.811715 containerd[1538]: time="2025-09-12T22:10:56.811712119Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:f0bcbad5082c944520b370596a2384affda710b9d7daf84e8a48352699af8e4b\"" Sep 12 22:10:56.812188 
containerd[1538]: time="2025-09-12T22:10:56.812171308Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\"" Sep 12 22:10:57.309492 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 22:10:57.310896 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:10:57.428639 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:10:57.432079 (kubelet)[2053]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 22:10:57.474252 kubelet[2053]: E0912 22:10:57.474204 2053 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 22:10:57.477437 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 22:10:57.477677 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 22:10:57.479038 systemd[1]: kubelet.service: Consumed 140ms CPU time, 107.4M memory peak. Sep 12 22:10:57.943960 containerd[1538]: time="2025-09-12T22:10:57.943819578Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:10:57.944854 containerd[1538]: time="2025-09-12T22:10:57.944686148Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=17484326" Sep 12 22:10:57.945686 containerd[1538]: time="2025-09-12T22:10:57.945655266Z" level=info msg="ImageCreate event name:\"sha256:1d625baf81b59592006d97a6741bc947698ed222b612ac10efa57b7aa96d2a27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:10:57.948739 containerd[1538]: time="2025-09-12T22:10:57.948710944Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:10:57.949660 containerd[1538]: time="2025-09-12T22:10:57.949619822Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:1d625baf81b59592006d97a6741bc947698ed222b612ac10efa57b7aa96d2a27\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"19053117\" in 1.137339656s" Sep 12 22:10:57.949660 containerd[1538]: time="2025-09-12T22:10:57.949653791Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference \"sha256:1d625baf81b59592006d97a6741bc947698ed222b612ac10efa57b7aa96d2a27\"" Sep 12 22:10:57.950324 containerd[1538]: time="2025-09-12T22:10:57.950300232Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\"" Sep 12 22:10:58.891339 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2291842482.mount: Deactivated successfully. 
Sep 12 22:10:59.251985 containerd[1538]: time="2025-09-12T22:10:59.251312673Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:10:59.252273 containerd[1538]: time="2025-09-12T22:10:59.252031419Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=27417819" Sep 12 22:10:59.252717 containerd[1538]: time="2025-09-12T22:10:59.252688500Z" level=info msg="ImageCreate event name:\"sha256:72b57ec14d31e8422925ef4c3eff44822cdc04a11fd30d13824f1897d83a16d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:10:59.254481 containerd[1538]: time="2025-09-12T22:10:59.254456041Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:10:59.254999 containerd[1538]: time="2025-09-12T22:10:59.254974304Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:72b57ec14d31e8422925ef4c3eff44822cdc04a11fd30d13824f1897d83a16d4\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"27416836\" in 1.304641084s" Sep 12 22:10:59.255165 containerd[1538]: time="2025-09-12T22:10:59.255074747Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:72b57ec14d31e8422925ef4c3eff44822cdc04a11fd30d13824f1897d83a16d4\"" Sep 12 22:10:59.255639 containerd[1538]: time="2025-09-12T22:10:59.255620202Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 12 22:10:59.735548 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount370400936.mount: Deactivated successfully. 
Sep 12 22:11:00.357774 containerd[1538]: time="2025-09-12T22:11:00.357716659Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:00.359128 containerd[1538]: time="2025-09-12T22:11:00.358838682Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951624" Sep 12 22:11:00.359984 containerd[1538]: time="2025-09-12T22:11:00.359959298Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:00.362996 containerd[1538]: time="2025-09-12T22:11:00.362963755Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:00.364878 containerd[1538]: time="2025-09-12T22:11:00.364847796Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.10912484s" Sep 12 22:11:00.365006 containerd[1538]: time="2025-09-12T22:11:00.364988847Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Sep 12 22:11:00.365501 containerd[1538]: time="2025-09-12T22:11:00.365478770Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 22:11:00.774812 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1939595034.mount: Deactivated successfully. 
Sep 12 22:11:00.781048 containerd[1538]: time="2025-09-12T22:11:00.781000271Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 22:11:00.781534 containerd[1538]: time="2025-09-12T22:11:00.781492405Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705" Sep 12 22:11:00.782402 containerd[1538]: time="2025-09-12T22:11:00.782379074Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 22:11:00.784486 containerd[1538]: time="2025-09-12T22:11:00.784263677Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 22:11:00.784907 containerd[1538]: time="2025-09-12T22:11:00.784876845Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 419.367244ms" Sep 12 22:11:00.784983 containerd[1538]: time="2025-09-12T22:11:00.784906470Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Sep 12 22:11:00.785505 containerd[1538]: time="2025-09-12T22:11:00.785481611Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Sep 12 22:11:01.283107 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount342735217.mount: Deactivated successfully. 
Sep 12 22:11:02.947764 containerd[1538]: time="2025-09-12T22:11:02.947716671Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:02.948862 containerd[1538]: time="2025-09-12T22:11:02.948804681Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67943167" Sep 12 22:11:02.949717 containerd[1538]: time="2025-09-12T22:11:02.949681819Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:02.953383 containerd[1538]: time="2025-09-12T22:11:02.953339691Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:02.955308 containerd[1538]: time="2025-09-12T22:11:02.955268663Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.169693121s" Sep 12 22:11:02.955308 containerd[1538]: time="2025-09-12T22:11:02.955306365Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Sep 12 22:11:07.728134 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 22:11:07.729567 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:11:07.875790 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:11:07.894252 (kubelet)[2213]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 22:11:07.943712 kubelet[2213]: E0912 22:11:07.943643 2213 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 22:11:07.946020 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 22:11:07.946174 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 22:11:07.946513 systemd[1]: kubelet.service: Consumed 135ms CPU time, 107.4M memory peak. Sep 12 22:11:09.232529 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:11:09.232676 systemd[1]: kubelet.service: Consumed 135ms CPU time, 107.4M memory peak. Sep 12 22:11:09.234619 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:11:09.257842 systemd[1]: Reload requested from client PID 2230 ('systemctl') (unit session-7.scope)... Sep 12 22:11:09.257861 systemd[1]: Reloading... Sep 12 22:11:09.322942 zram_generator::config[2273]: No configuration found. Sep 12 22:11:09.484335 systemd[1]: Reloading finished in 226 ms. Sep 12 22:11:09.537451 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 12 22:11:09.537535 systemd[1]: kubelet.service: Failed with result 'signal'. 
Sep 12 22:11:09.537826 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:11:09.537874 systemd[1]: kubelet.service: Consumed 95ms CPU time, 94.9M memory peak. Sep 12 22:11:09.539618 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:11:09.663091 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:11:09.677351 (kubelet)[2318]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 22:11:09.714417 kubelet[2318]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:11:09.714417 kubelet[2318]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 22:11:09.714417 kubelet[2318]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:11:09.714771 kubelet[2318]: I0912 22:11:09.714516 2318 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 22:11:11.240746 kubelet[2318]: I0912 22:11:11.240701 2318 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 22:11:11.241171 kubelet[2318]: I0912 22:11:11.241154 2318 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 22:11:11.241705 kubelet[2318]: I0912 22:11:11.241683 2318 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 22:11:11.268119 kubelet[2318]: E0912 22:11:11.268062 2318 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.61:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.61:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:11:11.269191 kubelet[2318]: I0912 22:11:11.269041 2318 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 22:11:11.276319 kubelet[2318]: I0912 22:11:11.276294 2318 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 22:11:11.279359 kubelet[2318]: I0912 22:11:11.279326 2318 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 22:11:11.280850 kubelet[2318]: I0912 22:11:11.280155 2318 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 22:11:11.280850 kubelet[2318]: I0912 22:11:11.280209 2318 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 22:11:11.280850 kubelet[2318]: I0912 22:11:11.280451 2318 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 22:11:11.280850 kubelet[2318]: I0912 22:11:11.280462 2318 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 22:11:11.281250 kubelet[2318]: I0912 22:11:11.280658 2318 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:11:11.283471 kubelet[2318]: I0912 22:11:11.283446 2318 kubelet.go:446] "Attempting to sync node with API server" Sep 12 22:11:11.283576 kubelet[2318]: I0912 22:11:11.283564 2318 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 22:11:11.283694 kubelet[2318]: I0912 22:11:11.283670 2318 kubelet.go:352] "Adding apiserver pod source" Sep 12 22:11:11.283735 kubelet[2318]: I0912 22:11:11.283697 2318 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 22:11:11.285196 kubelet[2318]: W0912 22:11:11.285147 2318 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.61:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.61:6443: connect: connection refused Sep 12 22:11:11.285235 kubelet[2318]: E0912 22:11:11.285209 2318 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.61:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.61:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:11:11.285294 kubelet[2318]: W0912 22:11:11.285271 2318 
reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.61:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.61:6443: connect: connection refused Sep 12 22:11:11.285328 kubelet[2318]: E0912 22:11:11.285299 2318 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.61:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.61:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:11:11.286755 kubelet[2318]: I0912 22:11:11.286714 2318 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 22:11:11.287449 kubelet[2318]: I0912 22:11:11.287423 2318 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 22:11:11.287568 kubelet[2318]: W0912 22:11:11.287555 2318 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 12 22:11:11.288484 kubelet[2318]: I0912 22:11:11.288459 2318 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 22:11:11.288522 kubelet[2318]: I0912 22:11:11.288504 2318 server.go:1287] "Started kubelet" Sep 12 22:11:11.290283 kubelet[2318]: I0912 22:11:11.290231 2318 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 22:11:11.291935 kubelet[2318]: I0912 22:11:11.291844 2318 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 22:11:11.292611 kubelet[2318]: I0912 22:11:11.292570 2318 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 22:11:11.293236 kubelet[2318]: I0912 22:11:11.293215 2318 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 22:11:11.294339 kubelet[2318]: I0912 22:11:11.293996 2318 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 22:11:11.294339 kubelet[2318]: I0912 22:11:11.294057 2318 reconciler.go:26] "Reconciler: start to sync state" Sep 12 22:11:11.294434 kubelet[2318]: E0912 22:11:11.294356 2318 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 22:11:11.295683 kubelet[2318]: W0912 22:11:11.295604 2318 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.61:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.61:6443: connect: connection refused Sep 12 22:11:11.295683 kubelet[2318]: E0912 22:11:11.295665 2318 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.61:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.61:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:11:11.296166 kubelet[2318]: E0912 22:11:11.295883 2318 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.61:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.61:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1864a88b688016d5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-12 22:11:11.288481493 +0000 UTC m=+1.607830239,LastTimestamp:2025-09-12 22:11:11.288481493 +0000 UTC m=+1.607830239,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 12 22:11:11.297127 kubelet[2318]: E0912 22:11:11.296264 2318 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.61:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.61:6443: connect: connection refused" interval="200ms" Sep 12 22:11:11.297127 kubelet[2318]: I0912 22:11:11.296122 2318 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 22:11:11.297127 kubelet[2318]: I0912 22:11:11.296586 2318 factory.go:221] Registration of the systemd container factory successfully Sep 12 22:11:11.297127 kubelet[2318]: I0912 22:11:11.296619 2318 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 22:11:11.297127 kubelet[2318]: I0912 22:11:11.296678 2318 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 22:11:11.297127 kubelet[2318]: I0912 22:11:11.297064 2318 server.go:479] "Adding debug handlers to kubelet server" Sep 12 22:11:11.298367 kubelet[2318]: I0912 22:11:11.298344 2318 factory.go:221] Registration of the containerd container factory successfully Sep 12 22:11:11.299330 kubelet[2318]: E0912 22:11:11.299296 2318 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 22:11:11.309157 kubelet[2318]: I0912 22:11:11.309051 2318 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 22:11:11.310609 kubelet[2318]: I0912 22:11:11.310578 2318 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 22:11:11.310609 kubelet[2318]: I0912 22:11:11.310610 2318 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 22:11:11.310697 kubelet[2318]: I0912 22:11:11.310634 2318 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 12 22:11:11.310697 kubelet[2318]: I0912 22:11:11.310641 2318 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 22:11:11.310697 kubelet[2318]: E0912 22:11:11.310687 2318 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 22:11:11.313194 kubelet[2318]: I0912 22:11:11.313161 2318 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 22:11:11.313194 kubelet[2318]: I0912 22:11:11.313187 2318 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 22:11:11.313311 kubelet[2318]: I0912 22:11:11.313210 2318 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:11:11.395016 kubelet[2318]: E0912 22:11:11.394912 2318 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 22:11:11.411210 kubelet[2318]: E0912 22:11:11.411176 2318 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 12 22:11:11.413491 kubelet[2318]: W0912 22:11:11.413424 2318 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.61:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.61:6443: connect: connection refused Sep 12 22:11:11.413491 kubelet[2318]: E0912 22:11:11.413485 2318 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.61:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.61:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:11:11.413581 kubelet[2318]: I0912 22:11:11.413568 2318 policy_none.go:49] "None policy: Start" Sep 12 22:11:11.413605 kubelet[2318]: I0912 22:11:11.413599 2318 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 22:11:11.413623 kubelet[2318]: I0912 22:11:11.413613 2318 state_mem.go:35] "Initializing new in-memory state store" Sep 12 22:11:11.419099 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 22:11:11.431857 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 22:11:11.435346 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 22:11:11.455852 kubelet[2318]: I0912 22:11:11.455804 2318 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 22:11:11.456078 kubelet[2318]: I0912 22:11:11.456057 2318 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 22:11:11.456115 kubelet[2318]: I0912 22:11:11.456077 2318 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 22:11:11.456376 kubelet[2318]: I0912 22:11:11.456355 2318 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 22:11:11.457172 kubelet[2318]: E0912 22:11:11.457148 2318 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 12 22:11:11.457254 kubelet[2318]: E0912 22:11:11.457204 2318 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 12 22:11:11.497185 kubelet[2318]: E0912 22:11:11.497066 2318 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.61:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.61:6443: connect: connection refused" interval="400ms" Sep 12 22:11:11.558268 kubelet[2318]: I0912 22:11:11.558171 2318 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 22:11:11.558681 kubelet[2318]: E0912 22:11:11.558645 2318 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.61:6443/api/v1/nodes\": dial tcp 10.0.0.61:6443: connect: connection refused" node="localhost" Sep 12 22:11:11.619020 systemd[1]: Created slice kubepods-burstable-pod4e2fffe60e4b8ad4081aa4b1235f3c8c.slice - libcontainer container kubepods-burstable-pod4e2fffe60e4b8ad4081aa4b1235f3c8c.slice. Sep 12 22:11:11.632877 kubelet[2318]: E0912 22:11:11.632832 2318 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:11:11.636082 systemd[1]: Created slice kubepods-burstable-pod1403266a9792debaa127cd8df7a81c3c.slice - libcontainer container kubepods-burstable-pod1403266a9792debaa127cd8df7a81c3c.slice. Sep 12 22:11:11.637975 kubelet[2318]: E0912 22:11:11.637891 2318 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:11:11.648685 systemd[1]: Created slice kubepods-burstable-pod72a30db4fc25e4da65a3b99eba43be94.slice - libcontainer container kubepods-burstable-pod72a30db4fc25e4da65a3b99eba43be94.slice. 
Sep 12 22:11:11.650539 kubelet[2318]: E0912 22:11:11.650510 2318 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:11:11.696740 kubelet[2318]: I0912 22:11:11.696584 2318 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/72a30db4fc25e4da65a3b99eba43be94-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"72a30db4fc25e4da65a3b99eba43be94\") " pod="kube-system/kube-scheduler-localhost" Sep 12 22:11:11.696740 kubelet[2318]: I0912 22:11:11.696630 2318 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:11.696740 kubelet[2318]: I0912 22:11:11.696650 2318 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:11.696740 kubelet[2318]: I0912 22:11:11.696673 2318 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4e2fffe60e4b8ad4081aa4b1235f3c8c-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"4e2fffe60e4b8ad4081aa4b1235f3c8c\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:11:11.696740 kubelet[2318]: I0912 22:11:11.696690 2318 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4e2fffe60e4b8ad4081aa4b1235f3c8c-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"4e2fffe60e4b8ad4081aa4b1235f3c8c\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:11:11.696995 kubelet[2318]: I0912 22:11:11.696950 2318 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4e2fffe60e4b8ad4081aa4b1235f3c8c-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"4e2fffe60e4b8ad4081aa4b1235f3c8c\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:11:11.696995 kubelet[2318]: I0912 22:11:11.696971 2318 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:11.696995 kubelet[2318]: I0912 22:11:11.696987 2318 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:11.697053 kubelet[2318]: I0912 22:11:11.697003 2318 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:11.760823 kubelet[2318]: I0912 22:11:11.760712 2318 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 22:11:11.761146 kubelet[2318]: E0912 22:11:11.761109 2318 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.61:6443/api/v1/nodes\": dial tcp 10.0.0.61:6443: connect: connection refused" node="localhost" Sep 12 22:11:11.897708 kubelet[2318]: E0912 22:11:11.897658 2318 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.61:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.61:6443: connect: connection refused" interval="800ms" Sep 12 22:11:11.934643 containerd[1538]: time="2025-09-12T22:11:11.934602238Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:4e2fffe60e4b8ad4081aa4b1235f3c8c,Namespace:kube-system,Attempt:0,}" Sep 12 22:11:11.939534 containerd[1538]: time="2025-09-12T22:11:11.939442884Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:1403266a9792debaa127cd8df7a81c3c,Namespace:kube-system,Attempt:0,}" Sep 12 22:11:11.952664 containerd[1538]: time="2025-09-12T22:11:11.952441216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:72a30db4fc25e4da65a3b99eba43be94,Namespace:kube-system,Attempt:0,}" Sep 12 22:11:11.981398 containerd[1538]: time="2025-09-12T22:11:11.981339887Z" level=info msg="connecting to shim 3ddefe259f3013509acb9cfb8dd0818ddaeb442cb74121ae902e4a5d2188d8d8" address="unix:///run/containerd/s/b14752036548c8c3583cc732b99f98b523d864d0bfe33926975402ef188f1df0" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:11.985445 containerd[1538]: time="2025-09-12T22:11:11.985388155Z" level=info msg="connecting to shim ab7826162a0fbf17c418a07a74397db43c996e8f84979dacf6005c4ea45084a0" address="unix:///run/containerd/s/fae323c855657f42c9a14d13cd777c3e3f9d55dca5ea6063b11e3af20860fb18" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:11.999947 containerd[1538]: time="2025-09-12T22:11:11.999089804Z" level=info msg="connecting to shim 3a7898a1348edd76d7e3b3efe19749af0bd84f4890f68200d5e62e9e6cecd765" address="unix:///run/containerd/s/98b689256971bbbaf196b119d5cf063e8221f6006b18d20ab7426d1ba4074d99" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:12.017133 systemd[1]: Started cri-containerd-3ddefe259f3013509acb9cfb8dd0818ddaeb442cb74121ae902e4a5d2188d8d8.scope - libcontainer container 3ddefe259f3013509acb9cfb8dd0818ddaeb442cb74121ae902e4a5d2188d8d8. Sep 12 22:11:12.018325 systemd[1]: Started cri-containerd-ab7826162a0fbf17c418a07a74397db43c996e8f84979dacf6005c4ea45084a0.scope - libcontainer container ab7826162a0fbf17c418a07a74397db43c996e8f84979dacf6005c4ea45084a0. Sep 12 22:11:12.031124 systemd[1]: Started cri-containerd-3a7898a1348edd76d7e3b3efe19749af0bd84f4890f68200d5e62e9e6cecd765.scope - libcontainer container 3a7898a1348edd76d7e3b3efe19749af0bd84f4890f68200d5e62e9e6cecd765. 
Sep 12 22:11:12.070323 containerd[1538]: time="2025-09-12T22:11:12.069886038Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:1403266a9792debaa127cd8df7a81c3c,Namespace:kube-system,Attempt:0,} returns sandbox id \"ab7826162a0fbf17c418a07a74397db43c996e8f84979dacf6005c4ea45084a0\"" Sep 12 22:11:12.075031 containerd[1538]: time="2025-09-12T22:11:12.074962793Z" level=info msg="CreateContainer within sandbox \"ab7826162a0fbf17c418a07a74397db43c996e8f84979dacf6005c4ea45084a0\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 22:11:12.087338 containerd[1538]: time="2025-09-12T22:11:12.087283092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:72a30db4fc25e4da65a3b99eba43be94,Namespace:kube-system,Attempt:0,} returns sandbox id \"3a7898a1348edd76d7e3b3efe19749af0bd84f4890f68200d5e62e9e6cecd765\"" Sep 12 22:11:12.090735 containerd[1538]: time="2025-09-12T22:11:12.090688110Z" level=info msg="CreateContainer within sandbox \"3a7898a1348edd76d7e3b3efe19749af0bd84f4890f68200d5e62e9e6cecd765\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 22:11:12.091614 containerd[1538]: time="2025-09-12T22:11:12.091582156Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:4e2fffe60e4b8ad4081aa4b1235f3c8c,Namespace:kube-system,Attempt:0,} returns sandbox id \"3ddefe259f3013509acb9cfb8dd0818ddaeb442cb74121ae902e4a5d2188d8d8\"" Sep 12 22:11:12.094588 containerd[1538]: time="2025-09-12T22:11:12.094541011Z" level=info msg="Container 3c8fd05f1b68257047b6c1ece71967ec226d28f31054bc127231852b6dee3654: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:12.094932 containerd[1538]: time="2025-09-12T22:11:12.094882269Z" level=info msg="CreateContainer within sandbox \"3ddefe259f3013509acb9cfb8dd0818ddaeb442cb74121ae902e4a5d2188d8d8\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 22:11:12.102074 containerd[1538]: time="2025-09-12T22:11:12.102023712Z" level=info msg="Container 84ec1d46df504d71af42e10e9d8870cc37c4efbe1a0a23e1059cf59522a204ca: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:12.105756 containerd[1538]: time="2025-09-12T22:11:12.105691550Z" level=info msg="CreateContainer within sandbox \"ab7826162a0fbf17c418a07a74397db43c996e8f84979dacf6005c4ea45084a0\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"3c8fd05f1b68257047b6c1ece71967ec226d28f31054bc127231852b6dee3654\"" Sep 12 22:11:12.106565 containerd[1538]: time="2025-09-12T22:11:12.106528580Z" level=info msg="StartContainer for \"3c8fd05f1b68257047b6c1ece71967ec226d28f31054bc127231852b6dee3654\"" Sep 12 22:11:12.107771 containerd[1538]: time="2025-09-12T22:11:12.107729531Z" level=info msg="connecting to shim 3c8fd05f1b68257047b6c1ece71967ec226d28f31054bc127231852b6dee3654" address="unix:///run/containerd/s/fae323c855657f42c9a14d13cd777c3e3f9d55dca5ea6063b11e3af20860fb18" protocol=ttrpc version=3 Sep 12 22:11:12.110472 containerd[1538]: time="2025-09-12T22:11:12.110427287Z" level=info msg="CreateContainer within sandbox \"3a7898a1348edd76d7e3b3efe19749af0bd84f4890f68200d5e62e9e6cecd765\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"84ec1d46df504d71af42e10e9d8870cc37c4efbe1a0a23e1059cf59522a204ca\"" Sep 12 22:11:12.111340 containerd[1538]: time="2025-09-12T22:11:12.111254788Z" level=info msg="Container bcddb1c50f9b871c2adf80e3dbd4babb0452968ca57e2e14eb4c429447862226: CDI 
devices from CRI Config.CDIDevices: []" Sep 12 22:11:12.111458 containerd[1538]: time="2025-09-12T22:11:12.111280733Z" level=info msg="StartContainer for \"84ec1d46df504d71af42e10e9d8870cc37c4efbe1a0a23e1059cf59522a204ca\"" Sep 12 22:11:12.112570 containerd[1538]: time="2025-09-12T22:11:12.112538261Z" level=info msg="connecting to shim 84ec1d46df504d71af42e10e9d8870cc37c4efbe1a0a23e1059cf59522a204ca" address="unix:///run/containerd/s/98b689256971bbbaf196b119d5cf063e8221f6006b18d20ab7426d1ba4074d99" protocol=ttrpc version=3 Sep 12 22:11:12.121849 containerd[1538]: time="2025-09-12T22:11:12.121780467Z" level=info msg="CreateContainer within sandbox \"3ddefe259f3013509acb9cfb8dd0818ddaeb442cb74121ae902e4a5d2188d8d8\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"bcddb1c50f9b871c2adf80e3dbd4babb0452968ca57e2e14eb4c429447862226\"" Sep 12 22:11:12.122957 containerd[1538]: time="2025-09-12T22:11:12.122778297Z" level=info msg="StartContainer for \"bcddb1c50f9b871c2adf80e3dbd4babb0452968ca57e2e14eb4c429447862226\"" Sep 12 22:11:12.125191 containerd[1538]: time="2025-09-12T22:11:12.125153052Z" level=info msg="connecting to shim bcddb1c50f9b871c2adf80e3dbd4babb0452968ca57e2e14eb4c429447862226" address="unix:///run/containerd/s/b14752036548c8c3583cc732b99f98b523d864d0bfe33926975402ef188f1df0" protocol=ttrpc version=3 Sep 12 22:11:12.127123 systemd[1]: Started cri-containerd-3c8fd05f1b68257047b6c1ece71967ec226d28f31054bc127231852b6dee3654.scope - libcontainer container 3c8fd05f1b68257047b6c1ece71967ec226d28f31054bc127231852b6dee3654. Sep 12 22:11:12.140157 systemd[1]: Started cri-containerd-84ec1d46df504d71af42e10e9d8870cc37c4efbe1a0a23e1059cf59522a204ca.scope - libcontainer container 84ec1d46df504d71af42e10e9d8870cc37c4efbe1a0a23e1059cf59522a204ca. Sep 12 22:11:12.143590 systemd[1]: Started cri-containerd-bcddb1c50f9b871c2adf80e3dbd4babb0452968ca57e2e14eb4c429447862226.scope - libcontainer container bcddb1c50f9b871c2adf80e3dbd4babb0452968ca57e2e14eb4c429447862226. 
Sep 12 22:11:12.163536 kubelet[2318]: I0912 22:11:12.163221 2318 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 22:11:12.163651 kubelet[2318]: E0912 22:11:12.163587 2318 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.61:6443/api/v1/nodes\": dial tcp 10.0.0.61:6443: connect: connection refused" node="localhost" Sep 12 22:11:12.196792 containerd[1538]: time="2025-09-12T22:11:12.196726278Z" level=info msg="StartContainer for \"3c8fd05f1b68257047b6c1ece71967ec226d28f31054bc127231852b6dee3654\" returns successfully" Sep 12 22:11:12.197482 containerd[1538]: time="2025-09-12T22:11:12.197365873Z" level=info msg="StartContainer for \"84ec1d46df504d71af42e10e9d8870cc37c4efbe1a0a23e1059cf59522a204ca\" returns successfully" Sep 12 22:11:12.198567 containerd[1538]: time="2025-09-12T22:11:12.198533271Z" level=info msg="StartContainer for \"bcddb1c50f9b871c2adf80e3dbd4babb0452968ca57e2e14eb4c429447862226\" returns successfully" Sep 12 22:11:12.323585 kubelet[2318]: E0912 22:11:12.323421 2318 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:11:12.325858 kubelet[2318]: E0912 22:11:12.325829 2318 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:11:12.329874 kubelet[2318]: E0912 22:11:12.329844 2318 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:11:12.965686 kubelet[2318]: I0912 22:11:12.965636 2318 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 22:11:13.330849 kubelet[2318]: E0912 22:11:13.330445 2318 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:11:13.330849 kubelet[2318]: E0912 22:11:13.330201 2318 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:11:13.769210 kubelet[2318]: E0912 22:11:13.768898 2318 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 12 22:11:13.850043 kubelet[2318]: I0912 22:11:13.849980 2318 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 22:11:13.850043 kubelet[2318]: E0912 22:11:13.850028 2318 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 12 22:11:13.893973 kubelet[2318]: I0912 22:11:13.893848 2318 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 22:11:13.901033 kubelet[2318]: E0912 22:11:13.900990 2318 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 12 22:11:13.901033 kubelet[2318]: I0912 22:11:13.901024 2318 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 22:11:13.903331 kubelet[2318]: E0912 22:11:13.903272 2318 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with 
name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 12 22:11:13.903331 kubelet[2318]: I0912 22:11:13.903307 2318 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:13.907057 kubelet[2318]: E0912 22:11:13.906994 2318 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:14.285684 kubelet[2318]: I0912 22:11:14.285643 2318 apiserver.go:52] "Watching apiserver" Sep 12 22:11:14.294677 kubelet[2318]: I0912 22:11:14.294624 2318 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 22:11:14.331500 kubelet[2318]: I0912 22:11:14.331254 2318 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 22:11:14.333630 kubelet[2318]: E0912 22:11:14.333431 2318 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 12 22:11:16.102132 systemd[1]: Reload requested from client PID 2590 ('systemctl') (unit session-7.scope)... Sep 12 22:11:16.102150 systemd[1]: Reloading... Sep 12 22:11:16.177994 zram_generator::config[2634]: No configuration found. Sep 12 22:11:16.350093 systemd[1]: Reloading finished in 247 ms. Sep 12 22:11:16.370604 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:11:16.387395 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 22:11:16.387649 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:11:16.387709 systemd[1]: kubelet.service: Consumed 2.004s CPU time, 128M memory peak. Sep 12 22:11:16.389620 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:11:16.526282 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:11:16.540308 (kubelet)[2675]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 22:11:16.584976 kubelet[2675]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:11:16.584976 kubelet[2675]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 22:11:16.584976 kubelet[2675]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 12 22:11:16.584976 kubelet[2675]: I0912 22:11:16.584844 2675 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 22:11:16.591941 kubelet[2675]: I0912 22:11:16.591601 2675 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 22:11:16.591941 kubelet[2675]: I0912 22:11:16.591637 2675 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 22:11:16.592102 kubelet[2675]: I0912 22:11:16.591909 2675 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 22:11:16.593625 kubelet[2675]: I0912 22:11:16.593595 2675 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 12 22:11:16.596230 kubelet[2675]: I0912 22:11:16.596188 2675 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 22:11:16.600153 kubelet[2675]: I0912 22:11:16.600130 2675 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 22:11:16.604301 kubelet[2675]: I0912 22:11:16.604263 2675 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 12 22:11:16.604518 kubelet[2675]: I0912 22:11:16.604480 2675 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 22:11:16.604785 kubelet[2675]: I0912 22:11:16.604520 2675 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 22:11:16.604877 kubelet[2675]: I0912 22:11:16.604789 2675 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 22:11:16.604877 kubelet[2675]: I0912 22:11:16.604799 2675 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 22:11:16.604877 kubelet[2675]: I0912 22:11:16.604844 2675 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:11:16.605074 kubelet[2675]: I0912 
22:11:16.605051 2675 kubelet.go:446] "Attempting to sync node with API server" Sep 12 22:11:16.606375 kubelet[2675]: I0912 22:11:16.606338 2675 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 22:11:16.606437 kubelet[2675]: I0912 22:11:16.606396 2675 kubelet.go:352] "Adding apiserver pod source" Sep 12 22:11:16.606437 kubelet[2675]: I0912 22:11:16.606410 2675 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 22:11:16.611141 kubelet[2675]: I0912 22:11:16.611113 2675 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 22:11:16.611776 kubelet[2675]: I0912 22:11:16.611758 2675 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 22:11:16.612364 kubelet[2675]: I0912 22:11:16.612343 2675 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 22:11:16.612473 kubelet[2675]: I0912 22:11:16.612462 2675 server.go:1287] "Started kubelet" Sep 12 22:11:16.613210 kubelet[2675]: I0912 22:11:16.613162 2675 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 22:11:16.613397 kubelet[2675]: I0912 22:11:16.613356 2675 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 22:11:16.613545 kubelet[2675]: I0912 22:11:16.613518 2675 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 22:11:16.614245 kubelet[2675]: I0912 22:11:16.614217 2675 server.go:479] "Adding debug handlers to kubelet server" Sep 12 22:11:16.614928 kubelet[2675]: I0912 22:11:16.614762 2675 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 22:11:16.618018 kubelet[2675]: I0912 22:11:16.617984 2675 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 22:11:16.620410 kubelet[2675]: E0912 22:11:16.620360 2675 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 22:11:16.620801 kubelet[2675]: I0912 22:11:16.620555 2675 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 22:11:16.620801 kubelet[2675]: I0912 22:11:16.620748 2675 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 22:11:16.621262 kubelet[2675]: I0912 22:11:16.621158 2675 reconciler.go:26] "Reconciler: start to sync state" Sep 12 22:11:16.622410 kubelet[2675]: E0912 22:11:16.622362 2675 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 22:11:16.629477 kubelet[2675]: I0912 22:11:16.629438 2675 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 22:11:16.634090 kubelet[2675]: I0912 22:11:16.634054 2675 factory.go:221] Registration of the containerd container factory successfully Sep 12 22:11:16.634090 kubelet[2675]: I0912 22:11:16.634089 2675 factory.go:221] Registration of the systemd container factory successfully Sep 12 22:11:16.647423 kubelet[2675]: I0912 22:11:16.647366 2675 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 22:11:16.648522 kubelet[2675]: I0912 22:11:16.648490 2675 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 12 22:11:16.648522 kubelet[2675]: I0912 22:11:16.648517 2675 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 22:11:16.648704 kubelet[2675]: I0912 22:11:16.648536 2675 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 12 22:11:16.648704 kubelet[2675]: I0912 22:11:16.648543 2675 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 22:11:16.648704 kubelet[2675]: E0912 22:11:16.648582 2675 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 22:11:16.672211 kubelet[2675]: I0912 22:11:16.672181 2675 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 22:11:16.672211 kubelet[2675]: I0912 22:11:16.672202 2675 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 22:11:16.672365 kubelet[2675]: I0912 22:11:16.672225 2675 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:11:16.672434 kubelet[2675]: I0912 22:11:16.672411 2675 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 22:11:16.672476 kubelet[2675]: I0912 22:11:16.672431 2675 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 22:11:16.672476 kubelet[2675]: I0912 22:11:16.672452 2675 policy_none.go:49] "None policy: Start" Sep 12 22:11:16.672476 kubelet[2675]: I0912 22:11:16.672466 2675 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 22:11:16.672476 kubelet[2675]: I0912 22:11:16.672477 2675 state_mem.go:35] "Initializing new in-memory state store" Sep 12 22:11:16.672599 kubelet[2675]: I0912 22:11:16.672587 2675 state_mem.go:75] "Updated machine memory state" Sep 12 22:11:16.677066 kubelet[2675]: I0912 22:11:16.677022 2675 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 22:11:16.677455 kubelet[2675]: I0912 22:11:16.677227 2675 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 22:11:16.677455 kubelet[2675]: I0912 22:11:16.677249 2675 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 22:11:16.677455 kubelet[2675]: I0912 22:11:16.677453 2675 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 22:11:16.678384 kubelet[2675]: E0912 22:11:16.678362 2675 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 12 22:11:16.749228 kubelet[2675]: I0912 22:11:16.749174 2675 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 22:11:16.749439 kubelet[2675]: I0912 22:11:16.749407 2675 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:16.749496 kubelet[2675]: I0912 22:11:16.749442 2675 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 22:11:16.779288 kubelet[2675]: I0912 22:11:16.779163 2675 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 22:11:16.791579 kubelet[2675]: I0912 22:11:16.791327 2675 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 12 22:11:16.791579 kubelet[2675]: I0912 22:11:16.791423 2675 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 22:11:16.822397 kubelet[2675]: I0912 22:11:16.822359 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:16.822397 kubelet[2675]: I0912 22:11:16.822396 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:16.822397 kubelet[2675]: I0912 22:11:16.822417 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4e2fffe60e4b8ad4081aa4b1235f3c8c-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"4e2fffe60e4b8ad4081aa4b1235f3c8c\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:11:16.822963 kubelet[2675]: I0912 22:11:16.822437 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/72a30db4fc25e4da65a3b99eba43be94-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"72a30db4fc25e4da65a3b99eba43be94\") " pod="kube-system/kube-scheduler-localhost" Sep 12 22:11:16.822963 kubelet[2675]: I0912 22:11:16.822545 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4e2fffe60e4b8ad4081aa4b1235f3c8c-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"4e2fffe60e4b8ad4081aa4b1235f3c8c\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:11:16.822963 kubelet[2675]: I0912 22:11:16.822605 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4e2fffe60e4b8ad4081aa4b1235f3c8c-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"4e2fffe60e4b8ad4081aa4b1235f3c8c\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:11:16.822963 kubelet[2675]: I0912 22:11:16.822631 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:16.822963 kubelet[2675]: I0912 22:11:16.822648 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:16.823339 kubelet[2675]: I0912 22:11:16.822671 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:11:17.608669 kubelet[2675]: I0912 22:11:17.607383 2675 apiserver.go:52] "Watching apiserver" Sep 12 22:11:17.620833 kubelet[2675]: I0912 22:11:17.620800 2675 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 22:11:17.659267 kubelet[2675]: I0912 22:11:17.659138 2675 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 22:11:17.664946 kubelet[2675]: E0912 22:11:17.664778 2675 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 12 22:11:17.684450 kubelet[2675]: I0912 22:11:17.684302 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.684287066 podStartE2EDuration="1.684287066s" podCreationTimestamp="2025-09-12 22:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:11:17.682890302 +0000 UTC m=+1.136903265" watchObservedRunningTime="2025-09-12 22:11:17.684287066 +0000 UTC m=+1.138300069" Sep 12 22:11:17.690706 kubelet[2675]: I0912 22:11:17.690643 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.690593097 podStartE2EDuration="1.690593097s" podCreationTimestamp="2025-09-12 22:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:11:17.690309533 +0000 UTC m=+1.144322537" watchObservedRunningTime="2025-09-12 22:11:17.690593097 +0000 UTC m=+1.144606100" Sep 12 22:11:17.698929 kubelet[2675]: I0912 22:11:17.698827 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.6988139100000001 podStartE2EDuration="1.69881391s" podCreationTimestamp="2025-09-12 22:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:11:17.698783532 +0000 UTC m=+1.152796535" watchObservedRunningTime="2025-09-12 22:11:17.69881391 +0000 UTC m=+1.152826873" Sep 12 22:11:22.739092 kubelet[2675]: I0912 22:11:22.739042 2675 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 
22:11:22.739451 containerd[1538]: time="2025-09-12T22:11:22.739358379Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 22:11:22.739631 kubelet[2675]: I0912 22:11:22.739518 2675 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 22:11:23.512372 systemd[1]: Created slice kubepods-besteffort-pod1d901530_e186_45e2_a8c6_7a431f319d8c.slice - libcontainer container kubepods-besteffort-pod1d901530_e186_45e2_a8c6_7a431f319d8c.slice. Sep 12 22:11:23.567925 kubelet[2675]: I0912 22:11:23.567864 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1d901530-e186-45e2-a8c6-7a431f319d8c-xtables-lock\") pod \"kube-proxy-mvdp4\" (UID: \"1d901530-e186-45e2-a8c6-7a431f319d8c\") " pod="kube-system/kube-proxy-mvdp4" Sep 12 22:11:23.567925 kubelet[2675]: I0912 22:11:23.567907 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d901530-e186-45e2-a8c6-7a431f319d8c-lib-modules\") pod \"kube-proxy-mvdp4\" (UID: \"1d901530-e186-45e2-a8c6-7a431f319d8c\") " pod="kube-system/kube-proxy-mvdp4" Sep 12 22:11:23.568109 kubelet[2675]: I0912 22:11:23.567939 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv2qg\" (UniqueName: \"kubernetes.io/projected/1d901530-e186-45e2-a8c6-7a431f319d8c-kube-api-access-pv2qg\") pod \"kube-proxy-mvdp4\" (UID: \"1d901530-e186-45e2-a8c6-7a431f319d8c\") " pod="kube-system/kube-proxy-mvdp4" Sep 12 22:11:23.568109 kubelet[2675]: I0912 22:11:23.567955 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/1d901530-e186-45e2-a8c6-7a431f319d8c-kube-proxy\") pod \"kube-proxy-mvdp4\" (UID: \"1d901530-e186-45e2-a8c6-7a431f319d8c\") " pod="kube-system/kube-proxy-mvdp4" Sep 12 22:11:23.824505 containerd[1538]: time="2025-09-12T22:11:23.824386431Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mvdp4,Uid:1d901530-e186-45e2-a8c6-7a431f319d8c,Namespace:kube-system,Attempt:0,}" Sep 12 22:11:24.228621 containerd[1538]: time="2025-09-12T22:11:24.228058755Z" level=info msg="connecting to shim 97007638827029976dba1cd05237ba955912839d5f27680a0ff064b64f82e3e6" address="unix:///run/containerd/s/bfdc2af62e8868d2e9fc90a458916881d4ad6be5f1c0a936bb037ff184169d98" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:24.242022 systemd[1]: Created slice kubepods-besteffort-poda4f61223_e23e_49f0_bb67_faaa28907b21.slice - libcontainer container kubepods-besteffort-poda4f61223_e23e_49f0_bb67_faaa28907b21.slice. Sep 12 22:11:24.258161 systemd[1]: Started cri-containerd-97007638827029976dba1cd05237ba955912839d5f27680a0ff064b64f82e3e6.scope - libcontainer container 97007638827029976dba1cd05237ba955912839d5f27680a0ff064b64f82e3e6. 
Sep 12 22:11:24.270461 kubelet[2675]: I0912 22:11:24.270202 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a4f61223-e23e-49f0-bb67-faaa28907b21-var-lib-calico\") pod \"tigera-operator-755d956888-gh2l9\" (UID: \"a4f61223-e23e-49f0-bb67-faaa28907b21\") " pod="tigera-operator/tigera-operator-755d956888-gh2l9" Sep 12 22:11:24.270461 kubelet[2675]: I0912 22:11:24.270245 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kfm7\" (UniqueName: \"kubernetes.io/projected/a4f61223-e23e-49f0-bb67-faaa28907b21-kube-api-access-8kfm7\") pod \"tigera-operator-755d956888-gh2l9\" (UID: \"a4f61223-e23e-49f0-bb67-faaa28907b21\") " pod="tigera-operator/tigera-operator-755d956888-gh2l9" Sep 12 22:11:24.281543 containerd[1538]: time="2025-09-12T22:11:24.281498049Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mvdp4,Uid:1d901530-e186-45e2-a8c6-7a431f319d8c,Namespace:kube-system,Attempt:0,} returns sandbox id \"97007638827029976dba1cd05237ba955912839d5f27680a0ff064b64f82e3e6\"" Sep 12 22:11:24.286155 containerd[1538]: time="2025-09-12T22:11:24.286115634Z" level=info msg="CreateContainer within sandbox \"97007638827029976dba1cd05237ba955912839d5f27680a0ff064b64f82e3e6\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 22:11:24.295528 containerd[1538]: time="2025-09-12T22:11:24.295487943Z" level=info msg="Container 6243692e996b8bb5edfc0d3cbe627455860803824b28a85f0d11d8a3bb0d1917: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:24.296517 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2299994569.mount: Deactivated successfully. Sep 12 22:11:24.303205 containerd[1538]: time="2025-09-12T22:11:24.303169846Z" level=info msg="CreateContainer within sandbox \"97007638827029976dba1cd05237ba955912839d5f27680a0ff064b64f82e3e6\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"6243692e996b8bb5edfc0d3cbe627455860803824b28a85f0d11d8a3bb0d1917\"" Sep 12 22:11:24.303717 containerd[1538]: time="2025-09-12T22:11:24.303697272Z" level=info msg="StartContainer for \"6243692e996b8bb5edfc0d3cbe627455860803824b28a85f0d11d8a3bb0d1917\"" Sep 12 22:11:24.305791 containerd[1538]: time="2025-09-12T22:11:24.305087430Z" level=info msg="connecting to shim 6243692e996b8bb5edfc0d3cbe627455860803824b28a85f0d11d8a3bb0d1917" address="unix:///run/containerd/s/bfdc2af62e8868d2e9fc90a458916881d4ad6be5f1c0a936bb037ff184169d98" protocol=ttrpc version=3 Sep 12 22:11:24.324119 systemd[1]: Started cri-containerd-6243692e996b8bb5edfc0d3cbe627455860803824b28a85f0d11d8a3bb0d1917.scope - libcontainer container 6243692e996b8bb5edfc0d3cbe627455860803824b28a85f0d11d8a3bb0d1917. 
Sep 12 22:11:24.358683 containerd[1538]: time="2025-09-12T22:11:24.358647896Z" level=info msg="StartContainer for \"6243692e996b8bb5edfc0d3cbe627455860803824b28a85f0d11d8a3bb0d1917\" returns successfully" Sep 12 22:11:24.545731 containerd[1538]: time="2025-09-12T22:11:24.545626999Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-gh2l9,Uid:a4f61223-e23e-49f0-bb67-faaa28907b21,Namespace:tigera-operator,Attempt:0,}" Sep 12 22:11:24.560456 containerd[1538]: time="2025-09-12T22:11:24.560406793Z" level=info msg="connecting to shim 63813d8f6eab1dfac004bc32d7185902d4c88858128325734986a3b0c2c508ef" address="unix:///run/containerd/s/ff6a5220b94dad2be53f25ddb69bf8a0d12ee969fd5c533c539a24744eeb33a6" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:24.584058 systemd[1]: Started cri-containerd-63813d8f6eab1dfac004bc32d7185902d4c88858128325734986a3b0c2c508ef.scope - libcontainer container 63813d8f6eab1dfac004bc32d7185902d4c88858128325734986a3b0c2c508ef. Sep 12 22:11:24.612926 containerd[1538]: time="2025-09-12T22:11:24.612820646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-gh2l9,Uid:a4f61223-e23e-49f0-bb67-faaa28907b21,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"63813d8f6eab1dfac004bc32d7185902d4c88858128325734986a3b0c2c508ef\"" Sep 12 22:11:24.624571 containerd[1538]: time="2025-09-12T22:11:24.624522357Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 22:11:26.484090 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3034091526.mount: Deactivated successfully. Sep 12 22:11:26.591843 kubelet[2675]: I0912 22:11:26.591776 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-mvdp4" podStartSLOduration=3.591758833 podStartE2EDuration="3.591758833s" podCreationTimestamp="2025-09-12 22:11:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:11:24.696867698 +0000 UTC m=+8.150880701" watchObservedRunningTime="2025-09-12 22:11:26.591758833 +0000 UTC m=+10.045771796" Sep 12 22:11:26.917709 containerd[1538]: time="2025-09-12T22:11:26.917610278Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:26.918739 containerd[1538]: time="2025-09-12T22:11:26.918556523Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 12 22:11:26.919410 containerd[1538]: time="2025-09-12T22:11:26.919385844Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:26.921514 containerd[1538]: time="2025-09-12T22:11:26.921482053Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:26.922242 containerd[1538]: time="2025-09-12T22:11:26.922197929Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.297626592s" Sep 12 22:11:26.922242 
containerd[1538]: time="2025-09-12T22:11:26.922227460Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 12 22:11:26.926026 containerd[1538]: time="2025-09-12T22:11:26.926003918Z" level=info msg="CreateContainer within sandbox \"63813d8f6eab1dfac004bc32d7185902d4c88858128325734986a3b0c2c508ef\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 22:11:26.936086 containerd[1538]: time="2025-09-12T22:11:26.936047354Z" level=info msg="Container 623338681180bdbe9f32adf712f9cca5d7809621ae0720e49dc5f9a42572b5cc: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:26.940693 containerd[1538]: time="2025-09-12T22:11:26.940608154Z" level=info msg="CreateContainer within sandbox \"63813d8f6eab1dfac004bc32d7185902d4c88858128325734986a3b0c2c508ef\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"623338681180bdbe9f32adf712f9cca5d7809621ae0720e49dc5f9a42572b5cc\"" Sep 12 22:11:26.941661 containerd[1538]: time="2025-09-12T22:11:26.941073134Z" level=info msg="StartContainer for \"623338681180bdbe9f32adf712f9cca5d7809621ae0720e49dc5f9a42572b5cc\"" Sep 12 22:11:26.942634 containerd[1538]: time="2025-09-12T22:11:26.942602324Z" level=info msg="connecting to shim 623338681180bdbe9f32adf712f9cca5d7809621ae0720e49dc5f9a42572b5cc" address="unix:///run/containerd/s/ff6a5220b94dad2be53f25ddb69bf8a0d12ee969fd5c533c539a24744eeb33a6" protocol=ttrpc version=3 Sep 12 22:11:26.976069 systemd[1]: Started cri-containerd-623338681180bdbe9f32adf712f9cca5d7809621ae0720e49dc5f9a42572b5cc.scope - libcontainer container 623338681180bdbe9f32adf712f9cca5d7809621ae0720e49dc5f9a42572b5cc. Sep 12 22:11:27.008707 containerd[1538]: time="2025-09-12T22:11:27.008669288Z" level=info msg="StartContainer for \"623338681180bdbe9f32adf712f9cca5d7809621ae0720e49dc5f9a42572b5cc\" returns successfully" Sep 12 22:11:27.694621 kubelet[2675]: I0912 22:11:27.694567 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-gh2l9" podStartSLOduration=2.385273092 podStartE2EDuration="4.694553376s" podCreationTimestamp="2025-09-12 22:11:23 +0000 UTC" firstStartedPulling="2025-09-12 22:11:24.614665039 +0000 UTC m=+8.068678042" lastFinishedPulling="2025-09-12 22:11:26.923945323 +0000 UTC m=+10.377958326" observedRunningTime="2025-09-12 22:11:27.694148108 +0000 UTC m=+11.148161111" watchObservedRunningTime="2025-09-12 22:11:27.694553376 +0000 UTC m=+11.148566379" Sep 12 22:11:30.545591 update_engine[1512]: I20250912 22:11:30.545512 1512 update_attempter.cc:509] Updating boot flags... Sep 12 22:11:32.424943 sudo[1746]: pam_unix(sudo:session): session closed for user root Sep 12 22:11:32.427969 sshd[1745]: Connection closed by 10.0.0.1 port 54190 Sep 12 22:11:32.429235 sshd-session[1742]: pam_unix(sshd:session): session closed for user core Sep 12 22:11:32.434810 systemd-logind[1508]: Session 7 logged out. Waiting for processes to exit. Sep 12 22:11:32.436217 systemd[1]: sshd@6-10.0.0.61:22-10.0.0.1:54190.service: Deactivated successfully. Sep 12 22:11:32.439892 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 22:11:32.443014 systemd[1]: session-7.scope: Consumed 7.955s CPU time, 220.3M memory peak. Sep 12 22:11:32.446341 systemd-logind[1508]: Removed session 7. 
Sep 12 22:11:35.765146 systemd[1]: Created slice kubepods-besteffort-podba4291c9_1d39_4300_8fa6_5228f478dbdb.slice - libcontainer container kubepods-besteffort-podba4291c9_1d39_4300_8fa6_5228f478dbdb.slice. Sep 12 22:11:35.859377 kubelet[2675]: I0912 22:11:35.859207 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lhgs\" (UniqueName: \"kubernetes.io/projected/ba4291c9-1d39-4300-8fa6-5228f478dbdb-kube-api-access-9lhgs\") pod \"calico-typha-66dfcf49f9-2m4lg\" (UID: \"ba4291c9-1d39-4300-8fa6-5228f478dbdb\") " pod="calico-system/calico-typha-66dfcf49f9-2m4lg" Sep 12 22:11:35.859866 kubelet[2675]: I0912 22:11:35.859756 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba4291c9-1d39-4300-8fa6-5228f478dbdb-tigera-ca-bundle\") pod \"calico-typha-66dfcf49f9-2m4lg\" (UID: \"ba4291c9-1d39-4300-8fa6-5228f478dbdb\") " pod="calico-system/calico-typha-66dfcf49f9-2m4lg" Sep 12 22:11:35.859866 kubelet[2675]: I0912 22:11:35.859805 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/ba4291c9-1d39-4300-8fa6-5228f478dbdb-typha-certs\") pod \"calico-typha-66dfcf49f9-2m4lg\" (UID: \"ba4291c9-1d39-4300-8fa6-5228f478dbdb\") " pod="calico-system/calico-typha-66dfcf49f9-2m4lg" Sep 12 22:11:35.885305 systemd[1]: Created slice kubepods-besteffort-pod0f56cb82_04c7_41f9_9a76_abfc37e6d673.slice - libcontainer container kubepods-besteffort-pod0f56cb82_04c7_41f9_9a76_abfc37e6d673.slice. Sep 12 22:11:35.960660 kubelet[2675]: I0912 22:11:35.960611 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/0f56cb82-04c7-41f9-9a76-abfc37e6d673-policysync\") pod \"calico-node-8d8ll\" (UID: \"0f56cb82-04c7-41f9-9a76-abfc37e6d673\") " pod="calico-system/calico-node-8d8ll" Sep 12 22:11:35.960660 kubelet[2675]: I0912 22:11:35.960687 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f56cb82-04c7-41f9-9a76-abfc37e6d673-tigera-ca-bundle\") pod \"calico-node-8d8ll\" (UID: \"0f56cb82-04c7-41f9-9a76-abfc37e6d673\") " pod="calico-system/calico-node-8d8ll" Sep 12 22:11:35.961022 kubelet[2675]: I0912 22:11:35.960724 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/0f56cb82-04c7-41f9-9a76-abfc37e6d673-cni-bin-dir\") pod \"calico-node-8d8ll\" (UID: \"0f56cb82-04c7-41f9-9a76-abfc37e6d673\") " pod="calico-system/calico-node-8d8ll" Sep 12 22:11:35.961022 kubelet[2675]: I0912 22:11:35.960995 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/0f56cb82-04c7-41f9-9a76-abfc37e6d673-cni-log-dir\") pod \"calico-node-8d8ll\" (UID: \"0f56cb82-04c7-41f9-9a76-abfc37e6d673\") " pod="calico-system/calico-node-8d8ll" Sep 12 22:11:35.961172 kubelet[2675]: I0912 22:11:35.961117 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptdcp\" (UniqueName: \"kubernetes.io/projected/0f56cb82-04c7-41f9-9a76-abfc37e6d673-kube-api-access-ptdcp\") pod \"calico-node-8d8ll\" (UID: \"0f56cb82-04c7-41f9-9a76-abfc37e6d673\") " 
pod="calico-system/calico-node-8d8ll" Sep 12 22:11:35.961254 kubelet[2675]: I0912 22:11:35.961147 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/0f56cb82-04c7-41f9-9a76-abfc37e6d673-flexvol-driver-host\") pod \"calico-node-8d8ll\" (UID: \"0f56cb82-04c7-41f9-9a76-abfc37e6d673\") " pod="calico-system/calico-node-8d8ll" Sep 12 22:11:35.961348 kubelet[2675]: I0912 22:11:35.961314 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f56cb82-04c7-41f9-9a76-abfc37e6d673-lib-modules\") pod \"calico-node-8d8ll\" (UID: \"0f56cb82-04c7-41f9-9a76-abfc37e6d673\") " pod="calico-system/calico-node-8d8ll" Sep 12 22:11:35.961461 kubelet[2675]: I0912 22:11:35.961392 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/0f56cb82-04c7-41f9-9a76-abfc37e6d673-cni-net-dir\") pod \"calico-node-8d8ll\" (UID: \"0f56cb82-04c7-41f9-9a76-abfc37e6d673\") " pod="calico-system/calico-node-8d8ll" Sep 12 22:11:35.961461 kubelet[2675]: I0912 22:11:35.961410 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/0f56cb82-04c7-41f9-9a76-abfc37e6d673-node-certs\") pod \"calico-node-8d8ll\" (UID: \"0f56cb82-04c7-41f9-9a76-abfc37e6d673\") " pod="calico-system/calico-node-8d8ll" Sep 12 22:11:35.961594 kubelet[2675]: I0912 22:11:35.961536 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/0f56cb82-04c7-41f9-9a76-abfc37e6d673-var-run-calico\") pod \"calico-node-8d8ll\" (UID: \"0f56cb82-04c7-41f9-9a76-abfc37e6d673\") " pod="calico-system/calico-node-8d8ll" Sep 12 22:11:35.961594 kubelet[2675]: I0912 22:11:35.961563 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0f56cb82-04c7-41f9-9a76-abfc37e6d673-var-lib-calico\") pod \"calico-node-8d8ll\" (UID: \"0f56cb82-04c7-41f9-9a76-abfc37e6d673\") " pod="calico-system/calico-node-8d8ll" Sep 12 22:11:35.961594 kubelet[2675]: I0912 22:11:35.961577 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0f56cb82-04c7-41f9-9a76-abfc37e6d673-xtables-lock\") pod \"calico-node-8d8ll\" (UID: \"0f56cb82-04c7-41f9-9a76-abfc37e6d673\") " pod="calico-system/calico-node-8d8ll" Sep 12 22:11:36.063779 kubelet[2675]: E0912 22:11:36.063695 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.063779 kubelet[2675]: W0912 22:11:36.063718 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.065792 kubelet[2675]: E0912 22:11:36.065635 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.065792 kubelet[2675]: W0912 22:11:36.065728 2675 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.065792 kubelet[2675]: E0912 22:11:36.065756 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.066360 kubelet[2675]: E0912 22:11:36.066324 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.071460 containerd[1538]: time="2025-09-12T22:11:36.071411698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-66dfcf49f9-2m4lg,Uid:ba4291c9-1d39-4300-8fa6-5228f478dbdb,Namespace:calico-system,Attempt:0,}" Sep 12 22:11:36.074449 kubelet[2675]: E0912 22:11:36.074432 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.074449 kubelet[2675]: W0912 22:11:36.074448 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.074532 kubelet[2675]: E0912 22:11:36.074462 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.111644 kubelet[2675]: E0912 22:11:36.111266 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xhvwr" podUID="2f3109b1-2f5e-4b92-ad32-113a5ade0713" Sep 12 22:11:36.124088 containerd[1538]: time="2025-09-12T22:11:36.124040386Z" level=info msg="connecting to shim 2d71a23fbb74c3312706ce953354fca7fdbf6f17821176162e7fce475f62a2aa" address="unix:///run/containerd/s/636ef0787b7dc8ef9b64e1927f0c633816021b9ff2b1f648e7e4ce60a6ff507b" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:36.131950 kubelet[2675]: E0912 22:11:36.131743 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.131950 kubelet[2675]: W0912 22:11:36.131764 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.131950 kubelet[2675]: E0912 22:11:36.131785 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.131950 kubelet[2675]: E0912 22:11:36.131939 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.132255 kubelet[2675]: W0912 22:11:36.131947 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.132255 kubelet[2675]: E0912 22:11:36.131996 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.132385 kubelet[2675]: E0912 22:11:36.132368 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.132385 kubelet[2675]: W0912 22:11:36.132382 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.132442 kubelet[2675]: E0912 22:11:36.132394 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.132746 kubelet[2675]: E0912 22:11:36.132716 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.132746 kubelet[2675]: W0912 22:11:36.132729 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.132746 kubelet[2675]: E0912 22:11:36.132739 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.132928 kubelet[2675]: E0912 22:11:36.132903 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.132983 kubelet[2675]: W0912 22:11:36.132933 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.132983 kubelet[2675]: E0912 22:11:36.132942 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.133129 kubelet[2675]: E0912 22:11:36.133093 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.133129 kubelet[2675]: W0912 22:11:36.133118 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.133129 kubelet[2675]: E0912 22:11:36.133126 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.133333 kubelet[2675]: E0912 22:11:36.133269 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.133333 kubelet[2675]: W0912 22:11:36.133277 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.133333 kubelet[2675]: E0912 22:11:36.133285 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.133549 kubelet[2675]: E0912 22:11:36.133406 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.133549 kubelet[2675]: W0912 22:11:36.133413 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.133549 kubelet[2675]: E0912 22:11:36.133420 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.133549 kubelet[2675]: E0912 22:11:36.133540 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.133549 kubelet[2675]: W0912 22:11:36.133547 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.133664 kubelet[2675]: E0912 22:11:36.133555 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.133688 kubelet[2675]: E0912 22:11:36.133669 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.133688 kubelet[2675]: W0912 22:11:36.133677 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.133688 kubelet[2675]: E0912 22:11:36.133683 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.133810 kubelet[2675]: E0912 22:11:36.133794 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.133810 kubelet[2675]: W0912 22:11:36.133806 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.133943 kubelet[2675]: E0912 22:11:36.133813 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.133978 kubelet[2675]: E0912 22:11:36.133955 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.133978 kubelet[2675]: W0912 22:11:36.133963 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.133978 kubelet[2675]: E0912 22:11:36.133970 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.134210 kubelet[2675]: E0912 22:11:36.134176 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.134210 kubelet[2675]: W0912 22:11:36.134188 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.134210 kubelet[2675]: E0912 22:11:36.134209 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.134385 kubelet[2675]: E0912 22:11:36.134372 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.134385 kubelet[2675]: W0912 22:11:36.134382 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.134435 kubelet[2675]: E0912 22:11:36.134390 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.134520 kubelet[2675]: E0912 22:11:36.134509 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.134520 kubelet[2675]: W0912 22:11:36.134518 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.134582 kubelet[2675]: E0912 22:11:36.134526 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.134727 kubelet[2675]: E0912 22:11:36.134717 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.134757 kubelet[2675]: W0912 22:11:36.134729 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.134757 kubelet[2675]: E0912 22:11:36.134736 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.134890 kubelet[2675]: E0912 22:11:36.134877 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.134890 kubelet[2675]: W0912 22:11:36.134888 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.135080 kubelet[2675]: E0912 22:11:36.134896 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.135080 kubelet[2675]: E0912 22:11:36.135046 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.135080 kubelet[2675]: W0912 22:11:36.135054 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.135080 kubelet[2675]: E0912 22:11:36.135062 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.135301 kubelet[2675]: E0912 22:11:36.135177 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.135301 kubelet[2675]: W0912 22:11:36.135184 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.135301 kubelet[2675]: E0912 22:11:36.135199 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.135441 kubelet[2675]: E0912 22:11:36.135322 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.135441 kubelet[2675]: W0912 22:11:36.135329 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.135441 kubelet[2675]: E0912 22:11:36.135336 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.163938 kubelet[2675]: E0912 22:11:36.163505 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.163938 kubelet[2675]: W0912 22:11:36.163528 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.163938 kubelet[2675]: E0912 22:11:36.163549 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.163938 kubelet[2675]: I0912 22:11:36.163576 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2f3109b1-2f5e-4b92-ad32-113a5ade0713-socket-dir\") pod \"csi-node-driver-xhvwr\" (UID: \"2f3109b1-2f5e-4b92-ad32-113a5ade0713\") " pod="calico-system/csi-node-driver-xhvwr" Sep 12 22:11:36.163938 kubelet[2675]: E0912 22:11:36.163769 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.163938 kubelet[2675]: W0912 22:11:36.163779 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.163938 kubelet[2675]: E0912 22:11:36.163797 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.163938 kubelet[2675]: I0912 22:11:36.163813 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2f3109b1-2f5e-4b92-ad32-113a5ade0713-varrun\") pod \"csi-node-driver-xhvwr\" (UID: \"2f3109b1-2f5e-4b92-ad32-113a5ade0713\") " pod="calico-system/csi-node-driver-xhvwr" Sep 12 22:11:36.164510 kubelet[2675]: E0912 22:11:36.164486 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.164510 kubelet[2675]: W0912 22:11:36.164503 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.164582 kubelet[2675]: E0912 22:11:36.164529 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.164582 kubelet[2675]: I0912 22:11:36.164551 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2f3109b1-2f5e-4b92-ad32-113a5ade0713-registration-dir\") pod \"csi-node-driver-xhvwr\" (UID: \"2f3109b1-2f5e-4b92-ad32-113a5ade0713\") " pod="calico-system/csi-node-driver-xhvwr" Sep 12 22:11:36.165183 kubelet[2675]: E0912 22:11:36.165158 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.165183 kubelet[2675]: W0912 22:11:36.165176 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.165183 kubelet[2675]: E0912 22:11:36.165195 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.165183 kubelet[2675]: I0912 22:11:36.165223 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgnhx\" (UniqueName: \"kubernetes.io/projected/2f3109b1-2f5e-4b92-ad32-113a5ade0713-kube-api-access-wgnhx\") pod \"csi-node-driver-xhvwr\" (UID: \"2f3109b1-2f5e-4b92-ad32-113a5ade0713\") " pod="calico-system/csi-node-driver-xhvwr" Sep 12 22:11:36.166276 kubelet[2675]: E0912 22:11:36.166251 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.166276 kubelet[2675]: W0912 22:11:36.166270 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.166805 kubelet[2675]: E0912 22:11:36.166315 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.166805 kubelet[2675]: I0912 22:11:36.166650 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f3109b1-2f5e-4b92-ad32-113a5ade0713-kubelet-dir\") pod \"csi-node-driver-xhvwr\" (UID: \"2f3109b1-2f5e-4b92-ad32-113a5ade0713\") " pod="calico-system/csi-node-driver-xhvwr" Sep 12 22:11:36.167366 kubelet[2675]: E0912 22:11:36.167082 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.167366 kubelet[2675]: W0912 22:11:36.167101 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.167366 kubelet[2675]: E0912 22:11:36.167332 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.167938 kubelet[2675]: E0912 22:11:36.167867 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.167938 kubelet[2675]: W0912 22:11:36.167881 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.168024 kubelet[2675]: E0912 22:11:36.167980 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.169112 kubelet[2675]: E0912 22:11:36.169026 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.169112 kubelet[2675]: W0912 22:11:36.169051 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.169112 kubelet[2675]: E0912 22:11:36.169101 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.170090 kubelet[2675]: E0912 22:11:36.170058 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.170090 kubelet[2675]: W0912 22:11:36.170075 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.171270 kubelet[2675]: E0912 22:11:36.170135 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.171270 kubelet[2675]: E0912 22:11:36.170637 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.171270 kubelet[2675]: W0912 22:11:36.170651 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.171270 kubelet[2675]: E0912 22:11:36.170699 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.172043 kubelet[2675]: E0912 22:11:36.172017 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.172043 kubelet[2675]: W0912 22:11:36.172036 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.172131 kubelet[2675]: E0912 22:11:36.172051 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.172442 kubelet[2675]: E0912 22:11:36.172417 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.172442 kubelet[2675]: W0912 22:11:36.172434 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.172442 kubelet[2675]: E0912 22:11:36.172445 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.172777 kubelet[2675]: E0912 22:11:36.172759 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.172777 kubelet[2675]: W0912 22:11:36.172773 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.172856 kubelet[2675]: E0912 22:11:36.172783 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.173349 kubelet[2675]: E0912 22:11:36.173326 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.173349 kubelet[2675]: W0912 22:11:36.173343 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.173671 kubelet[2675]: E0912 22:11:36.173355 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.174087 kubelet[2675]: E0912 22:11:36.174064 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.174087 kubelet[2675]: W0912 22:11:36.174080 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.174171 kubelet[2675]: E0912 22:11:36.174092 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.189640 containerd[1538]: time="2025-09-12T22:11:36.189585820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8d8ll,Uid:0f56cb82-04c7-41f9-9a76-abfc37e6d673,Namespace:calico-system,Attempt:0,}" Sep 12 22:11:36.204156 systemd[1]: Started cri-containerd-2d71a23fbb74c3312706ce953354fca7fdbf6f17821176162e7fce475f62a2aa.scope - libcontainer container 2d71a23fbb74c3312706ce953354fca7fdbf6f17821176162e7fce475f62a2aa. Sep 12 22:11:36.265345 containerd[1538]: time="2025-09-12T22:11:36.265292794Z" level=info msg="connecting to shim 2045f48ed5ac62ff779a01aae4f63f0f8dc1ca135c2cab2096338430372c987b" address="unix:///run/containerd/s/c3d35a362611aa91e8a40054945e010fc4cce256bbca298288d9c0759f66cc66" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:36.269945 kubelet[2675]: E0912 22:11:36.268195 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.269945 kubelet[2675]: W0912 22:11:36.268217 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.269945 kubelet[2675]: E0912 22:11:36.268253 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.269945 kubelet[2675]: E0912 22:11:36.268954 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.269945 kubelet[2675]: W0912 22:11:36.268967 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.269945 kubelet[2675]: E0912 22:11:36.268988 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.269945 kubelet[2675]: E0912 22:11:36.269273 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.269945 kubelet[2675]: W0912 22:11:36.269289 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.269945 kubelet[2675]: E0912 22:11:36.269320 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.270234 kubelet[2675]: E0912 22:11:36.269996 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.270234 kubelet[2675]: W0912 22:11:36.270009 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.270234 kubelet[2675]: E0912 22:11:36.270023 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.271951 kubelet[2675]: E0912 22:11:36.271196 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.271951 kubelet[2675]: W0912 22:11:36.271228 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.271951 kubelet[2675]: E0912 22:11:36.271248 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.271951 kubelet[2675]: E0912 22:11:36.271733 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.271951 kubelet[2675]: W0912 22:11:36.271745 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.271951 kubelet[2675]: E0912 22:11:36.271813 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.272215 kubelet[2675]: E0912 22:11:36.272046 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.272215 kubelet[2675]: W0912 22:11:36.272062 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.272215 kubelet[2675]: E0912 22:11:36.272108 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.272405 kubelet[2675]: E0912 22:11:36.272386 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.272405 kubelet[2675]: W0912 22:11:36.272398 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.272405 kubelet[2675]: E0912 22:11:36.272413 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.272889 kubelet[2675]: E0912 22:11:36.272871 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.272889 kubelet[2675]: W0912 22:11:36.272885 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.273247 kubelet[2675]: E0912 22:11:36.272901 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.275958 kubelet[2675]: E0912 22:11:36.273615 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.275958 kubelet[2675]: W0912 22:11:36.273632 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.275958 kubelet[2675]: E0912 22:11:36.273818 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.275958 kubelet[2675]: E0912 22:11:36.274592 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.275958 kubelet[2675]: W0912 22:11:36.274605 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.275958 kubelet[2675]: E0912 22:11:36.274671 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.275958 kubelet[2675]: E0912 22:11:36.274960 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.275958 kubelet[2675]: W0912 22:11:36.274971 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.275958 kubelet[2675]: E0912 22:11:36.275029 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.275958 kubelet[2675]: E0912 22:11:36.275133 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.276260 kubelet[2675]: W0912 22:11:36.275142 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.276260 kubelet[2675]: E0912 22:11:36.275181 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.276260 kubelet[2675]: E0912 22:11:36.275649 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.276260 kubelet[2675]: W0912 22:11:36.275662 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.276260 kubelet[2675]: E0912 22:11:36.275747 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.276260 kubelet[2675]: E0912 22:11:36.275977 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.276260 kubelet[2675]: W0912 22:11:36.275989 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.276260 kubelet[2675]: E0912 22:11:36.276007 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.276673 kubelet[2675]: E0912 22:11:36.276633 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.276673 kubelet[2675]: W0912 22:11:36.276649 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.276782 kubelet[2675]: E0912 22:11:36.276689 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.277001 kubelet[2675]: E0912 22:11:36.276978 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.277001 kubelet[2675]: W0912 22:11:36.276992 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.277185 kubelet[2675]: E0912 22:11:36.277022 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.278990 kubelet[2675]: E0912 22:11:36.278157 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.278990 kubelet[2675]: W0912 22:11:36.278180 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.278990 kubelet[2675]: E0912 22:11:36.278257 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.278990 kubelet[2675]: E0912 22:11:36.278371 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.278990 kubelet[2675]: W0912 22:11:36.278381 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.278990 kubelet[2675]: E0912 22:11:36.278443 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.278990 kubelet[2675]: E0912 22:11:36.278709 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.278990 kubelet[2675]: W0912 22:11:36.278721 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.278990 kubelet[2675]: E0912 22:11:36.278807 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.279688 kubelet[2675]: E0912 22:11:36.279509 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.279688 kubelet[2675]: W0912 22:11:36.279526 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.279688 kubelet[2675]: E0912 22:11:36.279561 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.280160 kubelet[2675]: E0912 22:11:36.279839 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.280160 kubelet[2675]: W0912 22:11:36.279853 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.280160 kubelet[2675]: E0912 22:11:36.279909 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:36.280472 kubelet[2675]: E0912 22:11:36.280311 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.280472 kubelet[2675]: W0912 22:11:36.280331 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.280472 kubelet[2675]: E0912 22:11:36.280348 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.280656 kubelet[2675]: E0912 22:11:36.280553 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.280656 kubelet[2675]: W0912 22:11:36.280566 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.280656 kubelet[2675]: E0912 22:11:36.280585 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.280832 kubelet[2675]: E0912 22:11:36.280760 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.280832 kubelet[2675]: W0912 22:11:36.280770 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.280832 kubelet[2675]: E0912 22:11:36.280779 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.304178 kubelet[2675]: E0912 22:11:36.304137 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:36.304178 kubelet[2675]: W0912 22:11:36.304157 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:36.304178 kubelet[2675]: E0912 22:11:36.304177 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:36.316126 systemd[1]: Started cri-containerd-2045f48ed5ac62ff779a01aae4f63f0f8dc1ca135c2cab2096338430372c987b.scope - libcontainer container 2045f48ed5ac62ff779a01aae4f63f0f8dc1ca135c2cab2096338430372c987b. 
Sep 12 22:11:36.360263 containerd[1538]: time="2025-09-12T22:11:36.360178780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-66dfcf49f9-2m4lg,Uid:ba4291c9-1d39-4300-8fa6-5228f478dbdb,Namespace:calico-system,Attempt:0,} returns sandbox id \"2d71a23fbb74c3312706ce953354fca7fdbf6f17821176162e7fce475f62a2aa\"" Sep 12 22:11:36.365370 containerd[1538]: time="2025-09-12T22:11:36.365147584Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 22:11:36.372604 containerd[1538]: time="2025-09-12T22:11:36.372564521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8d8ll,Uid:0f56cb82-04c7-41f9-9a76-abfc37e6d673,Namespace:calico-system,Attempt:0,} returns sandbox id \"2045f48ed5ac62ff779a01aae4f63f0f8dc1ca135c2cab2096338430372c987b\"" Sep 12 22:11:37.276712 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1069679356.mount: Deactivated successfully. Sep 12 22:11:37.653309 kubelet[2675]: E0912 22:11:37.653271 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xhvwr" podUID="2f3109b1-2f5e-4b92-ad32-113a5ade0713" Sep 12 22:11:37.949941 containerd[1538]: time="2025-09-12T22:11:37.949807792Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:37.951170 containerd[1538]: time="2025-09-12T22:11:37.950994778Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 12 22:11:37.951963 containerd[1538]: time="2025-09-12T22:11:37.951891338Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:37.953782 containerd[1538]: time="2025-09-12T22:11:37.953751355Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:37.954394 containerd[1538]: time="2025-09-12T22:11:37.954357410Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.589169377s" Sep 12 22:11:37.954394 containerd[1538]: time="2025-09-12T22:11:37.954387337Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 12 22:11:37.959286 containerd[1538]: time="2025-09-12T22:11:37.959258227Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 22:11:37.980170 containerd[1538]: time="2025-09-12T22:11:37.980140140Z" level=info msg="CreateContainer within sandbox \"2d71a23fbb74c3312706ce953354fca7fdbf6f17821176162e7fce475f62a2aa\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 22:11:37.991317 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2700264325.mount: Deactivated successfully. 
Sep 12 22:11:37.994344 containerd[1538]: time="2025-09-12T22:11:37.994310071Z" level=info msg="Container b0a422ac8ca4828d54d8ef4b665995cc827ed95c25b0304d7152e5c783b6e943: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:38.009027 containerd[1538]: time="2025-09-12T22:11:38.008983196Z" level=info msg="CreateContainer within sandbox \"2d71a23fbb74c3312706ce953354fca7fdbf6f17821176162e7fce475f62a2aa\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"b0a422ac8ca4828d54d8ef4b665995cc827ed95c25b0304d7152e5c783b6e943\"" Sep 12 22:11:38.009600 containerd[1538]: time="2025-09-12T22:11:38.009568481Z" level=info msg="StartContainer for \"b0a422ac8ca4828d54d8ef4b665995cc827ed95c25b0304d7152e5c783b6e943\"" Sep 12 22:11:38.010880 containerd[1538]: time="2025-09-12T22:11:38.010840873Z" level=info msg="connecting to shim b0a422ac8ca4828d54d8ef4b665995cc827ed95c25b0304d7152e5c783b6e943" address="unix:///run/containerd/s/636ef0787b7dc8ef9b64e1927f0c633816021b9ff2b1f648e7e4ce60a6ff507b" protocol=ttrpc version=3 Sep 12 22:11:38.031092 systemd[1]: Started cri-containerd-b0a422ac8ca4828d54d8ef4b665995cc827ed95c25b0304d7152e5c783b6e943.scope - libcontainer container b0a422ac8ca4828d54d8ef4b665995cc827ed95c25b0304d7152e5c783b6e943. Sep 12 22:11:38.073823 containerd[1538]: time="2025-09-12T22:11:38.073782261Z" level=info msg="StartContainer for \"b0a422ac8ca4828d54d8ef4b665995cc827ed95c25b0304d7152e5c783b6e943\" returns successfully" Sep 12 22:11:38.726052 kubelet[2675]: I0912 22:11:38.725701 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-66dfcf49f9-2m4lg" podStartSLOduration=2.129737225 podStartE2EDuration="3.725686114s" podCreationTimestamp="2025-09-12 22:11:35 +0000 UTC" firstStartedPulling="2025-09-12 22:11:36.361778235 +0000 UTC m=+19.815791238" lastFinishedPulling="2025-09-12 22:11:37.957727164 +0000 UTC m=+21.411740127" observedRunningTime="2025-09-12 22:11:38.725603697 +0000 UTC m=+22.179616700" watchObservedRunningTime="2025-09-12 22:11:38.725686114 +0000 UTC m=+22.179699117" Sep 12 22:11:38.751993 kubelet[2675]: E0912 22:11:38.751970 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.752397 kubelet[2675]: W0912 22:11:38.752275 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.752397 kubelet[2675]: E0912 22:11:38.752302 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.752574 kubelet[2675]: E0912 22:11:38.752561 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.752658 kubelet[2675]: W0912 22:11:38.752616 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.752721 kubelet[2675]: E0912 22:11:38.752709 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:38.752967 kubelet[2675]: E0912 22:11:38.752945 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.753039 kubelet[2675]: W0912 22:11:38.753027 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.753462 kubelet[2675]: E0912 22:11:38.753353 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.753574 kubelet[2675]: E0912 22:11:38.753561 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.753679 kubelet[2675]: W0912 22:11:38.753638 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.753755 kubelet[2675]: E0912 22:11:38.753742 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.754271 kubelet[2675]: E0912 22:11:38.754168 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.754271 kubelet[2675]: W0912 22:11:38.754183 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.754271 kubelet[2675]: E0912 22:11:38.754195 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.754417 kubelet[2675]: E0912 22:11:38.754405 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.754465 kubelet[2675]: W0912 22:11:38.754455 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.754616 kubelet[2675]: E0912 22:11:38.754523 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.755213 kubelet[2675]: E0912 22:11:38.755070 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.755213 kubelet[2675]: W0912 22:11:38.755086 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.755213 kubelet[2675]: E0912 22:11:38.755111 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:38.755393 kubelet[2675]: E0912 22:11:38.755379 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.755478 kubelet[2675]: W0912 22:11:38.755465 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.755551 kubelet[2675]: E0912 22:11:38.755535 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.755822 kubelet[2675]: E0912 22:11:38.755806 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.755960 kubelet[2675]: W0912 22:11:38.755943 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.756020 kubelet[2675]: E0912 22:11:38.756009 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.756448 kubelet[2675]: E0912 22:11:38.756432 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.756519 kubelet[2675]: W0912 22:11:38.756508 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.756645 kubelet[2675]: E0912 22:11:38.756630 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.756882 kubelet[2675]: E0912 22:11:38.756869 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.756961 kubelet[2675]: W0912 22:11:38.756949 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.757009 kubelet[2675]: E0912 22:11:38.757000 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.757685 kubelet[2675]: E0912 22:11:38.757533 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.757685 kubelet[2675]: W0912 22:11:38.757549 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.757685 kubelet[2675]: E0912 22:11:38.757560 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:38.758030 kubelet[2675]: E0912 22:11:38.757874 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.758030 kubelet[2675]: W0912 22:11:38.757888 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.758030 kubelet[2675]: E0912 22:11:38.757898 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.758241 kubelet[2675]: E0912 22:11:38.758226 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.758337 kubelet[2675]: W0912 22:11:38.758322 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.758414 kubelet[2675]: E0912 22:11:38.758401 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.758632 kubelet[2675]: E0912 22:11:38.758618 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.758694 kubelet[2675]: W0912 22:11:38.758683 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.758743 kubelet[2675]: E0912 22:11:38.758732 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.792066 kubelet[2675]: E0912 22:11:38.792042 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.792066 kubelet[2675]: W0912 22:11:38.792062 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.792271 kubelet[2675]: E0912 22:11:38.792078 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.792271 kubelet[2675]: E0912 22:11:38.792264 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.792271 kubelet[2675]: W0912 22:11:38.792271 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.792336 kubelet[2675]: E0912 22:11:38.792281 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:38.792444 kubelet[2675]: E0912 22:11:38.792432 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.792444 kubelet[2675]: W0912 22:11:38.792443 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.792496 kubelet[2675]: E0912 22:11:38.792451 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.792648 kubelet[2675]: E0912 22:11:38.792630 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.792686 kubelet[2675]: W0912 22:11:38.792646 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.792686 kubelet[2675]: E0912 22:11:38.792665 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.792809 kubelet[2675]: E0912 22:11:38.792797 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.792809 kubelet[2675]: W0912 22:11:38.792807 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.792870 kubelet[2675]: E0912 22:11:38.792820 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.792976 kubelet[2675]: E0912 22:11:38.792965 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.792976 kubelet[2675]: W0912 22:11:38.792975 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.793020 kubelet[2675]: E0912 22:11:38.792986 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.793165 kubelet[2675]: E0912 22:11:38.793155 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.793210 kubelet[2675]: W0912 22:11:38.793168 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.793210 kubelet[2675]: E0912 22:11:38.793181 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:38.793570 kubelet[2675]: E0912 22:11:38.793492 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.793570 kubelet[2675]: W0912 22:11:38.793509 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.793570 kubelet[2675]: E0912 22:11:38.793528 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.793831 kubelet[2675]: E0912 22:11:38.793820 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.793888 kubelet[2675]: W0912 22:11:38.793879 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.793994 kubelet[2675]: E0912 22:11:38.793970 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.794163 kubelet[2675]: E0912 22:11:38.794151 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.794257 kubelet[2675]: W0912 22:11:38.794212 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.794257 kubelet[2675]: E0912 22:11:38.794239 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.794457 kubelet[2675]: E0912 22:11:38.794445 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.794625 kubelet[2675]: W0912 22:11:38.794511 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.794625 kubelet[2675]: E0912 22:11:38.794534 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.794751 kubelet[2675]: E0912 22:11:38.794741 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.794811 kubelet[2675]: W0912 22:11:38.794801 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.794875 kubelet[2675]: E0912 22:11:38.794864 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:38.795058 kubelet[2675]: E0912 22:11:38.795046 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.795058 kubelet[2675]: W0912 22:11:38.795057 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.795143 kubelet[2675]: E0912 22:11:38.795073 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.795207 kubelet[2675]: E0912 22:11:38.795197 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.795207 kubelet[2675]: W0912 22:11:38.795206 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.795377 kubelet[2675]: E0912 22:11:38.795217 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.795459 kubelet[2675]: E0912 22:11:38.795448 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.795505 kubelet[2675]: W0912 22:11:38.795496 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.795672 kubelet[2675]: E0912 22:11:38.795559 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.795762 kubelet[2675]: E0912 22:11:38.795750 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.795762 kubelet[2675]: W0912 22:11:38.795760 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.795825 kubelet[2675]: E0912 22:11:38.795769 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:38.796066 kubelet[2675]: E0912 22:11:38.796054 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.796170 kubelet[2675]: W0912 22:11:38.796119 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.796170 kubelet[2675]: E0912 22:11:38.796143 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:11:38.796305 kubelet[2675]: E0912 22:11:38.796292 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:11:38.796333 kubelet[2675]: W0912 22:11:38.796305 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:11:38.796333 kubelet[2675]: E0912 22:11:38.796316 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:11:39.191979 containerd[1538]: time="2025-09-12T22:11:39.191406818Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:39.191979 containerd[1538]: time="2025-09-12T22:11:39.192700163Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 12 22:11:39.193693 containerd[1538]: time="2025-09-12T22:11:39.193660319Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:39.196386 containerd[1538]: time="2025-09-12T22:11:39.196345589Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:39.196886 containerd[1538]: time="2025-09-12T22:11:39.196839810Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.237545376s" Sep 12 22:11:39.196954 containerd[1538]: time="2025-09-12T22:11:39.196888460Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 12 22:11:39.199625 containerd[1538]: time="2025-09-12T22:11:39.199557047Z" level=info msg="CreateContainer within sandbox \"2045f48ed5ac62ff779a01aae4f63f0f8dc1ca135c2cab2096338430372c987b\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 22:11:39.215120 containerd[1538]: time="2025-09-12T22:11:39.215077625Z" level=info msg="Container 7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:39.227141 containerd[1538]: time="2025-09-12T22:11:39.227094566Z" level=info msg="CreateContainer within sandbox \"2045f48ed5ac62ff779a01aae4f63f0f8dc1ca135c2cab2096338430372c987b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb\"" Sep 12 22:11:39.229198 containerd[1538]: time="2025-09-12T22:11:39.227625635Z" level=info msg="StartContainer for \"7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb\"" Sep 12 22:11:39.229198 containerd[1538]: time="2025-09-12T22:11:39.228956787Z" level=info msg="connecting to 
shim 7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb" address="unix:///run/containerd/s/c3d35a362611aa91e8a40054945e010fc4cce256bbca298288d9c0759f66cc66" protocol=ttrpc version=3 Sep 12 22:11:39.259161 systemd[1]: Started cri-containerd-7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb.scope - libcontainer container 7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb. Sep 12 22:11:39.296036 containerd[1538]: time="2025-09-12T22:11:39.295899776Z" level=info msg="StartContainer for \"7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb\" returns successfully" Sep 12 22:11:39.310319 systemd[1]: cri-containerd-7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb.scope: Deactivated successfully. Sep 12 22:11:39.310969 systemd[1]: cri-containerd-7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb.scope: Consumed 30ms CPU time, 6.3M memory peak, 4.5M written to disk. Sep 12 22:11:39.337491 containerd[1538]: time="2025-09-12T22:11:39.337308136Z" level=info msg="received exit event container_id:\"7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb\" id:\"7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb\" pid:3374 exited_at:{seconds:1757715099 nanos:331534154}" Sep 12 22:11:39.337491 containerd[1538]: time="2025-09-12T22:11:39.337374150Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb\" id:\"7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb\" pid:3374 exited_at:{seconds:1757715099 nanos:331534154}" Sep 12 22:11:39.363671 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7298f568fd1250d03998fd919dbd3f78972113956eae847e620b286302c2e5bb-rootfs.mount: Deactivated successfully. 
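[editor's note] The long run of `driver-call.go` / `plugins.go` errors above comes from kubelet probing the FlexVolume directory `nodeagent~uds` before Calico's `flexvol-driver` init container (started just above) has installed the `uds` binary at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds: the exec fails, stdout is empty, and unmarshalling an empty string as JSON yields "unexpected end of JSON input". As a rough, hedged illustration only (this is not Calico's real `uds` driver), a FlexVolume driver is just an executable that answers `init` with a small JSON status object on stdout; a minimal stub could look like this:

```python
#!/usr/bin/env python3
# Illustrative FlexVolume driver stub (NOT Calico's actual "uds" binary).
# kubelet invokes the executable found under
#   /opt/libexec/kubernetes/kubelet-plugins/volume/exec/<vendor~driver>/<driver>
# and expects a JSON status object on stdout; an empty stdout is exactly what
# produces the "unexpected end of JSON input" probe errors seen in the log.
import json
import sys


def main() -> int:
    op = sys.argv[1] if len(sys.argv) > 1 else ""
    if op == "init":
        # Report success and advertise that no separate attach/detach step is needed.
        print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
        return 0
    # Anything this stub does not implement is reported as unsupported.
    print(json.dumps({"status": "Not supported",
                      "message": f"operation {op!r} not implemented"}))
    return 1


if __name__ == "__main__":
    sys.exit(main())
```

Once the `flexvol-driver` container has copied the real binary into place, the probe succeeds and these messages stop.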
Sep 12 22:11:39.648876 kubelet[2675]: E0912 22:11:39.648828 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xhvwr" podUID="2f3109b1-2f5e-4b92-ad32-113a5ade0713" Sep 12 22:11:39.720417 kubelet[2675]: I0912 22:11:39.720168 2675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:11:39.722103 containerd[1538]: time="2025-09-12T22:11:39.722070770Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 22:11:41.649441 kubelet[2675]: E0912 22:11:41.649395 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xhvwr" podUID="2f3109b1-2f5e-4b92-ad32-113a5ade0713" Sep 12 22:11:42.910965 containerd[1538]: time="2025-09-12T22:11:42.910904557Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:42.911778 containerd[1538]: time="2025-09-12T22:11:42.911733666Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 12 22:11:42.912372 containerd[1538]: time="2025-09-12T22:11:42.912329094Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:42.914359 containerd[1538]: time="2025-09-12T22:11:42.914330535Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:42.915472 containerd[1538]: time="2025-09-12T22:11:42.915435334Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.193329438s" Sep 12 22:11:42.915472 containerd[1538]: time="2025-09-12T22:11:42.915470821Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 12 22:11:42.917518 containerd[1538]: time="2025-09-12T22:11:42.917489225Z" level=info msg="CreateContainer within sandbox \"2045f48ed5ac62ff779a01aae4f63f0f8dc1ca135c2cab2096338430372c987b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 22:11:42.926459 containerd[1538]: time="2025-09-12T22:11:42.925893302Z" level=info msg="Container 740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:42.936436 containerd[1538]: time="2025-09-12T22:11:42.936389557Z" level=info msg="CreateContainer within sandbox \"2045f48ed5ac62ff779a01aae4f63f0f8dc1ca135c2cab2096338430372c987b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37\"" Sep 12 22:11:42.937065 containerd[1538]: 
time="2025-09-12T22:11:42.936993426Z" level=info msg="StartContainer for \"740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37\"" Sep 12 22:11:42.938684 containerd[1538]: time="2025-09-12T22:11:42.938598756Z" level=info msg="connecting to shim 740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37" address="unix:///run/containerd/s/c3d35a362611aa91e8a40054945e010fc4cce256bbca298288d9c0759f66cc66" protocol=ttrpc version=3 Sep 12 22:11:42.966092 systemd[1]: Started cri-containerd-740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37.scope - libcontainer container 740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37. Sep 12 22:11:43.007889 containerd[1538]: time="2025-09-12T22:11:43.007852928Z" level=info msg="StartContainer for \"740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37\" returns successfully" Sep 12 22:11:43.550227 systemd[1]: cri-containerd-740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37.scope: Deactivated successfully. Sep 12 22:11:43.551273 systemd[1]: cri-containerd-740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37.scope: Consumed 467ms CPU time, 174.7M memory peak, 2.1M read from disk, 165.8M written to disk. Sep 12 22:11:43.553162 containerd[1538]: time="2025-09-12T22:11:43.553123482Z" level=info msg="received exit event container_id:\"740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37\" id:\"740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37\" pid:3436 exited_at:{seconds:1757715103 nanos:552885841}" Sep 12 22:11:43.553273 containerd[1538]: time="2025-09-12T22:11:43.553224860Z" level=info msg="TaskExit event in podsandbox handler container_id:\"740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37\" id:\"740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37\" pid:3436 exited_at:{seconds:1757715103 nanos:552885841}" Sep 12 22:11:43.574358 kubelet[2675]: I0912 22:11:43.574328 2675 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 22:11:43.575359 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-740b8eccacd2b71f371253fdab8eeb8d29c326d50ed0a6fa418706eda0cd4d37-rootfs.mount: Deactivated successfully. Sep 12 22:11:43.696713 systemd[1]: Created slice kubepods-besteffort-pod2f3109b1_2f5e_4b92_ad32_113a5ade0713.slice - libcontainer container kubepods-besteffort-pod2f3109b1_2f5e_4b92_ad32_113a5ade0713.slice. Sep 12 22:11:43.702431 containerd[1538]: time="2025-09-12T22:11:43.702379285Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xhvwr,Uid:2f3109b1-2f5e-4b92-ad32-113a5ade0713,Namespace:calico-system,Attempt:0,}" Sep 12 22:11:43.737154 systemd[1]: Created slice kubepods-burstable-pod72d93281_1a6a_4911_bbcd_e5a207b001c1.slice - libcontainer container kubepods-burstable-pod72d93281_1a6a_4911_bbcd_e5a207b001c1.slice. Sep 12 22:11:43.752874 systemd[1]: Created slice kubepods-besteffort-podd3cfc81b_9d51_4ae0_897e_a4defc4863b6.slice - libcontainer container kubepods-besteffort-podd3cfc81b_9d51_4ae0_897e_a4defc4863b6.slice. Sep 12 22:11:43.762404 containerd[1538]: time="2025-09-12T22:11:43.762345683Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 22:11:43.767589 systemd[1]: Created slice kubepods-besteffort-podcb85746f_646b_4dcd_a803_9ab1edd631aa.slice - libcontainer container kubepods-besteffort-podcb85746f_646b_4dcd_a803_9ab1edd631aa.slice. 
Sep 12 22:11:43.784851 systemd[1]: Created slice kubepods-besteffort-podf1c7757e_adcd_466f_b82d_0cf29ffde430.slice - libcontainer container kubepods-besteffort-podf1c7757e_adcd_466f_b82d_0cf29ffde430.slice. Sep 12 22:11:43.795456 systemd[1]: Created slice kubepods-besteffort-pod442153cd_92fc_4680_bea8_90fcd4c79242.slice - libcontainer container kubepods-besteffort-pod442153cd_92fc_4680_bea8_90fcd4c79242.slice. Sep 12 22:11:43.801416 systemd[1]: Created slice kubepods-besteffort-pod622f24f0_2d6c_4a85_ba15_46c0870bbb8c.slice - libcontainer container kubepods-besteffort-pod622f24f0_2d6c_4a85_ba15_46c0870bbb8c.slice. Sep 12 22:11:43.808162 systemd[1]: Created slice kubepods-burstable-pod632413b7_a3b7_4949_8739_aa313e428d16.slice - libcontainer container kubepods-burstable-pod632413b7_a3b7_4949_8739_aa313e428d16.slice. Sep 12 22:11:43.830242 kubelet[2675]: I0912 22:11:43.830162 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/622f24f0-2d6c-4a85-ba15-46c0870bbb8c-config\") pod \"goldmane-54d579b49d-sh7cj\" (UID: \"622f24f0-2d6c-4a85-ba15-46c0870bbb8c\") " pod="calico-system/goldmane-54d579b49d-sh7cj" Sep 12 22:11:43.830242 kubelet[2675]: I0912 22:11:43.830209 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb85746f-646b-4dcd-a803-9ab1edd631aa-tigera-ca-bundle\") pod \"calico-kube-controllers-58b5c55b9-8kvns\" (UID: \"cb85746f-646b-4dcd-a803-9ab1edd631aa\") " pod="calico-system/calico-kube-controllers-58b5c55b9-8kvns" Sep 12 22:11:43.830242 kubelet[2675]: I0912 22:11:43.830248 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jdx8\" (UniqueName: \"kubernetes.io/projected/442153cd-92fc-4680-bea8-90fcd4c79242-kube-api-access-4jdx8\") pod \"whisker-57c949c5dd-nphhn\" (UID: \"442153cd-92fc-4680-bea8-90fcd4c79242\") " pod="calico-system/whisker-57c949c5dd-nphhn" Sep 12 22:11:43.831042 kubelet[2675]: I0912 22:11:43.830277 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/442153cd-92fc-4680-bea8-90fcd4c79242-whisker-ca-bundle\") pod \"whisker-57c949c5dd-nphhn\" (UID: \"442153cd-92fc-4680-bea8-90fcd4c79242\") " pod="calico-system/whisker-57c949c5dd-nphhn" Sep 12 22:11:43.831042 kubelet[2675]: I0912 22:11:43.830294 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vv9g\" (UniqueName: \"kubernetes.io/projected/f1c7757e-adcd-466f-b82d-0cf29ffde430-kube-api-access-4vv9g\") pod \"calico-apiserver-7d8f7d4b54-ml47p\" (UID: \"f1c7757e-adcd-466f-b82d-0cf29ffde430\") " pod="calico-apiserver/calico-apiserver-7d8f7d4b54-ml47p" Sep 12 22:11:43.831042 kubelet[2675]: I0912 22:11:43.830315 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm7kv\" (UniqueName: \"kubernetes.io/projected/cb85746f-646b-4dcd-a803-9ab1edd631aa-kube-api-access-zm7kv\") pod \"calico-kube-controllers-58b5c55b9-8kvns\" (UID: \"cb85746f-646b-4dcd-a803-9ab1edd631aa\") " pod="calico-system/calico-kube-controllers-58b5c55b9-8kvns" Sep 12 22:11:43.831042 kubelet[2675]: I0912 22:11:43.830333 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/f1c7757e-adcd-466f-b82d-0cf29ffde430-calico-apiserver-certs\") pod \"calico-apiserver-7d8f7d4b54-ml47p\" (UID: \"f1c7757e-adcd-466f-b82d-0cf29ffde430\") " pod="calico-apiserver/calico-apiserver-7d8f7d4b54-ml47p" Sep 12 22:11:43.831042 kubelet[2675]: I0912 22:11:43.830351 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdkms\" (UniqueName: \"kubernetes.io/projected/632413b7-a3b7-4949-8739-aa313e428d16-kube-api-access-kdkms\") pod \"coredns-668d6bf9bc-j667g\" (UID: \"632413b7-a3b7-4949-8739-aa313e428d16\") " pod="kube-system/coredns-668d6bf9bc-j667g" Sep 12 22:11:43.831172 kubelet[2675]: I0912 22:11:43.830381 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62swm\" (UniqueName: \"kubernetes.io/projected/622f24f0-2d6c-4a85-ba15-46c0870bbb8c-kube-api-access-62swm\") pod \"goldmane-54d579b49d-sh7cj\" (UID: \"622f24f0-2d6c-4a85-ba15-46c0870bbb8c\") " pod="calico-system/goldmane-54d579b49d-sh7cj" Sep 12 22:11:43.831172 kubelet[2675]: I0912 22:11:43.830399 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/442153cd-92fc-4680-bea8-90fcd4c79242-whisker-backend-key-pair\") pod \"whisker-57c949c5dd-nphhn\" (UID: \"442153cd-92fc-4680-bea8-90fcd4c79242\") " pod="calico-system/whisker-57c949c5dd-nphhn" Sep 12 22:11:43.831172 kubelet[2675]: I0912 22:11:43.830416 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/632413b7-a3b7-4949-8739-aa313e428d16-config-volume\") pod \"coredns-668d6bf9bc-j667g\" (UID: \"632413b7-a3b7-4949-8739-aa313e428d16\") " pod="kube-system/coredns-668d6bf9bc-j667g" Sep 12 22:11:43.831172 kubelet[2675]: I0912 22:11:43.830435 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72d93281-1a6a-4911-bbcd-e5a207b001c1-config-volume\") pod \"coredns-668d6bf9bc-5bs6c\" (UID: \"72d93281-1a6a-4911-bbcd-e5a207b001c1\") " pod="kube-system/coredns-668d6bf9bc-5bs6c" Sep 12 22:11:43.831172 kubelet[2675]: I0912 22:11:43.830464 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhtjj\" (UniqueName: \"kubernetes.io/projected/72d93281-1a6a-4911-bbcd-e5a207b001c1-kube-api-access-dhtjj\") pod \"coredns-668d6bf9bc-5bs6c\" (UID: \"72d93281-1a6a-4911-bbcd-e5a207b001c1\") " pod="kube-system/coredns-668d6bf9bc-5bs6c" Sep 12 22:11:43.831283 kubelet[2675]: I0912 22:11:43.830495 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccdhq\" (UniqueName: \"kubernetes.io/projected/d3cfc81b-9d51-4ae0-897e-a4defc4863b6-kube-api-access-ccdhq\") pod \"calico-apiserver-7d8f7d4b54-d5jhq\" (UID: \"d3cfc81b-9d51-4ae0-897e-a4defc4863b6\") " pod="calico-apiserver/calico-apiserver-7d8f7d4b54-d5jhq" Sep 12 22:11:43.831283 kubelet[2675]: I0912 22:11:43.830513 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/622f24f0-2d6c-4a85-ba15-46c0870bbb8c-goldmane-key-pair\") pod \"goldmane-54d579b49d-sh7cj\" (UID: \"622f24f0-2d6c-4a85-ba15-46c0870bbb8c\") " pod="calico-system/goldmane-54d579b49d-sh7cj" Sep 12 22:11:43.831283 
kubelet[2675]: I0912 22:11:43.830536 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/622f24f0-2d6c-4a85-ba15-46c0870bbb8c-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-sh7cj\" (UID: \"622f24f0-2d6c-4a85-ba15-46c0870bbb8c\") " pod="calico-system/goldmane-54d579b49d-sh7cj" Sep 12 22:11:43.831283 kubelet[2675]: I0912 22:11:43.830558 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d3cfc81b-9d51-4ae0-897e-a4defc4863b6-calico-apiserver-certs\") pod \"calico-apiserver-7d8f7d4b54-d5jhq\" (UID: \"d3cfc81b-9d51-4ae0-897e-a4defc4863b6\") " pod="calico-apiserver/calico-apiserver-7d8f7d4b54-d5jhq" Sep 12 22:11:43.859061 containerd[1538]: time="2025-09-12T22:11:43.859013287Z" level=error msg="Failed to destroy network for sandbox \"caf10066d3fd1637421439b8767cfde95eb6e86938f260fa8faf6f9483565d16\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:43.860678 containerd[1538]: time="2025-09-12T22:11:43.860643209Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xhvwr,Uid:2f3109b1-2f5e-4b92-ad32-113a5ade0713,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"caf10066d3fd1637421439b8767cfde95eb6e86938f260fa8faf6f9483565d16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:43.862948 kubelet[2675]: E0912 22:11:43.862883 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"caf10066d3fd1637421439b8767cfde95eb6e86938f260fa8faf6f9483565d16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:43.863031 kubelet[2675]: E0912 22:11:43.862985 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"caf10066d3fd1637421439b8767cfde95eb6e86938f260fa8faf6f9483565d16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xhvwr" Sep 12 22:11:43.863031 kubelet[2675]: E0912 22:11:43.863025 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"caf10066d3fd1637421439b8767cfde95eb6e86938f260fa8faf6f9483565d16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xhvwr" Sep 12 22:11:43.863363 kubelet[2675]: E0912 22:11:43.863316 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-xhvwr_calico-system(2f3109b1-2f5e-4b92-ad32-113a5ade0713)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-xhvwr_calico-system(2f3109b1-2f5e-4b92-ad32-113a5ade0713)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"caf10066d3fd1637421439b8767cfde95eb6e86938f260fa8faf6f9483565d16\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-xhvwr" podUID="2f3109b1-2f5e-4b92-ad32-113a5ade0713" Sep 12 22:11:43.927037 systemd[1]: run-netns-cni\x2daf795b42\x2dcfe5\x2dfe9d\x2d0124\x2ddb9fbeba1910.mount: Deactivated successfully. Sep 12 22:11:44.043226 containerd[1538]: time="2025-09-12T22:11:44.043173540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-5bs6c,Uid:72d93281-1a6a-4911-bbcd-e5a207b001c1,Namespace:kube-system,Attempt:0,}" Sep 12 22:11:44.062938 containerd[1538]: time="2025-09-12T22:11:44.062128021Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f7d4b54-d5jhq,Uid:d3cfc81b-9d51-4ae0-897e-a4defc4863b6,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:11:44.072082 containerd[1538]: time="2025-09-12T22:11:44.072043594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58b5c55b9-8kvns,Uid:cb85746f-646b-4dcd-a803-9ab1edd631aa,Namespace:calico-system,Attempt:0,}" Sep 12 22:11:44.090676 containerd[1538]: time="2025-09-12T22:11:44.090569844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f7d4b54-ml47p,Uid:f1c7757e-adcd-466f-b82d-0cf29ffde430,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:11:44.105522 containerd[1538]: time="2025-09-12T22:11:44.105471569Z" level=error msg="Failed to destroy network for sandbox \"f2a08bdf326266c9caa0a3984c1a9e028a0ec804e54c12a490fd15a449b2593c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.106459 containerd[1538]: time="2025-09-12T22:11:44.106430568Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-57c949c5dd-nphhn,Uid:442153cd-92fc-4680-bea8-90fcd4c79242,Namespace:calico-system,Attempt:0,}" Sep 12 22:11:44.107072 containerd[1538]: time="2025-09-12T22:11:44.107009625Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-sh7cj,Uid:622f24f0-2d6c-4a85-ba15-46c0870bbb8c,Namespace:calico-system,Attempt:0,}" Sep 12 22:11:44.111658 containerd[1538]: time="2025-09-12T22:11:44.111415200Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-5bs6c,Uid:72d93281-1a6a-4911-bbcd-e5a207b001c1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2a08bdf326266c9caa0a3984c1a9e028a0ec804e54c12a490fd15a449b2593c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.112056 kubelet[2675]: E0912 22:11:44.112009 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2a08bdf326266c9caa0a3984c1a9e028a0ec804e54c12a490fd15a449b2593c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.112135 
kubelet[2675]: E0912 22:11:44.112075 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2a08bdf326266c9caa0a3984c1a9e028a0ec804e54c12a490fd15a449b2593c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-5bs6c" Sep 12 22:11:44.112181 kubelet[2675]: E0912 22:11:44.112107 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2a08bdf326266c9caa0a3984c1a9e028a0ec804e54c12a490fd15a449b2593c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-5bs6c" Sep 12 22:11:44.112209 kubelet[2675]: E0912 22:11:44.112175 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-5bs6c_kube-system(72d93281-1a6a-4911-bbcd-e5a207b001c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-5bs6c_kube-system(72d93281-1a6a-4911-bbcd-e5a207b001c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f2a08bdf326266c9caa0a3984c1a9e028a0ec804e54c12a490fd15a449b2593c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-5bs6c" podUID="72d93281-1a6a-4911-bbcd-e5a207b001c1" Sep 12 22:11:44.112312 containerd[1538]: time="2025-09-12T22:11:44.112136320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j667g,Uid:632413b7-a3b7-4949-8739-aa313e428d16,Namespace:kube-system,Attempt:0,}" Sep 12 22:11:44.143112 containerd[1538]: time="2025-09-12T22:11:44.143068518Z" level=error msg="Failed to destroy network for sandbox \"9e4f7d859922f96f6feca0ede39d06e87b0b2e1c803f35dda4ad418a4564316d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.144254 containerd[1538]: time="2025-09-12T22:11:44.144215349Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f7d4b54-d5jhq,Uid:d3cfc81b-9d51-4ae0-897e-a4defc4863b6,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e4f7d859922f96f6feca0ede39d06e87b0b2e1c803f35dda4ad418a4564316d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.144785 kubelet[2675]: E0912 22:11:44.144734 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e4f7d859922f96f6feca0ede39d06e87b0b2e1c803f35dda4ad418a4564316d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.144881 kubelet[2675]: E0912 22:11:44.144804 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"9e4f7d859922f96f6feca0ede39d06e87b0b2e1c803f35dda4ad418a4564316d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d8f7d4b54-d5jhq" Sep 12 22:11:44.144881 kubelet[2675]: E0912 22:11:44.144824 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e4f7d859922f96f6feca0ede39d06e87b0b2e1c803f35dda4ad418a4564316d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d8f7d4b54-d5jhq" Sep 12 22:11:44.144881 kubelet[2675]: E0912 22:11:44.144863 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d8f7d4b54-d5jhq_calico-apiserver(d3cfc81b-9d51-4ae0-897e-a4defc4863b6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d8f7d4b54-d5jhq_calico-apiserver(d3cfc81b-9d51-4ae0-897e-a4defc4863b6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9e4f7d859922f96f6feca0ede39d06e87b0b2e1c803f35dda4ad418a4564316d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d8f7d4b54-d5jhq" podUID="d3cfc81b-9d51-4ae0-897e-a4defc4863b6" Sep 12 22:11:44.165901 containerd[1538]: time="2025-09-12T22:11:44.165854758Z" level=error msg="Failed to destroy network for sandbox \"a7d5fb7c0a601578dcd32ff1a04af09a6b69fe8d7edf3c3aa30b085f40fa7223\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.167004 containerd[1538]: time="2025-09-12T22:11:44.166955701Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58b5c55b9-8kvns,Uid:cb85746f-646b-4dcd-a803-9ab1edd631aa,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7d5fb7c0a601578dcd32ff1a04af09a6b69fe8d7edf3c3aa30b085f40fa7223\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.167284 kubelet[2675]: E0912 22:11:44.167241 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7d5fb7c0a601578dcd32ff1a04af09a6b69fe8d7edf3c3aa30b085f40fa7223\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.167345 kubelet[2675]: E0912 22:11:44.167304 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7d5fb7c0a601578dcd32ff1a04af09a6b69fe8d7edf3c3aa30b085f40fa7223\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58b5c55b9-8kvns" Sep 12 22:11:44.167345 kubelet[2675]: E0912 22:11:44.167323 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7d5fb7c0a601578dcd32ff1a04af09a6b69fe8d7edf3c3aa30b085f40fa7223\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58b5c55b9-8kvns" Sep 12 22:11:44.167696 kubelet[2675]: E0912 22:11:44.167434 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-58b5c55b9-8kvns_calico-system(cb85746f-646b-4dcd-a803-9ab1edd631aa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-58b5c55b9-8kvns_calico-system(cb85746f-646b-4dcd-a803-9ab1edd631aa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a7d5fb7c0a601578dcd32ff1a04af09a6b69fe8d7edf3c3aa30b085f40fa7223\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-58b5c55b9-8kvns" podUID="cb85746f-646b-4dcd-a803-9ab1edd631aa" Sep 12 22:11:44.186894 containerd[1538]: time="2025-09-12T22:11:44.186768125Z" level=error msg="Failed to destroy network for sandbox \"199ac5ef756397fbe07e0018831c5f0737ea81a36ef5ed73b1db7a4fb37bf9f0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.189718 containerd[1538]: time="2025-09-12T22:11:44.189667528Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f7d4b54-ml47p,Uid:f1c7757e-adcd-466f-b82d-0cf29ffde430,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"199ac5ef756397fbe07e0018831c5f0737ea81a36ef5ed73b1db7a4fb37bf9f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.190442 kubelet[2675]: E0912 22:11:44.189934 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"199ac5ef756397fbe07e0018831c5f0737ea81a36ef5ed73b1db7a4fb37bf9f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.190442 kubelet[2675]: E0912 22:11:44.190010 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"199ac5ef756397fbe07e0018831c5f0737ea81a36ef5ed73b1db7a4fb37bf9f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d8f7d4b54-ml47p" Sep 12 22:11:44.190442 kubelet[2675]: E0912 22:11:44.190030 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"199ac5ef756397fbe07e0018831c5f0737ea81a36ef5ed73b1db7a4fb37bf9f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d8f7d4b54-ml47p" Sep 12 22:11:44.190571 kubelet[2675]: E0912 22:11:44.190078 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d8f7d4b54-ml47p_calico-apiserver(f1c7757e-adcd-466f-b82d-0cf29ffde430)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d8f7d4b54-ml47p_calico-apiserver(f1c7757e-adcd-466f-b82d-0cf29ffde430)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"199ac5ef756397fbe07e0018831c5f0737ea81a36ef5ed73b1db7a4fb37bf9f0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d8f7d4b54-ml47p" podUID="f1c7757e-adcd-466f-b82d-0cf29ffde430" Sep 12 22:11:44.198284 containerd[1538]: time="2025-09-12T22:11:44.198230516Z" level=error msg="Failed to destroy network for sandbox \"576a89de155e65a22617ed0bd5bc5ea424227e626a7395a595dc7a3810cce67e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.198654 containerd[1538]: time="2025-09-12T22:11:44.198623742Z" level=error msg="Failed to destroy network for sandbox \"9b28955da681627239dea1a52089b58130ec0799c6eefe466b149ac1135e73b0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.199636 containerd[1538]: time="2025-09-12T22:11:44.199562618Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-sh7cj,Uid:622f24f0-2d6c-4a85-ba15-46c0870bbb8c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"576a89de155e65a22617ed0bd5bc5ea424227e626a7395a595dc7a3810cce67e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.199799 kubelet[2675]: E0912 22:11:44.199754 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"576a89de155e65a22617ed0bd5bc5ea424227e626a7395a595dc7a3810cce67e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.199872 kubelet[2675]: E0912 22:11:44.199819 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"576a89de155e65a22617ed0bd5bc5ea424227e626a7395a595dc7a3810cce67e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-sh7cj" Sep 12 22:11:44.199872 kubelet[2675]: E0912 22:11:44.199852 2675 
kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"576a89de155e65a22617ed0bd5bc5ea424227e626a7395a595dc7a3810cce67e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-sh7cj" Sep 12 22:11:44.201498 kubelet[2675]: E0912 22:11:44.199899 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-sh7cj_calico-system(622f24f0-2d6c-4a85-ba15-46c0870bbb8c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-sh7cj_calico-system(622f24f0-2d6c-4a85-ba15-46c0870bbb8c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"576a89de155e65a22617ed0bd5bc5ea424227e626a7395a595dc7a3810cce67e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-sh7cj" podUID="622f24f0-2d6c-4a85-ba15-46c0870bbb8c" Sep 12 22:11:44.201933 containerd[1538]: time="2025-09-12T22:11:44.201870363Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-57c949c5dd-nphhn,Uid:442153cd-92fc-4680-bea8-90fcd4c79242,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b28955da681627239dea1a52089b58130ec0799c6eefe466b149ac1135e73b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.202148 kubelet[2675]: E0912 22:11:44.202119 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b28955da681627239dea1a52089b58130ec0799c6eefe466b149ac1135e73b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.202190 kubelet[2675]: E0912 22:11:44.202166 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b28955da681627239dea1a52089b58130ec0799c6eefe466b149ac1135e73b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-57c949c5dd-nphhn" Sep 12 22:11:44.202214 kubelet[2675]: E0912 22:11:44.202185 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b28955da681627239dea1a52089b58130ec0799c6eefe466b149ac1135e73b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-57c949c5dd-nphhn" Sep 12 22:11:44.202248 kubelet[2675]: E0912 22:11:44.202226 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-57c949c5dd-nphhn_calico-system(442153cd-92fc-4680-bea8-90fcd4c79242)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"whisker-57c949c5dd-nphhn_calico-system(442153cd-92fc-4680-bea8-90fcd4c79242)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9b28955da681627239dea1a52089b58130ec0799c6eefe466b149ac1135e73b0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-57c949c5dd-nphhn" podUID="442153cd-92fc-4680-bea8-90fcd4c79242" Sep 12 22:11:44.210222 containerd[1538]: time="2025-09-12T22:11:44.210181989Z" level=error msg="Failed to destroy network for sandbox \"80db45dffb759b7b6817393da1f070b196a5ae3b35a93af0c4949382373750d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.211311 containerd[1538]: time="2025-09-12T22:11:44.211212241Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j667g,Uid:632413b7-a3b7-4949-8739-aa313e428d16,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"80db45dffb759b7b6817393da1f070b196a5ae3b35a93af0c4949382373750d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.211464 kubelet[2675]: E0912 22:11:44.211413 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80db45dffb759b7b6817393da1f070b196a5ae3b35a93af0c4949382373750d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:11:44.211509 kubelet[2675]: E0912 22:11:44.211467 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80db45dffb759b7b6817393da1f070b196a5ae3b35a93af0c4949382373750d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-j667g" Sep 12 22:11:44.211509 kubelet[2675]: E0912 22:11:44.211499 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80db45dffb759b7b6817393da1f070b196a5ae3b35a93af0c4949382373750d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-j667g" Sep 12 22:11:44.211577 kubelet[2675]: E0912 22:11:44.211545 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-j667g_kube-system(632413b7-a3b7-4949-8739-aa313e428d16)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-j667g_kube-system(632413b7-a3b7-4949-8739-aa313e428d16)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"80db45dffb759b7b6817393da1f070b196a5ae3b35a93af0c4949382373750d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-j667g" podUID="632413b7-a3b7-4949-8739-aa313e428d16" Sep 12 22:11:47.752422 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3815346998.mount: Deactivated successfully. Sep 12 22:11:48.088717 containerd[1538]: time="2025-09-12T22:11:48.088384218Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:48.111232 containerd[1538]: time="2025-09-12T22:11:48.089452732Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 12 22:11:48.111232 containerd[1538]: time="2025-09-12T22:11:48.090385987Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:48.111405 containerd[1538]: time="2025-09-12T22:11:48.092878386Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.330472132s" Sep 12 22:11:48.111405 containerd[1538]: time="2025-09-12T22:11:48.111313681Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 12 22:11:48.112068 containerd[1538]: time="2025-09-12T22:11:48.112039585Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:48.129328 containerd[1538]: time="2025-09-12T22:11:48.129274948Z" level=info msg="CreateContainer within sandbox \"2045f48ed5ac62ff779a01aae4f63f0f8dc1ca135c2cab2096338430372c987b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 22:11:48.164965 containerd[1538]: time="2025-09-12T22:11:48.164105804Z" level=info msg="Container c22941d0a6b8552fbfdeab57bfa5062f2e86d54134300b92a918eafbac6d4152: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:48.166664 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4215001856.mount: Deactivated successfully. Sep 12 22:11:48.173307 containerd[1538]: time="2025-09-12T22:11:48.173256202Z" level=info msg="CreateContainer within sandbox \"2045f48ed5ac62ff779a01aae4f63f0f8dc1ca135c2cab2096338430372c987b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c22941d0a6b8552fbfdeab57bfa5062f2e86d54134300b92a918eafbac6d4152\"" Sep 12 22:11:48.174704 containerd[1538]: time="2025-09-12T22:11:48.174651923Z" level=info msg="StartContainer for \"c22941d0a6b8552fbfdeab57bfa5062f2e86d54134300b92a918eafbac6d4152\"" Sep 12 22:11:48.176507 containerd[1538]: time="2025-09-12T22:11:48.176471305Z" level=info msg="connecting to shim c22941d0a6b8552fbfdeab57bfa5062f2e86d54134300b92a918eafbac6d4152" address="unix:///run/containerd/s/c3d35a362611aa91e8a40054945e010fc4cce256bbca298288d9c0759f66cc66" protocol=ttrpc version=3 Sep 12 22:11:48.203147 systemd[1]: Started cri-containerd-c22941d0a6b8552fbfdeab57bfa5062f2e86d54134300b92a918eafbac6d4152.scope - libcontainer container c22941d0a6b8552fbfdeab57bfa5062f2e86d54134300b92a918eafbac6d4152. 
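Every sandbox failure logged above bottoms out in the same check: the Calico CNI plugin stats /var/lib/calico/nodename, a file that calico/node writes only once it is running with /var/lib/calico mounted from the host, which is exactly what the error text suggests verifying. With the calico/node:v3.30.3 image now pulled and its container created, those failures should stop recurring once the container is up. A minimal Go sketch of that kind of readiness check (the constant and function names are mine, not Calico's source):

```go
package main

import (
	"fmt"
	"os"
)

// nodenameFile is the path the failing stat in the journal refers to;
// calico/node writes the node's name here after it starts with
// /var/lib/calico mounted from the host.
const nodenameFile = "/var/lib/calico/nodename"

// nodename mirrors the check the CNI plugin's error message describes.
func nodename() (string, error) {
	if _, err := os.Stat(nodenameFile); os.IsNotExist(err) {
		return "", fmt.Errorf("stat %s: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/", nodenameFile)
	}
	b, err := os.ReadFile(nodenameFile)
	if err != nil {
		return "", err
	}
	return string(b), nil
}

func main() {
	name, err := nodename()
	if err != nil {
		fmt.Println("CNI ADD/DEL would fail here:", err)
		return
	}
	fmt.Println("node name:", name)
}
```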
Sep 12 22:11:48.243200 containerd[1538]: time="2025-09-12T22:11:48.243071257Z" level=info msg="StartContainer for \"c22941d0a6b8552fbfdeab57bfa5062f2e86d54134300b92a918eafbac6d4152\" returns successfully" Sep 12 22:11:48.367728 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 22:11:48.367872 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 12 22:11:48.564002 kubelet[2675]: I0912 22:11:48.563450 2675 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/442153cd-92fc-4680-bea8-90fcd4c79242-whisker-ca-bundle\") pod \"442153cd-92fc-4680-bea8-90fcd4c79242\" (UID: \"442153cd-92fc-4680-bea8-90fcd4c79242\") " Sep 12 22:11:48.564002 kubelet[2675]: I0912 22:11:48.563517 2675 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/442153cd-92fc-4680-bea8-90fcd4c79242-whisker-backend-key-pair\") pod \"442153cd-92fc-4680-bea8-90fcd4c79242\" (UID: \"442153cd-92fc-4680-bea8-90fcd4c79242\") " Sep 12 22:11:48.564002 kubelet[2675]: I0912 22:11:48.563563 2675 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jdx8\" (UniqueName: \"kubernetes.io/projected/442153cd-92fc-4680-bea8-90fcd4c79242-kube-api-access-4jdx8\") pod \"442153cd-92fc-4680-bea8-90fcd4c79242\" (UID: \"442153cd-92fc-4680-bea8-90fcd4c79242\") " Sep 12 22:11:48.571616 kubelet[2675]: I0912 22:11:48.571529 2675 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/442153cd-92fc-4680-bea8-90fcd4c79242-kube-api-access-4jdx8" (OuterVolumeSpecName: "kube-api-access-4jdx8") pod "442153cd-92fc-4680-bea8-90fcd4c79242" (UID: "442153cd-92fc-4680-bea8-90fcd4c79242"). InnerVolumeSpecName "kube-api-access-4jdx8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 22:11:48.572266 kubelet[2675]: I0912 22:11:48.572242 2675 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/442153cd-92fc-4680-bea8-90fcd4c79242-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "442153cd-92fc-4680-bea8-90fcd4c79242" (UID: "442153cd-92fc-4680-bea8-90fcd4c79242"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 22:11:48.572888 kubelet[2675]: I0912 22:11:48.572855 2675 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/442153cd-92fc-4680-bea8-90fcd4c79242-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "442153cd-92fc-4680-bea8-90fcd4c79242" (UID: "442153cd-92fc-4680-bea8-90fcd4c79242"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 22:11:48.660227 systemd[1]: Removed slice kubepods-besteffort-pod442153cd_92fc_4680_bea8_90fcd4c79242.slice - libcontainer container kubepods-besteffort-pod442153cd_92fc_4680_bea8_90fcd4c79242.slice. 
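The UnmountVolume/TearDown entries above, together with the mount units deactivated a few lines further down, expose kubelet's per-pod volume layout on disk: /var/lib/kubelet/pods/<podUID>/volumes/kubernetes.io~<plugin>/<volumeName>. A small helper that rebuilds those paths from the values in the log (the function name is mine; the UID and volume names are the whisker pod's from the entries above):

```go
package main

import (
	"fmt"
	"path/filepath"
)

// volumePath mirrors the per-pod volume directory layout visible in the
// journal's mount-unit names:
// /var/lib/kubelet/pods/<podUID>/volumes/kubernetes.io~<plugin>/<name>.
func volumePath(podUID, plugin, name string) string {
	return filepath.Join("/var/lib/kubelet/pods", podUID, "volumes", "kubernetes.io~"+plugin, name)
}

func main() {
	uid := "442153cd-92fc-4680-bea8-90fcd4c79242" // the whisker pod being torn down above
	fmt.Println(volumePath(uid, "secret", "whisker-backend-key-pair"))
	fmt.Println(volumePath(uid, "projected", "kube-api-access-4jdx8"))
	fmt.Println(volumePath(uid, "configmap", "whisker-ca-bundle"))
}
```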
Sep 12 22:11:48.664944 kubelet[2675]: I0912 22:11:48.664549 2675 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/442153cd-92fc-4680-bea8-90fcd4c79242-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 12 22:11:48.664944 kubelet[2675]: I0912 22:11:48.664579 2675 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4jdx8\" (UniqueName: \"kubernetes.io/projected/442153cd-92fc-4680-bea8-90fcd4c79242-kube-api-access-4jdx8\") on node \"localhost\" DevicePath \"\"" Sep 12 22:11:48.664944 kubelet[2675]: I0912 22:11:48.664589 2675 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/442153cd-92fc-4680-bea8-90fcd4c79242-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 12 22:11:48.751877 systemd[1]: var-lib-kubelet-pods-442153cd\x2d92fc\x2d4680\x2dbea8\x2d90fcd4c79242-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d4jdx8.mount: Deactivated successfully. Sep 12 22:11:48.751990 systemd[1]: var-lib-kubelet-pods-442153cd\x2d92fc\x2d4680\x2dbea8\x2d90fcd4c79242-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 22:11:48.820960 kubelet[2675]: I0912 22:11:48.820695 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-8d8ll" podStartSLOduration=2.078599962 podStartE2EDuration="13.820679929s" podCreationTimestamp="2025-09-12 22:11:35 +0000 UTC" firstStartedPulling="2025-09-12 22:11:36.373803772 +0000 UTC m=+19.827816775" lastFinishedPulling="2025-09-12 22:11:48.115883739 +0000 UTC m=+31.569896742" observedRunningTime="2025-09-12 22:11:48.799056974 +0000 UTC m=+32.253070017" watchObservedRunningTime="2025-09-12 22:11:48.820679929 +0000 UTC m=+32.274692932" Sep 12 22:11:48.880881 systemd[1]: Created slice kubepods-besteffort-pod8f3d13cb_3c65_4cb7_96c4_4918fd52d8c5.slice - libcontainer container kubepods-besteffort-pod8f3d13cb_3c65_4cb7_96c4_4918fd52d8c5.slice. 
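The pod_startup_latency_tracker line just above is internally consistent: podStartE2EDuration is the watch-observed running time minus the pod's creation timestamp, and podStartSLOduration is that figure with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted. A quick check of the arithmetic with the logged values (my reading of the tracker's bookkeeping, inferred from the numbers rather than from its source):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the pod_startup_latency_tracker entry above.
	created := time.Date(2025, 9, 12, 22, 11, 35, 0, time.UTC)
	firstPull := time.Date(2025, 9, 12, 22, 11, 36, 373803772, time.UTC)
	lastPull := time.Date(2025, 9, 12, 22, 11, 48, 115883739, time.UTC)
	watched := time.Date(2025, 9, 12, 22, 11, 48, 820679929, time.UTC)

	e2e := watched.Sub(created)          // 13.820679929s, matches podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 2.078599962s, matches podStartSLOduration

	fmt.Println("podStartE2EDuration:", e2e)
	fmt.Println("podStartSLOduration:", slo)
}
```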
Sep 12 22:11:48.927935 containerd[1538]: time="2025-09-12T22:11:48.927787915Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c22941d0a6b8552fbfdeab57bfa5062f2e86d54134300b92a918eafbac6d4152\" id:\"1b575622ba80fdc115745e5d325f6d9925dd4340f9a864eabf1efb1436090eb3\" pid:3829 exit_status:1 exited_at:{seconds:1757715108 nanos:927345731}" Sep 12 22:11:48.967187 kubelet[2675]: I0912 22:11:48.967140 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8f3d13cb-3c65-4cb7-96c4-4918fd52d8c5-whisker-backend-key-pair\") pod \"whisker-58449d5d87-fd7pm\" (UID: \"8f3d13cb-3c65-4cb7-96c4-4918fd52d8c5\") " pod="calico-system/whisker-58449d5d87-fd7pm" Sep 12 22:11:48.967187 kubelet[2675]: I0912 22:11:48.967186 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f3d13cb-3c65-4cb7-96c4-4918fd52d8c5-whisker-ca-bundle\") pod \"whisker-58449d5d87-fd7pm\" (UID: \"8f3d13cb-3c65-4cb7-96c4-4918fd52d8c5\") " pod="calico-system/whisker-58449d5d87-fd7pm" Sep 12 22:11:48.967370 kubelet[2675]: I0912 22:11:48.967214 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6rgf\" (UniqueName: \"kubernetes.io/projected/8f3d13cb-3c65-4cb7-96c4-4918fd52d8c5-kube-api-access-p6rgf\") pod \"whisker-58449d5d87-fd7pm\" (UID: \"8f3d13cb-3c65-4cb7-96c4-4918fd52d8c5\") " pod="calico-system/whisker-58449d5d87-fd7pm" Sep 12 22:11:49.185540 containerd[1538]: time="2025-09-12T22:11:49.185477257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58449d5d87-fd7pm,Uid:8f3d13cb-3c65-4cb7-96c4-4918fd52d8c5,Namespace:calico-system,Attempt:0,}" Sep 12 22:11:49.358501 systemd-networkd[1442]: cali3c2702f9f53: Link UP Sep 12 22:11:49.359951 systemd-networkd[1442]: cali3c2702f9f53: Gained carrier Sep 12 22:11:49.374531 containerd[1538]: 2025-09-12 22:11:49.211 [INFO][3844] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:11:49.374531 containerd[1538]: 2025-09-12 22:11:49.243 [INFO][3844] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--58449d5d87--fd7pm-eth0 whisker-58449d5d87- calico-system 8f3d13cb-3c65-4cb7-96c4-4918fd52d8c5 846 0 2025-09-12 22:11:48 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:58449d5d87 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-58449d5d87-fd7pm eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali3c2702f9f53 [] [] }} ContainerID="b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" Namespace="calico-system" Pod="whisker-58449d5d87-fd7pm" WorkloadEndpoint="localhost-k8s-whisker--58449d5d87--fd7pm-" Sep 12 22:11:49.374531 containerd[1538]: 2025-09-12 22:11:49.243 [INFO][3844] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" Namespace="calico-system" Pod="whisker-58449d5d87-fd7pm" WorkloadEndpoint="localhost-k8s-whisker--58449d5d87--fd7pm-eth0" Sep 12 22:11:49.374531 containerd[1538]: 2025-09-12 22:11:49.312 [INFO][3860] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" HandleID="k8s-pod-network.b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" Workload="localhost-k8s-whisker--58449d5d87--fd7pm-eth0" Sep 12 22:11:49.374766 containerd[1538]: 2025-09-12 22:11:49.312 [INFO][3860] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" HandleID="k8s-pod-network.b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" Workload="localhost-k8s-whisker--58449d5d87--fd7pm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c440), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-58449d5d87-fd7pm", "timestamp":"2025-09-12 22:11:49.312332156 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:11:49.374766 containerd[1538]: 2025-09-12 22:11:49.312 [INFO][3860] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:11:49.374766 containerd[1538]: 2025-09-12 22:11:49.312 [INFO][3860] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:11:49.374766 containerd[1538]: 2025-09-12 22:11:49.312 [INFO][3860] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:11:49.374766 containerd[1538]: 2025-09-12 22:11:49.324 [INFO][3860] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" host="localhost" Sep 12 22:11:49.374766 containerd[1538]: 2025-09-12 22:11:49.329 [INFO][3860] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:11:49.374766 containerd[1538]: 2025-09-12 22:11:49.334 [INFO][3860] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:11:49.374766 containerd[1538]: 2025-09-12 22:11:49.335 [INFO][3860] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:49.374766 containerd[1538]: 2025-09-12 22:11:49.338 [INFO][3860] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:49.374766 containerd[1538]: 2025-09-12 22:11:49.338 [INFO][3860] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" host="localhost" Sep 12 22:11:49.375066 containerd[1538]: 2025-09-12 22:11:49.339 [INFO][3860] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf Sep 12 22:11:49.375066 containerd[1538]: 2025-09-12 22:11:49.343 [INFO][3860] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" host="localhost" Sep 12 22:11:49.375066 containerd[1538]: 2025-09-12 22:11:49.348 [INFO][3860] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" host="localhost" Sep 12 22:11:49.375066 containerd[1538]: 2025-09-12 22:11:49.348 [INFO][3860] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] 
handle="k8s-pod-network.b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" host="localhost" Sep 12 22:11:49.375066 containerd[1538]: 2025-09-12 22:11:49.348 [INFO][3860] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:11:49.375066 containerd[1538]: 2025-09-12 22:11:49.348 [INFO][3860] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" HandleID="k8s-pod-network.b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" Workload="localhost-k8s-whisker--58449d5d87--fd7pm-eth0" Sep 12 22:11:49.375180 containerd[1538]: 2025-09-12 22:11:49.350 [INFO][3844] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" Namespace="calico-system" Pod="whisker-58449d5d87-fd7pm" WorkloadEndpoint="localhost-k8s-whisker--58449d5d87--fd7pm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--58449d5d87--fd7pm-eth0", GenerateName:"whisker-58449d5d87-", Namespace:"calico-system", SelfLink:"", UID:"8f3d13cb-3c65-4cb7-96c4-4918fd52d8c5", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58449d5d87", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-58449d5d87-fd7pm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3c2702f9f53", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:49.375180 containerd[1538]: 2025-09-12 22:11:49.351 [INFO][3844] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" Namespace="calico-system" Pod="whisker-58449d5d87-fd7pm" WorkloadEndpoint="localhost-k8s-whisker--58449d5d87--fd7pm-eth0" Sep 12 22:11:49.375248 containerd[1538]: 2025-09-12 22:11:49.351 [INFO][3844] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3c2702f9f53 ContainerID="b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" Namespace="calico-system" Pod="whisker-58449d5d87-fd7pm" WorkloadEndpoint="localhost-k8s-whisker--58449d5d87--fd7pm-eth0" Sep 12 22:11:49.375248 containerd[1538]: 2025-09-12 22:11:49.359 [INFO][3844] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" Namespace="calico-system" Pod="whisker-58449d5d87-fd7pm" WorkloadEndpoint="localhost-k8s-whisker--58449d5d87--fd7pm-eth0" Sep 12 22:11:49.375287 containerd[1538]: 2025-09-12 22:11:49.360 [INFO][3844] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" Namespace="calico-system" Pod="whisker-58449d5d87-fd7pm" WorkloadEndpoint="localhost-k8s-whisker--58449d5d87--fd7pm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--58449d5d87--fd7pm-eth0", GenerateName:"whisker-58449d5d87-", Namespace:"calico-system", SelfLink:"", UID:"8f3d13cb-3c65-4cb7-96c4-4918fd52d8c5", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58449d5d87", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf", Pod:"whisker-58449d5d87-fd7pm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3c2702f9f53", MAC:"2e:a4:37:c6:6e:ce", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:49.375331 containerd[1538]: 2025-09-12 22:11:49.371 [INFO][3844] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" Namespace="calico-system" Pod="whisker-58449d5d87-fd7pm" WorkloadEndpoint="localhost-k8s-whisker--58449d5d87--fd7pm-eth0" Sep 12 22:11:49.425933 containerd[1538]: time="2025-09-12T22:11:49.425877922Z" level=info msg="connecting to shim b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf" address="unix:///run/containerd/s/3403d2efbe70f5e74577bb8305d27dc8123c5d760afcc1170e302f71bf6a1cac" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:49.450090 systemd[1]: Started cri-containerd-b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf.scope - libcontainer container b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf. 
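The TaskExit events in this stretch of the log carry protobuf-style timestamps, e.g. exited_at:{seconds:1757715108 nanos:927345731} for pid 3829 above; decoding one gives the same instant as the surrounding journal entries (22:11:48.927 UTC on Sep 12 2025). For reference:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// seconds/nanos copied from the TaskExit event above.
	exitedAt := time.Unix(1757715108, 927345731).UTC()
	fmt.Println(exitedAt.Format(time.RFC3339Nano)) // 2025-09-12T22:11:48.927345731Z
}
```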
Sep 12 22:11:49.460924 systemd-resolved[1355]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:11:49.506853 containerd[1538]: time="2025-09-12T22:11:49.506797186Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58449d5d87-fd7pm,Uid:8f3d13cb-3c65-4cb7-96c4-4918fd52d8c5,Namespace:calico-system,Attempt:0,} returns sandbox id \"b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf\"" Sep 12 22:11:49.521895 containerd[1538]: time="2025-09-12T22:11:49.521857483Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 22:11:49.939063 containerd[1538]: time="2025-09-12T22:11:49.939020154Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c22941d0a6b8552fbfdeab57bfa5062f2e86d54134300b92a918eafbac6d4152\" id:\"5f1a102a7737f51616ab30ef982a39f8ef47a5f9dbfb86831f7d17bc601afc7e\" pid:4027 exit_status:1 exited_at:{seconds:1757715109 nanos:938716431}" Sep 12 22:11:50.510869 containerd[1538]: time="2025-09-12T22:11:50.510808684Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:50.511527 containerd[1538]: time="2025-09-12T22:11:50.511496657Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 12 22:11:50.511999 containerd[1538]: time="2025-09-12T22:11:50.511973641Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:50.514224 containerd[1538]: time="2025-09-12T22:11:50.514194660Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:50.515115 containerd[1538]: time="2025-09-12T22:11:50.515071499Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 993.175571ms" Sep 12 22:11:50.515115 containerd[1538]: time="2025-09-12T22:11:50.515109584Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 12 22:11:50.517715 containerd[1538]: time="2025-09-12T22:11:50.517581597Z" level=info msg="CreateContainer within sandbox \"b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 22:11:50.529930 containerd[1538]: time="2025-09-12T22:11:50.529883133Z" level=info msg="Container 08632ef8a9621d23e92361c624e51b9d98fa1d9eead01eff165fae4a75f186ec: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:50.541363 containerd[1538]: time="2025-09-12T22:11:50.541231302Z" level=info msg="CreateContainer within sandbox \"b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"08632ef8a9621d23e92361c624e51b9d98fa1d9eead01eff165fae4a75f186ec\"" Sep 12 22:11:50.542959 containerd[1538]: time="2025-09-12T22:11:50.541807899Z" level=info msg="StartContainer for 
\"08632ef8a9621d23e92361c624e51b9d98fa1d9eead01eff165fae4a75f186ec\"" Sep 12 22:11:50.543073 containerd[1538]: time="2025-09-12T22:11:50.543048427Z" level=info msg="connecting to shim 08632ef8a9621d23e92361c624e51b9d98fa1d9eead01eff165fae4a75f186ec" address="unix:///run/containerd/s/3403d2efbe70f5e74577bb8305d27dc8123c5d760afcc1170e302f71bf6a1cac" protocol=ttrpc version=3 Sep 12 22:11:50.577132 systemd[1]: Started cri-containerd-08632ef8a9621d23e92361c624e51b9d98fa1d9eead01eff165fae4a75f186ec.scope - libcontainer container 08632ef8a9621d23e92361c624e51b9d98fa1d9eead01eff165fae4a75f186ec. Sep 12 22:11:50.613535 containerd[1538]: time="2025-09-12T22:11:50.613487074Z" level=info msg="StartContainer for \"08632ef8a9621d23e92361c624e51b9d98fa1d9eead01eff165fae4a75f186ec\" returns successfully" Sep 12 22:11:50.614799 containerd[1538]: time="2025-09-12T22:11:50.614750444Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 22:11:50.652508 kubelet[2675]: I0912 22:11:50.652449 2675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="442153cd-92fc-4680-bea8-90fcd4c79242" path="/var/lib/kubelet/pods/442153cd-92fc-4680-bea8-90fcd4c79242/volumes" Sep 12 22:11:50.750089 systemd-networkd[1442]: cali3c2702f9f53: Gained IPv6LL Sep 12 22:11:52.322984 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2192054499.mount: Deactivated successfully. Sep 12 22:11:52.336382 containerd[1538]: time="2025-09-12T22:11:52.336339562Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:52.337374 containerd[1538]: time="2025-09-12T22:11:52.337257278Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 12 22:11:52.338127 containerd[1538]: time="2025-09-12T22:11:52.338093824Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:52.340546 containerd[1538]: time="2025-09-12T22:11:52.340520851Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:52.341453 containerd[1538]: time="2025-09-12T22:11:52.341133208Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.726331957s" Sep 12 22:11:52.341453 containerd[1538]: time="2025-09-12T22:11:52.341181054Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 12 22:11:52.343405 containerd[1538]: time="2025-09-12T22:11:52.343376012Z" level=info msg="CreateContainer within sandbox \"b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 22:11:52.397588 containerd[1538]: time="2025-09-12T22:11:52.389220850Z" level=info msg="Container 6390d601fc0ca72e7dd3358b4cb0c49bdf160230cab67862e2716cd10c02e791: CDI 
devices from CRI Config.CDIDevices: []" Sep 12 22:11:52.403883 containerd[1538]: time="2025-09-12T22:11:52.403830618Z" level=info msg="CreateContainer within sandbox \"b860c6b40b06c63ef1157b39e5d9aedcb9949aef146eede9c025cbec07d3e7bf\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"6390d601fc0ca72e7dd3358b4cb0c49bdf160230cab67862e2716cd10c02e791\"" Sep 12 22:11:52.404812 containerd[1538]: time="2025-09-12T22:11:52.404676525Z" level=info msg="StartContainer for \"6390d601fc0ca72e7dd3358b4cb0c49bdf160230cab67862e2716cd10c02e791\"" Sep 12 22:11:52.408162 containerd[1538]: time="2025-09-12T22:11:52.408037230Z" level=info msg="connecting to shim 6390d601fc0ca72e7dd3358b4cb0c49bdf160230cab67862e2716cd10c02e791" address="unix:///run/containerd/s/3403d2efbe70f5e74577bb8305d27dc8123c5d760afcc1170e302f71bf6a1cac" protocol=ttrpc version=3 Sep 12 22:11:52.434080 systemd[1]: Started cri-containerd-6390d601fc0ca72e7dd3358b4cb0c49bdf160230cab67862e2716cd10c02e791.scope - libcontainer container 6390d601fc0ca72e7dd3358b4cb0c49bdf160230cab67862e2716cd10c02e791. Sep 12 22:11:52.471946 containerd[1538]: time="2025-09-12T22:11:52.471896906Z" level=info msg="StartContainer for \"6390d601fc0ca72e7dd3358b4cb0c49bdf160230cab67862e2716cd10c02e791\" returns successfully" Sep 12 22:11:52.814059 kubelet[2675]: I0912 22:11:52.813987 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-58449d5d87-fd7pm" podStartSLOduration=1.9935164890000001 podStartE2EDuration="4.813967609s" podCreationTimestamp="2025-09-12 22:11:48 +0000 UTC" firstStartedPulling="2025-09-12 22:11:49.52140786 +0000 UTC m=+32.975420863" lastFinishedPulling="2025-09-12 22:11:52.34185898 +0000 UTC m=+35.795871983" observedRunningTime="2025-09-12 22:11:52.812828025 +0000 UTC m=+36.266841028" watchObservedRunningTime="2025-09-12 22:11:52.813967609 +0000 UTC m=+36.267980612" Sep 12 22:11:53.599523 kubelet[2675]: I0912 22:11:53.599471 2675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:11:54.280400 systemd-networkd[1442]: vxlan.calico: Link UP Sep 12 22:11:54.280416 systemd-networkd[1442]: vxlan.calico: Gained carrier Sep 12 22:11:54.650346 containerd[1538]: time="2025-09-12T22:11:54.650282091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xhvwr,Uid:2f3109b1-2f5e-4b92-ad32-113a5ade0713,Namespace:calico-system,Attempt:0,}" Sep 12 22:11:54.778138 systemd-networkd[1442]: cali812a276d43d: Link UP Sep 12 22:11:54.778718 systemd-networkd[1442]: cali812a276d43d: Gained carrier Sep 12 22:11:54.794241 containerd[1538]: 2025-09-12 22:11:54.713 [INFO][4341] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--xhvwr-eth0 csi-node-driver- calico-system 2f3109b1-2f5e-4b92-ad32-113a5ade0713 641 0 2025-09-12 22:11:36 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-xhvwr eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali812a276d43d [] [] }} ContainerID="47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" Namespace="calico-system" Pod="csi-node-driver-xhvwr" WorkloadEndpoint="localhost-k8s-csi--node--driver--xhvwr-" Sep 12 
22:11:54.794241 containerd[1538]: 2025-09-12 22:11:54.715 [INFO][4341] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" Namespace="calico-system" Pod="csi-node-driver-xhvwr" WorkloadEndpoint="localhost-k8s-csi--node--driver--xhvwr-eth0" Sep 12 22:11:54.794241 containerd[1538]: 2025-09-12 22:11:54.741 [INFO][4355] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" HandleID="k8s-pod-network.47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" Workload="localhost-k8s-csi--node--driver--xhvwr-eth0" Sep 12 22:11:54.794463 containerd[1538]: 2025-09-12 22:11:54.741 [INFO][4355] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" HandleID="k8s-pod-network.47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" Workload="localhost-k8s-csi--node--driver--xhvwr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000322140), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-xhvwr", "timestamp":"2025-09-12 22:11:54.741706794 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:11:54.794463 containerd[1538]: 2025-09-12 22:11:54.741 [INFO][4355] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:11:54.794463 containerd[1538]: 2025-09-12 22:11:54.742 [INFO][4355] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:11:54.794463 containerd[1538]: 2025-09-12 22:11:54.742 [INFO][4355] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:11:54.794463 containerd[1538]: 2025-09-12 22:11:54.751 [INFO][4355] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" host="localhost" Sep 12 22:11:54.794463 containerd[1538]: 2025-09-12 22:11:54.755 [INFO][4355] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:11:54.794463 containerd[1538]: 2025-09-12 22:11:54.759 [INFO][4355] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:11:54.794463 containerd[1538]: 2025-09-12 22:11:54.761 [INFO][4355] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:54.794463 containerd[1538]: 2025-09-12 22:11:54.763 [INFO][4355] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:54.794463 containerd[1538]: 2025-09-12 22:11:54.763 [INFO][4355] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" host="localhost" Sep 12 22:11:54.795593 containerd[1538]: 2025-09-12 22:11:54.764 [INFO][4355] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0 Sep 12 22:11:54.795593 containerd[1538]: 2025-09-12 22:11:54.768 [INFO][4355] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" host="localhost" Sep 12 22:11:54.795593 containerd[1538]: 2025-09-12 22:11:54.773 [INFO][4355] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" host="localhost" Sep 12 22:11:54.795593 containerd[1538]: 2025-09-12 22:11:54.773 [INFO][4355] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" host="localhost" Sep 12 22:11:54.795593 containerd[1538]: 2025-09-12 22:11:54.773 [INFO][4355] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
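Every IPAM exchange in this log lands on the same affinity block, 192.168.88.128/26, and claims the next free host address in it: .129 for the whisker pod earlier, .130 for csi-node-driver-xhvwr here, and .131 for calico-kube-controllers further down. A /26 holds 64 addresses. A small sketch of that bookkeeping (illustrative only, not Calico's allocator):

```go
package main

import (
	"fmt"
	"net"
)

func main() {
	_, block, _ := net.ParseCIDR("192.168.88.128/26") // the affinity block from the log
	ones, bits := block.Mask.Size()
	fmt.Printf("block %s holds %d addresses\n", block, 1<<(bits-ones)) // 64

	// Hand out the first few addresses after the network address, mimicking
	// the .129/.130/.131 sequence seen in the journal.
	ip := make(net.IP, len(block.IP))
	copy(ip, block.IP)
	for i := 0; i < 3; i++ {
		ip[len(ip)-1]++ // safe here: the block does not cross an octet boundary
		fmt.Printf("assign %s/26\n", ip)
	}
}
```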
Sep 12 22:11:54.795593 containerd[1538]: 2025-09-12 22:11:54.773 [INFO][4355] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" HandleID="k8s-pod-network.47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" Workload="localhost-k8s-csi--node--driver--xhvwr-eth0" Sep 12 22:11:54.795860 containerd[1538]: 2025-09-12 22:11:54.775 [INFO][4341] cni-plugin/k8s.go 418: Populated endpoint ContainerID="47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" Namespace="calico-system" Pod="csi-node-driver-xhvwr" WorkloadEndpoint="localhost-k8s-csi--node--driver--xhvwr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--xhvwr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2f3109b1-2f5e-4b92-ad32-113a5ade0713", ResourceVersion:"641", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-xhvwr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali812a276d43d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:54.796039 containerd[1538]: 2025-09-12 22:11:54.775 [INFO][4341] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" Namespace="calico-system" Pod="csi-node-driver-xhvwr" WorkloadEndpoint="localhost-k8s-csi--node--driver--xhvwr-eth0" Sep 12 22:11:54.796039 containerd[1538]: 2025-09-12 22:11:54.775 [INFO][4341] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali812a276d43d ContainerID="47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" Namespace="calico-system" Pod="csi-node-driver-xhvwr" WorkloadEndpoint="localhost-k8s-csi--node--driver--xhvwr-eth0" Sep 12 22:11:54.796039 containerd[1538]: 2025-09-12 22:11:54.778 [INFO][4341] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" Namespace="calico-system" Pod="csi-node-driver-xhvwr" WorkloadEndpoint="localhost-k8s-csi--node--driver--xhvwr-eth0" Sep 12 22:11:54.796200 containerd[1538]: 2025-09-12 22:11:54.779 [INFO][4341] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" Namespace="calico-system" Pod="csi-node-driver-xhvwr" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--xhvwr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--xhvwr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2f3109b1-2f5e-4b92-ad32-113a5ade0713", ResourceVersion:"641", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0", Pod:"csi-node-driver-xhvwr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali812a276d43d", MAC:"b2:87:07:3b:6d:6a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:54.796286 containerd[1538]: 2025-09-12 22:11:54.791 [INFO][4341] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" Namespace="calico-system" Pod="csi-node-driver-xhvwr" WorkloadEndpoint="localhost-k8s-csi--node--driver--xhvwr-eth0" Sep 12 22:11:54.816102 containerd[1538]: time="2025-09-12T22:11:54.816056860Z" level=info msg="connecting to shim 47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0" address="unix:///run/containerd/s/c979d7e9afbfa1f32580e048b659a7a9b2034da6e6721116dced2560786ffcfe" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:54.847068 systemd[1]: Started cri-containerd-47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0.scope - libcontainer container 47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0. 
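Each "connecting to shim" line records a per-sandbox shim socket under /run/containerd/s/ and the wire protocol, ttrpc version 3. A minimal sketch of opening such a socket with containerd's ttrpc client library, using the csi-node-driver shim address from the line above (illustrative only; real task-service RPCs go through generated stubs layered on this client, which containerd wires up internally):

```go
package main

import (
	"log"
	"net"
	"strings"

	"github.com/containerd/ttrpc"
)

func main() {
	// Shim socket address copied from the journal; strip the unix:// scheme before dialing.
	addr := "unix:///run/containerd/s/c979d7e9afbfa1f32580e048b659a7a9b2034da6e6721116dced2560786ffcfe"
	conn, err := net.Dial("unix", strings.TrimPrefix(addr, "unix://"))
	if err != nil {
		log.Fatalf("dial shim socket: %v", err)
	}
	defer conn.Close()

	// A raw ttrpc client over the connection; task-service calls would be issued
	// through generated service clients built on top of it.
	client := ttrpc.NewClient(conn)
	defer client.Close()
	log.Println("connected to shim over ttrpc")
}
```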
Sep 12 22:11:54.857304 systemd-resolved[1355]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:11:54.869281 containerd[1538]: time="2025-09-12T22:11:54.869233202Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xhvwr,Uid:2f3109b1-2f5e-4b92-ad32-113a5ade0713,Namespace:calico-system,Attempt:0,} returns sandbox id \"47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0\"" Sep 12 22:11:54.870953 containerd[1538]: time="2025-09-12T22:11:54.870788707Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 22:11:55.650172 containerd[1538]: time="2025-09-12T22:11:55.650131125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58b5c55b9-8kvns,Uid:cb85746f-646b-4dcd-a803-9ab1edd631aa,Namespace:calico-system,Attempt:0,}" Sep 12 22:11:55.787385 systemd-networkd[1442]: calie2baf3d3aa2: Link UP Sep 12 22:11:55.788046 systemd-networkd[1442]: calie2baf3d3aa2: Gained carrier Sep 12 22:11:55.817410 containerd[1538]: 2025-09-12 22:11:55.700 [INFO][4422] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0 calico-kube-controllers-58b5c55b9- calico-system cb85746f-646b-4dcd-a803-9ab1edd631aa 783 0 2025-09-12 22:11:36 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:58b5c55b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-58b5c55b9-8kvns eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calie2baf3d3aa2 [] [] }} ContainerID="c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" Namespace="calico-system" Pod="calico-kube-controllers-58b5c55b9-8kvns" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-" Sep 12 22:11:55.817410 containerd[1538]: 2025-09-12 22:11:55.700 [INFO][4422] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" Namespace="calico-system" Pod="calico-kube-controllers-58b5c55b9-8kvns" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0" Sep 12 22:11:55.817410 containerd[1538]: 2025-09-12 22:11:55.728 [INFO][4437] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" HandleID="k8s-pod-network.c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" Workload="localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0" Sep 12 22:11:55.817799 containerd[1538]: 2025-09-12 22:11:55.728 [INFO][4437] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" HandleID="k8s-pod-network.c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" Workload="localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137740), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-58b5c55b9-8kvns", "timestamp":"2025-09-12 22:11:55.728043961 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:11:55.817799 containerd[1538]: 2025-09-12 22:11:55.728 [INFO][4437] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:11:55.817799 containerd[1538]: 2025-09-12 22:11:55.728 [INFO][4437] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:11:55.817799 containerd[1538]: 2025-09-12 22:11:55.728 [INFO][4437] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:11:55.817799 containerd[1538]: 2025-09-12 22:11:55.740 [INFO][4437] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" host="localhost" Sep 12 22:11:55.817799 containerd[1538]: 2025-09-12 22:11:55.747 [INFO][4437] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:11:55.817799 containerd[1538]: 2025-09-12 22:11:55.757 [INFO][4437] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:11:55.817799 containerd[1538]: 2025-09-12 22:11:55.760 [INFO][4437] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:55.817799 containerd[1538]: 2025-09-12 22:11:55.763 [INFO][4437] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:55.817799 containerd[1538]: 2025-09-12 22:11:55.763 [INFO][4437] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" host="localhost" Sep 12 22:11:55.818070 containerd[1538]: 2025-09-12 22:11:55.765 [INFO][4437] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a Sep 12 22:11:55.818070 containerd[1538]: 2025-09-12 22:11:55.770 [INFO][4437] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" host="localhost" Sep 12 22:11:55.818070 containerd[1538]: 2025-09-12 22:11:55.780 [INFO][4437] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" host="localhost" Sep 12 22:11:55.818070 containerd[1538]: 2025-09-12 22:11:55.780 [INFO][4437] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" host="localhost" Sep 12 22:11:55.818070 containerd[1538]: 2025-09-12 22:11:55.780 [INFO][4437] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:11:55.818070 containerd[1538]: 2025-09-12 22:11:55.781 [INFO][4437] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" HandleID="k8s-pod-network.c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" Workload="localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0" Sep 12 22:11:55.818185 containerd[1538]: 2025-09-12 22:11:55.785 [INFO][4422] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" Namespace="calico-system" Pod="calico-kube-controllers-58b5c55b9-8kvns" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0", GenerateName:"calico-kube-controllers-58b5c55b9-", Namespace:"calico-system", SelfLink:"", UID:"cb85746f-646b-4dcd-a803-9ab1edd631aa", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58b5c55b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-58b5c55b9-8kvns", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie2baf3d3aa2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:55.818238 containerd[1538]: 2025-09-12 22:11:55.785 [INFO][4422] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" Namespace="calico-system" Pod="calico-kube-controllers-58b5c55b9-8kvns" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0" Sep 12 22:11:55.818238 containerd[1538]: 2025-09-12 22:11:55.785 [INFO][4422] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie2baf3d3aa2 ContainerID="c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" Namespace="calico-system" Pod="calico-kube-controllers-58b5c55b9-8kvns" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0" Sep 12 22:11:55.818238 containerd[1538]: 2025-09-12 22:11:55.787 [INFO][4422] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" Namespace="calico-system" Pod="calico-kube-controllers-58b5c55b9-8kvns" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0" Sep 12 22:11:55.818295 containerd[1538]: 2025-09-12 22:11:55.788 [INFO][4422] cni-plugin/k8s.go 446: Added Mac, interface 
name, and active container ID to endpoint ContainerID="c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" Namespace="calico-system" Pod="calico-kube-controllers-58b5c55b9-8kvns" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0", GenerateName:"calico-kube-controllers-58b5c55b9-", Namespace:"calico-system", SelfLink:"", UID:"cb85746f-646b-4dcd-a803-9ab1edd631aa", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58b5c55b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a", Pod:"calico-kube-controllers-58b5c55b9-8kvns", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie2baf3d3aa2", MAC:"3e:98:2f:fb:2b:4c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:55.818404 containerd[1538]: 2025-09-12 22:11:55.804 [INFO][4422] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" Namespace="calico-system" Pod="calico-kube-controllers-58b5c55b9-8kvns" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--58b5c55b9--8kvns-eth0" Sep 12 22:11:55.850716 containerd[1538]: time="2025-09-12T22:11:55.850661862Z" level=info msg="connecting to shim c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a" address="unix:///run/containerd/s/d4a9b049ad77f52a79ae2a1f35eaed1b266e8c3e0ee271854016f4f50676efe4" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:55.879139 systemd[1]: Started cri-containerd-c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a.scope - libcontainer container c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a. 
Sep 12 22:11:55.896297 systemd-resolved[1355]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:11:55.922931 containerd[1538]: time="2025-09-12T22:11:55.922792348Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58b5c55b9-8kvns,Uid:cb85746f-646b-4dcd-a803-9ab1edd631aa,Namespace:calico-system,Attempt:0,} returns sandbox id \"c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a\"" Sep 12 22:11:55.961357 containerd[1538]: time="2025-09-12T22:11:55.961306215Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:55.961770 containerd[1538]: time="2025-09-12T22:11:55.961717263Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 12 22:11:55.962753 containerd[1538]: time="2025-09-12T22:11:55.962713298Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:55.964613 containerd[1538]: time="2025-09-12T22:11:55.964582635Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:55.965479 containerd[1538]: time="2025-09-12T22:11:55.965078813Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.094231458s" Sep 12 22:11:55.965479 containerd[1538]: time="2025-09-12T22:11:55.965103216Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 12 22:11:55.967626 containerd[1538]: time="2025-09-12T22:11:55.967596785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 22:11:55.968405 containerd[1538]: time="2025-09-12T22:11:55.968361473Z" level=info msg="CreateContainer within sandbox \"47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 22:11:55.988159 containerd[1538]: time="2025-09-12T22:11:55.988119005Z" level=info msg="Container 727e4a47c08edbf793d54fb172b957353b46daf6e4e81db204f1a71d46f4e0d0: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:56.001707 containerd[1538]: time="2025-09-12T22:11:56.001667336Z" level=info msg="CreateContainer within sandbox \"47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"727e4a47c08edbf793d54fb172b957353b46daf6e4e81db204f1a71d46f4e0d0\"" Sep 12 22:11:56.002302 containerd[1538]: time="2025-09-12T22:11:56.002266085Z" level=info msg="StartContainer for \"727e4a47c08edbf793d54fb172b957353b46daf6e4e81db204f1a71d46f4e0d0\"" Sep 12 22:11:56.003732 containerd[1538]: time="2025-09-12T22:11:56.003690045Z" level=info msg="connecting to shim 727e4a47c08edbf793d54fb172b957353b46daf6e4e81db204f1a71d46f4e0d0" address="unix:///run/containerd/s/c979d7e9afbfa1f32580e048b659a7a9b2034da6e6721116dced2560786ffcfe" 
protocol=ttrpc version=3 Sep 12 22:11:56.033090 systemd[1]: Started cri-containerd-727e4a47c08edbf793d54fb172b957353b46daf6e4e81db204f1a71d46f4e0d0.scope - libcontainer container 727e4a47c08edbf793d54fb172b957353b46daf6e4e81db204f1a71d46f4e0d0. Sep 12 22:11:56.063014 systemd-networkd[1442]: cali812a276d43d: Gained IPv6LL Sep 12 22:11:56.064989 containerd[1538]: time="2025-09-12T22:11:56.064955683Z" level=info msg="StartContainer for \"727e4a47c08edbf793d54fb172b957353b46daf6e4e81db204f1a71d46f4e0d0\" returns successfully" Sep 12 22:11:56.318049 systemd-networkd[1442]: vxlan.calico: Gained IPv6LL Sep 12 22:11:56.650780 containerd[1538]: time="2025-09-12T22:11:56.650531761Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-5bs6c,Uid:72d93281-1a6a-4911-bbcd-e5a207b001c1,Namespace:kube-system,Attempt:0,}" Sep 12 22:11:56.650780 containerd[1538]: time="2025-09-12T22:11:56.650531721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-sh7cj,Uid:622f24f0-2d6c-4a85-ba15-46c0870bbb8c,Namespace:calico-system,Attempt:0,}" Sep 12 22:11:56.774690 systemd-networkd[1442]: calia7491bbdebe: Link UP Sep 12 22:11:56.775127 systemd-networkd[1442]: calia7491bbdebe: Gained carrier Sep 12 22:11:56.793161 containerd[1538]: 2025-09-12 22:11:56.695 [INFO][4537] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0 coredns-668d6bf9bc- kube-system 72d93281-1a6a-4911-bbcd-e5a207b001c1 777 0 2025-09-12 22:11:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-5bs6c eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia7491bbdebe [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" Namespace="kube-system" Pod="coredns-668d6bf9bc-5bs6c" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--5bs6c-" Sep 12 22:11:56.793161 containerd[1538]: 2025-09-12 22:11:56.695 [INFO][4537] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" Namespace="kube-system" Pod="coredns-668d6bf9bc-5bs6c" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0" Sep 12 22:11:56.793161 containerd[1538]: 2025-09-12 22:11:56.731 [INFO][4569] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" HandleID="k8s-pod-network.531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" Workload="localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0" Sep 12 22:11:56.793353 containerd[1538]: 2025-09-12 22:11:56.732 [INFO][4569] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" HandleID="k8s-pod-network.531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" Workload="localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002aad80), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-5bs6c", "timestamp":"2025-09-12 22:11:56.73190523 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:11:56.793353 containerd[1538]: 2025-09-12 22:11:56.732 [INFO][4569] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:11:56.793353 containerd[1538]: 2025-09-12 22:11:56.732 [INFO][4569] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:11:56.793353 containerd[1538]: 2025-09-12 22:11:56.732 [INFO][4569] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:11:56.793353 containerd[1538]: 2025-09-12 22:11:56.743 [INFO][4569] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" host="localhost" Sep 12 22:11:56.793353 containerd[1538]: 2025-09-12 22:11:56.748 [INFO][4569] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:11:56.793353 containerd[1538]: 2025-09-12 22:11:56.752 [INFO][4569] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:11:56.793353 containerd[1538]: 2025-09-12 22:11:56.754 [INFO][4569] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:56.793353 containerd[1538]: 2025-09-12 22:11:56.757 [INFO][4569] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:56.793353 containerd[1538]: 2025-09-12 22:11:56.757 [INFO][4569] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" host="localhost" Sep 12 22:11:56.793570 containerd[1538]: 2025-09-12 22:11:56.758 [INFO][4569] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2 Sep 12 22:11:56.793570 containerd[1538]: 2025-09-12 22:11:56.762 [INFO][4569] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" host="localhost" Sep 12 22:11:56.793570 containerd[1538]: 2025-09-12 22:11:56.768 [INFO][4569] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" host="localhost" Sep 12 22:11:56.793570 containerd[1538]: 2025-09-12 22:11:56.768 [INFO][4569] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" host="localhost" Sep 12 22:11:56.793570 containerd[1538]: 2025-09-12 22:11:56.768 [INFO][4569] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:11:56.793570 containerd[1538]: 2025-09-12 22:11:56.768 [INFO][4569] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" HandleID="k8s-pod-network.531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" Workload="localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0" Sep 12 22:11:56.793679 containerd[1538]: 2025-09-12 22:11:56.771 [INFO][4537] cni-plugin/k8s.go 418: Populated endpoint ContainerID="531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" Namespace="kube-system" Pod="coredns-668d6bf9bc-5bs6c" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"72d93281-1a6a-4911-bbcd-e5a207b001c1", ResourceVersion:"777", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-5bs6c", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia7491bbdebe", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:56.794121 containerd[1538]: 2025-09-12 22:11:56.771 [INFO][4537] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" Namespace="kube-system" Pod="coredns-668d6bf9bc-5bs6c" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0" Sep 12 22:11:56.794121 containerd[1538]: 2025-09-12 22:11:56.771 [INFO][4537] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia7491bbdebe ContainerID="531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" Namespace="kube-system" Pod="coredns-668d6bf9bc-5bs6c" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0" Sep 12 22:11:56.794121 containerd[1538]: 2025-09-12 22:11:56.777 [INFO][4537] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" Namespace="kube-system" Pod="coredns-668d6bf9bc-5bs6c" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0" Sep 12 22:11:56.794696 
containerd[1538]: 2025-09-12 22:11:56.778 [INFO][4537] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" Namespace="kube-system" Pod="coredns-668d6bf9bc-5bs6c" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"72d93281-1a6a-4911-bbcd-e5a207b001c1", ResourceVersion:"777", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2", Pod:"coredns-668d6bf9bc-5bs6c", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia7491bbdebe", MAC:"02:20:05:10:28:31", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:56.794696 containerd[1538]: 2025-09-12 22:11:56.789 [INFO][4537] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" Namespace="kube-system" Pod="coredns-668d6bf9bc-5bs6c" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--5bs6c-eth0" Sep 12 22:11:56.814964 containerd[1538]: time="2025-09-12T22:11:56.813784275Z" level=info msg="connecting to shim 531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2" address="unix:///run/containerd/s/dc893bd5781e712a7ab35521dc52f76f108db69f3cd1a7592325386cc5bc4e5e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:56.841084 systemd[1]: Started cri-containerd-531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2.scope - libcontainer container 531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2. 
Sep 12 22:11:56.869664 systemd-resolved[1355]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:11:56.889735 systemd-networkd[1442]: cali32a7e9ad97f: Link UP Sep 12 22:11:56.890369 systemd-networkd[1442]: cali32a7e9ad97f: Gained carrier Sep 12 22:11:56.917720 containerd[1538]: time="2025-09-12T22:11:56.917598236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-5bs6c,Uid:72d93281-1a6a-4911-bbcd-e5a207b001c1,Namespace:kube-system,Attempt:0,} returns sandbox id \"531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2\"" Sep 12 22:11:56.921571 containerd[1538]: time="2025-09-12T22:11:56.921533321Z" level=info msg="CreateContainer within sandbox \"531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.696 [INFO][4549] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--sh7cj-eth0 goldmane-54d579b49d- calico-system 622f24f0-2d6c-4a85-ba15-46c0870bbb8c 780 0 2025-09-12 22:11:36 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-sh7cj eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali32a7e9ad97f [] [] }} ContainerID="b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" Namespace="calico-system" Pod="goldmane-54d579b49d-sh7cj" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--sh7cj-" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.696 [INFO][4549] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" Namespace="calico-system" Pod="goldmane-54d579b49d-sh7cj" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--sh7cj-eth0" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.733 [INFO][4567] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" HandleID="k8s-pod-network.b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" Workload="localhost-k8s-goldmane--54d579b49d--sh7cj-eth0" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.733 [INFO][4567] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" HandleID="k8s-pod-network.b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" Workload="localhost-k8s-goldmane--54d579b49d--sh7cj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dcfc0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-sh7cj", "timestamp":"2025-09-12 22:11:56.733724235 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.733 [INFO][4567] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.768 [INFO][4567] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.768 [INFO][4567] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.843 [INFO][4567] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" host="localhost" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.858 [INFO][4567] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.863 [INFO][4567] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.865 [INFO][4567] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.868 [INFO][4567] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.868 [INFO][4567] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" host="localhost" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.869 [INFO][4567] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91 Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.874 [INFO][4567] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" host="localhost" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.882 [INFO][4567] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" host="localhost" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.882 [INFO][4567] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" host="localhost" Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.882 [INFO][4567] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:11:56.947667 containerd[1538]: 2025-09-12 22:11:56.882 [INFO][4567] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" HandleID="k8s-pod-network.b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" Workload="localhost-k8s-goldmane--54d579b49d--sh7cj-eth0" Sep 12 22:11:56.948272 containerd[1538]: 2025-09-12 22:11:56.887 [INFO][4549] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" Namespace="calico-system" Pod="goldmane-54d579b49d-sh7cj" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--sh7cj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--sh7cj-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"622f24f0-2d6c-4a85-ba15-46c0870bbb8c", ResourceVersion:"780", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-sh7cj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali32a7e9ad97f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:56.948272 containerd[1538]: 2025-09-12 22:11:56.887 [INFO][4549] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" Namespace="calico-system" Pod="goldmane-54d579b49d-sh7cj" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--sh7cj-eth0" Sep 12 22:11:56.948272 containerd[1538]: 2025-09-12 22:11:56.887 [INFO][4549] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali32a7e9ad97f ContainerID="b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" Namespace="calico-system" Pod="goldmane-54d579b49d-sh7cj" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--sh7cj-eth0" Sep 12 22:11:56.948272 containerd[1538]: 2025-09-12 22:11:56.891 [INFO][4549] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" Namespace="calico-system" Pod="goldmane-54d579b49d-sh7cj" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--sh7cj-eth0" Sep 12 22:11:56.948272 containerd[1538]: 2025-09-12 22:11:56.892 [INFO][4549] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" Namespace="calico-system" Pod="goldmane-54d579b49d-sh7cj" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--sh7cj-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--sh7cj-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"622f24f0-2d6c-4a85-ba15-46c0870bbb8c", ResourceVersion:"780", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91", Pod:"goldmane-54d579b49d-sh7cj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali32a7e9ad97f", MAC:"7e:12:e8:19:46:03", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:56.948272 containerd[1538]: 2025-09-12 22:11:56.943 [INFO][4549] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" Namespace="calico-system" Pod="goldmane-54d579b49d-sh7cj" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--sh7cj-eth0" Sep 12 22:11:56.962950 containerd[1538]: time="2025-09-12T22:11:56.961823990Z" level=info msg="Container 7229fe58bfad5101b9b5781d43b28e96f0689a792cfd7e6d229d284f45a91acb: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:56.972491 containerd[1538]: time="2025-09-12T22:11:56.972433268Z" level=info msg="CreateContainer within sandbox \"531e7c146366d20fca24db9a5856bc568dbed7841f9c2057d7a225f2c8b1d9c2\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7229fe58bfad5101b9b5781d43b28e96f0689a792cfd7e6d229d284f45a91acb\"" Sep 12 22:11:56.973097 containerd[1538]: time="2025-09-12T22:11:56.973069420Z" level=info msg="StartContainer for \"7229fe58bfad5101b9b5781d43b28e96f0689a792cfd7e6d229d284f45a91acb\"" Sep 12 22:11:56.974368 containerd[1538]: time="2025-09-12T22:11:56.974338403Z" level=info msg="connecting to shim 7229fe58bfad5101b9b5781d43b28e96f0689a792cfd7e6d229d284f45a91acb" address="unix:///run/containerd/s/dc893bd5781e712a7ab35521dc52f76f108db69f3cd1a7592325386cc5bc4e5e" protocol=ttrpc version=3 Sep 12 22:11:56.983627 containerd[1538]: time="2025-09-12T22:11:56.983582847Z" level=info msg="connecting to shim b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91" address="unix:///run/containerd/s/dda6ceebeaa4e7dfe8cf87f7e1358513eac79aca2364c158de071fbaa514faa6" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:57.004093 systemd[1]: Started cri-containerd-7229fe58bfad5101b9b5781d43b28e96f0689a792cfd7e6d229d284f45a91acb.scope - libcontainer container 7229fe58bfad5101b9b5781d43b28e96f0689a792cfd7e6d229d284f45a91acb. 
Sep 12 22:11:57.018101 systemd[1]: Started cri-containerd-b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91.scope - libcontainer container b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91. Sep 12 22:11:57.037969 systemd-resolved[1355]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:11:57.053124 containerd[1538]: time="2025-09-12T22:11:57.053083704Z" level=info msg="StartContainer for \"7229fe58bfad5101b9b5781d43b28e96f0689a792cfd7e6d229d284f45a91acb\" returns successfully" Sep 12 22:11:57.068682 containerd[1538]: time="2025-09-12T22:11:57.068638736Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-sh7cj,Uid:622f24f0-2d6c-4a85-ba15-46c0870bbb8c,Namespace:calico-system,Attempt:0,} returns sandbox id \"b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91\"" Sep 12 22:11:57.086064 systemd-networkd[1442]: calie2baf3d3aa2: Gained IPv6LL Sep 12 22:11:57.650685 containerd[1538]: time="2025-09-12T22:11:57.650640296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f7d4b54-ml47p,Uid:f1c7757e-adcd-466f-b82d-0cf29ffde430,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:11:57.784389 systemd-networkd[1442]: calic60454522a0: Link UP Sep 12 22:11:57.785146 systemd-networkd[1442]: calic60454522a0: Gained carrier Sep 12 22:11:57.788945 containerd[1538]: time="2025-09-12T22:11:57.788671524Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:57.789554 containerd[1538]: time="2025-09-12T22:11:57.789516017Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 12 22:11:57.791569 containerd[1538]: time="2025-09-12T22:11:57.791043465Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:57.797266 containerd[1538]: time="2025-09-12T22:11:57.795947205Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:57.797756 containerd[1538]: time="2025-09-12T22:11:57.797536259Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 1.829277678s" Sep 12 22:11:57.797756 containerd[1538]: time="2025-09-12T22:11:57.797576584Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 12 22:11:57.802451 containerd[1538]: time="2025-09-12T22:11:57.802417117Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.697 [INFO][4736] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0 calico-apiserver-7d8f7d4b54- calico-apiserver 
f1c7757e-adcd-466f-b82d-0cf29ffde430 782 0 2025-09-12 22:11:33 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d8f7d4b54 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7d8f7d4b54-ml47p eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic60454522a0 [] [] }} ContainerID="cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-ml47p" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.698 [INFO][4736] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-ml47p" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.732 [INFO][4750] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" HandleID="k8s-pod-network.cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" Workload="localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.733 [INFO][4750] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" HandleID="k8s-pod-network.cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" Workload="localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004da00), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7d8f7d4b54-ml47p", "timestamp":"2025-09-12 22:11:57.732869344 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.733 [INFO][4750] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.733 [INFO][4750] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.733 [INFO][4750] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.744 [INFO][4750] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" host="localhost" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.757 [INFO][4750] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.762 [INFO][4750] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.763 [INFO][4750] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.766 [INFO][4750] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.766 [INFO][4750] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" host="localhost" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.768 [INFO][4750] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.772 [INFO][4750] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" host="localhost" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.779 [INFO][4750] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" host="localhost" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.780 [INFO][4750] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" host="localhost" Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.780 [INFO][4750] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:11:57.803971 containerd[1538]: 2025-09-12 22:11:57.780 [INFO][4750] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" HandleID="k8s-pod-network.cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" Workload="localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0" Sep 12 22:11:57.804960 containerd[1538]: 2025-09-12 22:11:57.782 [INFO][4736] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-ml47p" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0", GenerateName:"calico-apiserver-7d8f7d4b54-", Namespace:"calico-apiserver", SelfLink:"", UID:"f1c7757e-adcd-466f-b82d-0cf29ffde430", ResourceVersion:"782", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d8f7d4b54", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7d8f7d4b54-ml47p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic60454522a0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:57.804960 containerd[1538]: 2025-09-12 22:11:57.782 [INFO][4736] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-ml47p" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0" Sep 12 22:11:57.804960 containerd[1538]: 2025-09-12 22:11:57.782 [INFO][4736] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic60454522a0 ContainerID="cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-ml47p" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0" Sep 12 22:11:57.804960 containerd[1538]: 2025-09-12 22:11:57.785 [INFO][4736] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-ml47p" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0" Sep 12 22:11:57.804960 containerd[1538]: 2025-09-12 22:11:57.786 [INFO][4736] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-ml47p" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0", GenerateName:"calico-apiserver-7d8f7d4b54-", Namespace:"calico-apiserver", SelfLink:"", UID:"f1c7757e-adcd-466f-b82d-0cf29ffde430", ResourceVersion:"782", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d8f7d4b54", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b", Pod:"calico-apiserver-7d8f7d4b54-ml47p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic60454522a0", MAC:"06:50:e7:ed:df:66", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:57.804960 containerd[1538]: 2025-09-12 22:11:57.800 [INFO][4736] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-ml47p" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--ml47p-eth0" Sep 12 22:11:57.807057 containerd[1538]: time="2025-09-12T22:11:57.807027904Z" level=info msg="CreateContainer within sandbox \"c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 22:11:57.815122 containerd[1538]: time="2025-09-12T22:11:57.815079110Z" level=info msg="Container d539d04c0144e93da73bc16ca7a7dc26626bc33792baa5ec9db861ffb1d29e71: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:57.822661 containerd[1538]: time="2025-09-12T22:11:57.822614939Z" level=info msg="CreateContainer within sandbox \"c61f38c28f841de29f33def5f00cc0ef8146d1c58162caba095173e8d8abd71a\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"d539d04c0144e93da73bc16ca7a7dc26626bc33792baa5ec9db861ffb1d29e71\"" Sep 12 22:11:57.823095 containerd[1538]: time="2025-09-12T22:11:57.823075190Z" level=info msg="StartContainer for \"d539d04c0144e93da73bc16ca7a7dc26626bc33792baa5ec9db861ffb1d29e71\"" Sep 12 22:11:57.824638 containerd[1538]: time="2025-09-12T22:11:57.824585116Z" level=info msg="connecting to shim d539d04c0144e93da73bc16ca7a7dc26626bc33792baa5ec9db861ffb1d29e71" address="unix:///run/containerd/s/d4a9b049ad77f52a79ae2a1f35eaed1b266e8c3e0ee271854016f4f50676efe4" protocol=ttrpc version=3 Sep 12 22:11:57.828834 containerd[1538]: 
time="2025-09-12T22:11:57.828786418Z" level=info msg="connecting to shim cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b" address="unix:///run/containerd/s/bd5ac32654afd948c9f7947e23e78fc3e204c3d46a94e2700712a8bd126f1b7e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:57.860930 kubelet[2675]: I0912 22:11:57.860245 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-5bs6c" podStartSLOduration=34.860227878 podStartE2EDuration="34.860227878s" podCreationTimestamp="2025-09-12 22:11:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:11:57.854427919 +0000 UTC m=+41.308440922" watchObservedRunningTime="2025-09-12 22:11:57.860227878 +0000 UTC m=+41.314240881" Sep 12 22:11:57.861476 systemd[1]: Started cri-containerd-cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b.scope - libcontainer container cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b. Sep 12 22:11:57.862964 systemd[1]: Started cri-containerd-d539d04c0144e93da73bc16ca7a7dc26626bc33792baa5ec9db861ffb1d29e71.scope - libcontainer container d539d04c0144e93da73bc16ca7a7dc26626bc33792baa5ec9db861ffb1d29e71. Sep 12 22:11:57.891191 systemd-resolved[1355]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:11:57.919551 systemd-networkd[1442]: cali32a7e9ad97f: Gained IPv6LL Sep 12 22:11:57.983033 systemd-networkd[1442]: calia7491bbdebe: Gained IPv6LL Sep 12 22:11:57.987681 containerd[1538]: time="2025-09-12T22:11:57.987637657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f7d4b54-ml47p,Uid:f1c7757e-adcd-466f-b82d-0cf29ffde430,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b\"" Sep 12 22:11:57.991144 containerd[1538]: time="2025-09-12T22:11:57.991106039Z" level=info msg="StartContainer for \"d539d04c0144e93da73bc16ca7a7dc26626bc33792baa5ec9db861ffb1d29e71\" returns successfully" Sep 12 22:11:58.650116 containerd[1538]: time="2025-09-12T22:11:58.650071836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f7d4b54-d5jhq,Uid:d3cfc81b-9d51-4ae0-897e-a4defc4863b6,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:11:58.749604 systemd-networkd[1442]: cali5d3fd32f24b: Link UP Sep 12 22:11:58.750081 systemd-networkd[1442]: cali5d3fd32f24b: Gained carrier Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.689 [INFO][4858] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0 calico-apiserver-7d8f7d4b54- calico-apiserver d3cfc81b-9d51-4ae0-897e-a4defc4863b6 778 0 2025-09-12 22:11:33 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d8f7d4b54 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7d8f7d4b54-d5jhq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5d3fd32f24b [] [] }} ContainerID="cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-d5jhq" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-" Sep 12 
22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.689 [INFO][4858] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-d5jhq" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0" Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.711 [INFO][4873] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" HandleID="k8s-pod-network.cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" Workload="localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0" Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.711 [INFO][4873] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" HandleID="k8s-pod-network.cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" Workload="localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c4e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7d8f7d4b54-d5jhq", "timestamp":"2025-09-12 22:11:58.711481748 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.711 [INFO][4873] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.711 [INFO][4873] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.711 [INFO][4873] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.720 [INFO][4873] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" host="localhost" Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.724 [INFO][4873] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.728 [INFO][4873] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.730 [INFO][4873] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.732 [INFO][4873] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.732 [INFO][4873] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" host="localhost" Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.734 [INFO][4873] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.737 [INFO][4873] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" host="localhost" Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.744 [INFO][4873] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" host="localhost" Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.744 [INFO][4873] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" host="localhost" Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.744 [INFO][4873] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:11:58.766396 containerd[1538]: 2025-09-12 22:11:58.744 [INFO][4873] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" HandleID="k8s-pod-network.cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" Workload="localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0" Sep 12 22:11:58.767181 containerd[1538]: 2025-09-12 22:11:58.746 [INFO][4858] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-d5jhq" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0", GenerateName:"calico-apiserver-7d8f7d4b54-", Namespace:"calico-apiserver", SelfLink:"", UID:"d3cfc81b-9d51-4ae0-897e-a4defc4863b6", ResourceVersion:"778", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d8f7d4b54", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7d8f7d4b54-d5jhq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5d3fd32f24b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:58.767181 containerd[1538]: 2025-09-12 22:11:58.747 [INFO][4858] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-d5jhq" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0" Sep 12 22:11:58.767181 containerd[1538]: 2025-09-12 22:11:58.747 [INFO][4858] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5d3fd32f24b ContainerID="cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-d5jhq" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0" Sep 12 22:11:58.767181 containerd[1538]: 2025-09-12 22:11:58.750 [INFO][4858] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-d5jhq" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0" Sep 12 22:11:58.767181 containerd[1538]: 2025-09-12 22:11:58.751 [INFO][4858] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-d5jhq" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0", GenerateName:"calico-apiserver-7d8f7d4b54-", Namespace:"calico-apiserver", SelfLink:"", UID:"d3cfc81b-9d51-4ae0-897e-a4defc4863b6", ResourceVersion:"778", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d8f7d4b54", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb", Pod:"calico-apiserver-7d8f7d4b54-d5jhq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5d3fd32f24b", MAC:"fe:e1:94:33:14:e5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:58.767181 containerd[1538]: 2025-09-12 22:11:58.762 [INFO][4858] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f7d4b54-d5jhq" WorkloadEndpoint="localhost-k8s-calico--apiserver--7d8f7d4b54--d5jhq-eth0" Sep 12 22:11:58.786849 containerd[1538]: time="2025-09-12T22:11:58.786765468Z" level=info msg="connecting to shim cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb" address="unix:///run/containerd/s/13e0968fb6d4e540f1ba226c068681a236eae0e6eb6a77ee111f2572c42ec00f" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:58.816090 systemd[1]: Started cri-containerd-cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb.scope - libcontainer container cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb. 
Sep 12 22:11:58.828020 systemd-resolved[1355]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:11:58.851440 containerd[1538]: time="2025-09-12T22:11:58.851400646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f7d4b54-d5jhq,Uid:d3cfc81b-9d51-4ae0-897e-a4defc4863b6,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb\"" Sep 12 22:11:58.942179 systemd-networkd[1442]: calic60454522a0: Gained IPv6LL Sep 12 22:11:59.112973 containerd[1538]: time="2025-09-12T22:11:59.112931155Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:59.114163 containerd[1538]: time="2025-09-12T22:11:59.114132681Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 12 22:11:59.115094 containerd[1538]: time="2025-09-12T22:11:59.115062538Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:59.117506 containerd[1538]: time="2025-09-12T22:11:59.117469310Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:11:59.118626 containerd[1538]: time="2025-09-12T22:11:59.118593068Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.316143268s" Sep 12 22:11:59.118626 containerd[1538]: time="2025-09-12T22:11:59.118624031Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 12 22:11:59.119798 containerd[1538]: time="2025-09-12T22:11:59.119565970Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 22:11:59.121844 containerd[1538]: time="2025-09-12T22:11:59.121808605Z" level=info msg="CreateContainer within sandbox \"47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 22:11:59.129272 containerd[1538]: time="2025-09-12T22:11:59.128177513Z" level=info msg="Container 6721e9ac19f17eeef733960f5a4d330a615aa3f4ed9bf9ed755f513cae24b932: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:59.137144 containerd[1538]: time="2025-09-12T22:11:59.137109369Z" level=info msg="CreateContainer within sandbox \"47b42c3e9bb36422ba4ae5dfdbf2bc5341ad6c263e31e32f9b1d8dcc84aa93c0\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"6721e9ac19f17eeef733960f5a4d330a615aa3f4ed9bf9ed755f513cae24b932\"" Sep 12 22:11:59.137588 containerd[1538]: time="2025-09-12T22:11:59.137571737Z" level=info msg="StartContainer for \"6721e9ac19f17eeef733960f5a4d330a615aa3f4ed9bf9ed755f513cae24b932\"" Sep 12 22:11:59.139081 containerd[1538]: 
time="2025-09-12T22:11:59.139055853Z" level=info msg="connecting to shim 6721e9ac19f17eeef733960f5a4d330a615aa3f4ed9bf9ed755f513cae24b932" address="unix:///run/containerd/s/c979d7e9afbfa1f32580e048b659a7a9b2034da6e6721116dced2560786ffcfe" protocol=ttrpc version=3 Sep 12 22:11:59.158064 systemd[1]: Started cri-containerd-6721e9ac19f17eeef733960f5a4d330a615aa3f4ed9bf9ed755f513cae24b932.scope - libcontainer container 6721e9ac19f17eeef733960f5a4d330a615aa3f4ed9bf9ed755f513cae24b932. Sep 12 22:11:59.191762 containerd[1538]: time="2025-09-12T22:11:59.191722733Z" level=info msg="StartContainer for \"6721e9ac19f17eeef733960f5a4d330a615aa3f4ed9bf9ed755f513cae24b932\" returns successfully" Sep 12 22:11:59.649603 containerd[1538]: time="2025-09-12T22:11:59.649563278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j667g,Uid:632413b7-a3b7-4949-8739-aa313e428d16,Namespace:kube-system,Attempt:0,}" Sep 12 22:11:59.739513 kubelet[2675]: I0912 22:11:59.739471 2675 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 22:11:59.757834 kubelet[2675]: I0912 22:11:59.757780 2675 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 22:11:59.835225 systemd-networkd[1442]: calif65b964c642: Link UP Sep 12 22:11:59.836558 systemd-networkd[1442]: calif65b964c642: Gained carrier Sep 12 22:11:59.845884 kubelet[2675]: I0912 22:11:59.845819 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-58b5c55b9-8kvns" podStartSLOduration=21.970939217 podStartE2EDuration="23.845799325s" podCreationTimestamp="2025-09-12 22:11:36 +0000 UTC" firstStartedPulling="2025-09-12 22:11:55.925437255 +0000 UTC m=+39.379450258" lastFinishedPulling="2025-09-12 22:11:57.800297363 +0000 UTC m=+41.254310366" observedRunningTime="2025-09-12 22:11:58.858083003 +0000 UTC m=+42.312096006" watchObservedRunningTime="2025-09-12 22:11:59.845799325 +0000 UTC m=+43.299812328" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.754 [INFO][4978] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--j667g-eth0 coredns-668d6bf9bc- kube-system 632413b7-a3b7-4949-8739-aa313e428d16 781 0 2025-09-12 22:11:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-j667g eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif65b964c642 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" Namespace="kube-system" Pod="coredns-668d6bf9bc-j667g" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j667g-" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.754 [INFO][4978] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" Namespace="kube-system" Pod="coredns-668d6bf9bc-j667g" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j667g-eth0" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.788 [INFO][4996] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" HandleID="k8s-pod-network.2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" Workload="localhost-k8s-coredns--668d6bf9bc--j667g-eth0" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.788 [INFO][4996] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" HandleID="k8s-pod-network.2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" Workload="localhost-k8s-coredns--668d6bf9bc--j667g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000136690), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-j667g", "timestamp":"2025-09-12 22:11:59.788464515 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.788 [INFO][4996] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.788 [INFO][4996] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.788 [INFO][4996] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.798 [INFO][4996] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" host="localhost" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.804 [INFO][4996] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.810 [INFO][4996] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.813 [INFO][4996] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.819 [INFO][4996] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.819 [INFO][4996] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" host="localhost" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.821 [INFO][4996] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502 Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.824 [INFO][4996] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" host="localhost" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.830 [INFO][4996] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" host="localhost" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.830 [INFO][4996] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] 
handle="k8s-pod-network.2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" host="localhost" Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.830 [INFO][4996] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:11:59.854327 containerd[1538]: 2025-09-12 22:11:59.830 [INFO][4996] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" HandleID="k8s-pod-network.2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" Workload="localhost-k8s-coredns--668d6bf9bc--j667g-eth0" Sep 12 22:11:59.854891 containerd[1538]: 2025-09-12 22:11:59.833 [INFO][4978] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" Namespace="kube-system" Pod="coredns-668d6bf9bc-j667g" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j667g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--j667g-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"632413b7-a3b7-4949-8739-aa313e428d16", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-j667g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif65b964c642", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:59.854891 containerd[1538]: 2025-09-12 22:11:59.833 [INFO][4978] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" Namespace="kube-system" Pod="coredns-668d6bf9bc-j667g" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j667g-eth0" Sep 12 22:11:59.854891 containerd[1538]: 2025-09-12 22:11:59.833 [INFO][4978] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif65b964c642 ContainerID="2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" Namespace="kube-system" Pod="coredns-668d6bf9bc-j667g" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j667g-eth0" Sep 12 22:11:59.854891 containerd[1538]: 2025-09-12 22:11:59.836 [INFO][4978] cni-plugin/dataplane_linux.go 508: Disabling 
IPv4 forwarding ContainerID="2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" Namespace="kube-system" Pod="coredns-668d6bf9bc-j667g" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j667g-eth0" Sep 12 22:11:59.854891 containerd[1538]: 2025-09-12 22:11:59.836 [INFO][4978] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" Namespace="kube-system" Pod="coredns-668d6bf9bc-j667g" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j667g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--j667g-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"632413b7-a3b7-4949-8739-aa313e428d16", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 11, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502", Pod:"coredns-668d6bf9bc-j667g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif65b964c642", MAC:"7e:8b:d3:77:d2:28", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:11:59.854891 containerd[1538]: 2025-09-12 22:11:59.845 [INFO][4978] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" Namespace="kube-system" Pod="coredns-668d6bf9bc-j667g" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j667g-eth0" Sep 12 22:11:59.868120 kubelet[2675]: I0912 22:11:59.867358 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-xhvwr" podStartSLOduration=19.618439183 podStartE2EDuration="23.867338902s" podCreationTimestamp="2025-09-12 22:11:36 +0000 UTC" firstStartedPulling="2025-09-12 22:11:54.870529197 +0000 UTC m=+38.324542200" lastFinishedPulling="2025-09-12 22:11:59.119428956 +0000 UTC m=+42.573441919" observedRunningTime="2025-09-12 22:11:59.867305058 +0000 UTC m=+43.321318061" watchObservedRunningTime="2025-09-12 22:11:59.867338902 +0000 UTC m=+43.321351905" Sep 12 22:11:59.884307 containerd[1538]: time="2025-09-12T22:11:59.884248074Z" level=info msg="connecting to shim 
2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502" address="unix:///run/containerd/s/097a218e831d3b717219698161b9dcc115f360b348a1aeb5d311c43626a2af12" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:11:59.905061 systemd[1]: Started cri-containerd-2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502.scope - libcontainer container 2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502. Sep 12 22:11:59.914093 containerd[1538]: time="2025-09-12T22:11:59.913948147Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d539d04c0144e93da73bc16ca7a7dc26626bc33792baa5ec9db861ffb1d29e71\" id:\"76812b8d90bf82d99d9b063868427c10641d719b2eae5b4b19449494cabc767c\" pid:5038 exited_at:{seconds:1757715119 nanos:913609152}" Sep 12 22:11:59.922327 systemd-resolved[1355]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:11:59.956356 containerd[1538]: time="2025-09-12T22:11:59.956307027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j667g,Uid:632413b7-a3b7-4949-8739-aa313e428d16,Namespace:kube-system,Attempt:0,} returns sandbox id \"2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502\"" Sep 12 22:11:59.959531 containerd[1538]: time="2025-09-12T22:11:59.959133763Z" level=info msg="CreateContainer within sandbox \"2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 22:11:59.966896 containerd[1538]: time="2025-09-12T22:11:59.966851572Z" level=info msg="Container 300e7bb19c33b52b1695501b2c28eabed7af801d4016d9c8f4e7c74f0018f1cd: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:11:59.971411 containerd[1538]: time="2025-09-12T22:11:59.971377086Z" level=info msg="CreateContainer within sandbox \"2a5a431aeec0eac0d90586792dcf151d810425012f02bb1dc58fc9f762b63502\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"300e7bb19c33b52b1695501b2c28eabed7af801d4016d9c8f4e7c74f0018f1cd\"" Sep 12 22:11:59.971929 containerd[1538]: time="2025-09-12T22:11:59.971836894Z" level=info msg="StartContainer for \"300e7bb19c33b52b1695501b2c28eabed7af801d4016d9c8f4e7c74f0018f1cd\"" Sep 12 22:11:59.972722 containerd[1538]: time="2025-09-12T22:11:59.972699185Z" level=info msg="connecting to shim 300e7bb19c33b52b1695501b2c28eabed7af801d4016d9c8f4e7c74f0018f1cd" address="unix:///run/containerd/s/097a218e831d3b717219698161b9dcc115f360b348a1aeb5d311c43626a2af12" protocol=ttrpc version=3 Sep 12 22:11:59.989087 systemd[1]: Started cri-containerd-300e7bb19c33b52b1695501b2c28eabed7af801d4016d9c8f4e7c74f0018f1cd.scope - libcontainer container 300e7bb19c33b52b1695501b2c28eabed7af801d4016d9c8f4e7c74f0018f1cd. Sep 12 22:12:00.013970 containerd[1538]: time="2025-09-12T22:12:00.013874549Z" level=info msg="StartContainer for \"300e7bb19c33b52b1695501b2c28eabed7af801d4016d9c8f4e7c74f0018f1cd\" returns successfully" Sep 12 22:12:00.089155 systemd[1]: Started sshd@7-10.0.0.61:22-10.0.0.1:33248.service - OpenSSH per-connection server daemon (10.0.0.1:33248). Sep 12 22:12:00.158472 systemd-networkd[1442]: cali5d3fd32f24b: Gained IPv6LL Sep 12 22:12:00.163759 sshd[5121]: Accepted publickey for core from 10.0.0.1 port 33248 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:00.165382 sshd-session[5121]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:00.169371 systemd-logind[1508]: New session 8 of user core. 
Sep 12 22:12:00.178089 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 12 22:12:00.480205 sshd[5124]: Connection closed by 10.0.0.1 port 33248 Sep 12 22:12:00.481494 sshd-session[5121]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:00.484654 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 22:12:00.485529 systemd[1]: sshd@7-10.0.0.61:22-10.0.0.1:33248.service: Deactivated successfully. Sep 12 22:12:00.489990 systemd-logind[1508]: Session 8 logged out. Waiting for processes to exit. Sep 12 22:12:00.491855 systemd-logind[1508]: Removed session 8. Sep 12 22:12:00.722627 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3733595796.mount: Deactivated successfully. Sep 12 22:12:00.882434 kubelet[2675]: I0912 22:12:00.881802 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-j667g" podStartSLOduration=37.881782814 podStartE2EDuration="37.881782814s" podCreationTimestamp="2025-09-12 22:11:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:12:00.880981892 +0000 UTC m=+44.334994895" watchObservedRunningTime="2025-09-12 22:12:00.881782814 +0000 UTC m=+44.335795777" Sep 12 22:12:01.050690 containerd[1538]: time="2025-09-12T22:12:01.050647402Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:01.051587 containerd[1538]: time="2025-09-12T22:12:01.051552012Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 12 22:12:01.053403 containerd[1538]: time="2025-09-12T22:12:01.052549872Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:01.055063 containerd[1538]: time="2025-09-12T22:12:01.055036562Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:01.055693 containerd[1538]: time="2025-09-12T22:12:01.055672105Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 1.936073732s" Sep 12 22:12:01.055780 containerd[1538]: time="2025-09-12T22:12:01.055766275Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 12 22:12:01.056642 containerd[1538]: time="2025-09-12T22:12:01.056618560Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 22:12:01.057578 containerd[1538]: time="2025-09-12T22:12:01.057537972Z" level=info msg="CreateContainer within sandbox \"b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 22:12:01.072137 containerd[1538]: time="2025-09-12T22:12:01.072097111Z" level=info msg="Container 7763b9559aa873108a92da1b5458f404eddd198c7eb5bf882c40cc3ee9bdde8a: CDI devices from CRI 
Config.CDIDevices: []" Sep 12 22:12:01.079295 containerd[1538]: time="2025-09-12T22:12:01.079261749Z" level=info msg="CreateContainer within sandbox \"b19dbfabcb4ba4e3a7c4cc7aaeb8b0ef400a5cb0cc106ce17cafe9e0a8ab0d91\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"7763b9559aa873108a92da1b5458f404eddd198c7eb5bf882c40cc3ee9bdde8a\"" Sep 12 22:12:01.079720 containerd[1538]: time="2025-09-12T22:12:01.079674631Z" level=info msg="StartContainer for \"7763b9559aa873108a92da1b5458f404eddd198c7eb5bf882c40cc3ee9bdde8a\"" Sep 12 22:12:01.082102 containerd[1538]: time="2025-09-12T22:12:01.082070511Z" level=info msg="connecting to shim 7763b9559aa873108a92da1b5458f404eddd198c7eb5bf882c40cc3ee9bdde8a" address="unix:///run/containerd/s/dda6ceebeaa4e7dfe8cf87f7e1358513eac79aca2364c158de071fbaa514faa6" protocol=ttrpc version=3 Sep 12 22:12:01.108107 systemd[1]: Started cri-containerd-7763b9559aa873108a92da1b5458f404eddd198c7eb5bf882c40cc3ee9bdde8a.scope - libcontainer container 7763b9559aa873108a92da1b5458f404eddd198c7eb5bf882c40cc3ee9bdde8a. Sep 12 22:12:01.147457 containerd[1538]: time="2025-09-12T22:12:01.147423660Z" level=info msg="StartContainer for \"7763b9559aa873108a92da1b5458f404eddd198c7eb5bf882c40cc3ee9bdde8a\" returns successfully" Sep 12 22:12:01.886154 systemd-networkd[1442]: calif65b964c642: Gained IPv6LL Sep 12 22:12:01.894019 kubelet[2675]: I0912 22:12:01.891659 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-sh7cj" podStartSLOduration=21.908476545 podStartE2EDuration="25.89163836s" podCreationTimestamp="2025-09-12 22:11:36 +0000 UTC" firstStartedPulling="2025-09-12 22:11:57.073329772 +0000 UTC m=+40.527342775" lastFinishedPulling="2025-09-12 22:12:01.056491587 +0000 UTC m=+44.510504590" observedRunningTime="2025-09-12 22:12:01.874733986 +0000 UTC m=+45.328746989" watchObservedRunningTime="2025-09-12 22:12:01.89163836 +0000 UTC m=+45.345651323" Sep 12 22:12:02.952779 containerd[1538]: time="2025-09-12T22:12:02.952687268Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7763b9559aa873108a92da1b5458f404eddd198c7eb5bf882c40cc3ee9bdde8a\" id:\"20d5d3a7997ef4ffeab41da0d39e6bcb5493ac6066a2379e3b404d2af3c84487\" pid:5208 exit_status:1 exited_at:{seconds:1757715122 nanos:952241504}" Sep 12 22:12:03.544708 containerd[1538]: time="2025-09-12T22:12:03.544661337Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:03.546138 containerd[1538]: time="2025-09-12T22:12:03.546054190Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 12 22:12:03.547324 containerd[1538]: time="2025-09-12T22:12:03.547296310Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:03.549777 containerd[1538]: time="2025-09-12T22:12:03.549725184Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:03.550499 containerd[1538]: time="2025-09-12T22:12:03.550284677Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag 
\"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 2.493635755s" Sep 12 22:12:03.550499 containerd[1538]: time="2025-09-12T22:12:03.550315400Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 22:12:03.551474 containerd[1538]: time="2025-09-12T22:12:03.551443389Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 22:12:03.553209 containerd[1538]: time="2025-09-12T22:12:03.552777037Z" level=info msg="CreateContainer within sandbox \"cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 22:12:03.559192 containerd[1538]: time="2025-09-12T22:12:03.559157651Z" level=info msg="Container a41a16c72ecf7b424fc4f192acd84d873473e43b592e821ad2d46dce472b999d: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:03.569215 containerd[1538]: time="2025-09-12T22:12:03.569175414Z" level=info msg="CreateContainer within sandbox \"cb18549aab7500730bd527517e19302ce9feb1604159ce8e7356653963b9175b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a41a16c72ecf7b424fc4f192acd84d873473e43b592e821ad2d46dce472b999d\"" Sep 12 22:12:03.571074 containerd[1538]: time="2025-09-12T22:12:03.571043554Z" level=info msg="StartContainer for \"a41a16c72ecf7b424fc4f192acd84d873473e43b592e821ad2d46dce472b999d\"" Sep 12 22:12:03.573658 containerd[1538]: time="2025-09-12T22:12:03.573611121Z" level=info msg="connecting to shim a41a16c72ecf7b424fc4f192acd84d873473e43b592e821ad2d46dce472b999d" address="unix:///run/containerd/s/bd5ac32654afd948c9f7947e23e78fc3e204c3d46a94e2700712a8bd126f1b7e" protocol=ttrpc version=3 Sep 12 22:12:03.601180 systemd[1]: Started cri-containerd-a41a16c72ecf7b424fc4f192acd84d873473e43b592e821ad2d46dce472b999d.scope - libcontainer container a41a16c72ecf7b424fc4f192acd84d873473e43b592e821ad2d46dce472b999d. 
Sep 12 22:12:03.667777 containerd[1538]: time="2025-09-12T22:12:03.667739694Z" level=info msg="StartContainer for \"a41a16c72ecf7b424fc4f192acd84d873473e43b592e821ad2d46dce472b999d\" returns successfully" Sep 12 22:12:03.823172 containerd[1538]: time="2025-09-12T22:12:03.822653593Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:03.823316 containerd[1538]: time="2025-09-12T22:12:03.823182603Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 22:12:03.825942 containerd[1538]: time="2025-09-12T22:12:03.825861861Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 274.377589ms" Sep 12 22:12:03.825942 containerd[1538]: time="2025-09-12T22:12:03.825902585Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 22:12:03.830503 containerd[1538]: time="2025-09-12T22:12:03.830284406Z" level=info msg="CreateContainer within sandbox \"cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 22:12:03.838096 containerd[1538]: time="2025-09-12T22:12:03.838041512Z" level=info msg="Container f994246458890c47b7f74f905e5ee057bb133c2572804c0f8fbb62626dd884f6: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:03.845166 containerd[1538]: time="2025-09-12T22:12:03.845098711Z" level=info msg="CreateContainer within sandbox \"cc0ae63772d53c7fb4503e21d977830796946507b0c27f23862d72225239a4bb\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"f994246458890c47b7f74f905e5ee057bb133c2572804c0f8fbb62626dd884f6\"" Sep 12 22:12:03.845841 containerd[1538]: time="2025-09-12T22:12:03.845815340Z" level=info msg="StartContainer for \"f994246458890c47b7f74f905e5ee057bb133c2572804c0f8fbb62626dd884f6\"" Sep 12 22:12:03.847066 containerd[1538]: time="2025-09-12T22:12:03.847033617Z" level=info msg="connecting to shim f994246458890c47b7f74f905e5ee057bb133c2572804c0f8fbb62626dd884f6" address="unix:///run/containerd/s/13e0968fb6d4e540f1ba226c068681a236eae0e6eb6a77ee111f2572c42ec00f" protocol=ttrpc version=3 Sep 12 22:12:03.870212 systemd[1]: Started cri-containerd-f994246458890c47b7f74f905e5ee057bb133c2572804c0f8fbb62626dd884f6.scope - libcontainer container f994246458890c47b7f74f905e5ee057bb133c2572804c0f8fbb62626dd884f6. 
Sep 12 22:12:03.964720 containerd[1538]: time="2025-09-12T22:12:03.964654490Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7763b9559aa873108a92da1b5458f404eddd198c7eb5bf882c40cc3ee9bdde8a\" id:\"1fcaab3d7d971958cc498d9f711f5920cde43abc897af3e6821edef5954ef0cf\" pid:5296 exit_status:1 exited_at:{seconds:1757715123 nanos:963247314}" Sep 12 22:12:04.022473 containerd[1538]: time="2025-09-12T22:12:04.022439208Z" level=info msg="StartContainer for \"f994246458890c47b7f74f905e5ee057bb133c2572804c0f8fbb62626dd884f6\" returns successfully" Sep 12 22:12:04.874800 kubelet[2675]: I0912 22:12:04.874763 2675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:12:04.886684 kubelet[2675]: I0912 22:12:04.886581 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d8f7d4b54-d5jhq" podStartSLOduration=26.914566443 podStartE2EDuration="31.886562572s" podCreationTimestamp="2025-09-12 22:11:33 +0000 UTC" firstStartedPulling="2025-09-12 22:11:58.855760314 +0000 UTC m=+42.309773317" lastFinishedPulling="2025-09-12 22:12:03.827756443 +0000 UTC m=+47.281769446" observedRunningTime="2025-09-12 22:12:04.885835224 +0000 UTC m=+48.339848227" watchObservedRunningTime="2025-09-12 22:12:04.886562572 +0000 UTC m=+48.340575535" Sep 12 22:12:04.889023 kubelet[2675]: I0912 22:12:04.888973 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d8f7d4b54-ml47p" podStartSLOduration=26.325860486 podStartE2EDuration="31.887563787s" podCreationTimestamp="2025-09-12 22:11:33 +0000 UTC" firstStartedPulling="2025-09-12 22:11:57.989633237 +0000 UTC m=+41.443646200" lastFinishedPulling="2025-09-12 22:12:03.551336498 +0000 UTC m=+47.005349501" observedRunningTime="2025-09-12 22:12:03.884373568 +0000 UTC m=+47.338386571" watchObservedRunningTime="2025-09-12 22:12:04.887563787 +0000 UTC m=+48.341576790" Sep 12 22:12:05.494769 systemd[1]: Started sshd@8-10.0.0.61:22-10.0.0.1:33256.service - OpenSSH per-connection server daemon (10.0.0.1:33256). Sep 12 22:12:05.587617 sshd[5340]: Accepted publickey for core from 10.0.0.1 port 33256 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:05.590297 sshd-session[5340]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:05.594958 systemd-logind[1508]: New session 9 of user core. Sep 12 22:12:05.610108 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 12 22:12:05.814871 sshd[5343]: Connection closed by 10.0.0.1 port 33256 Sep 12 22:12:05.815130 sshd-session[5340]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:05.818976 systemd-logind[1508]: Session 9 logged out. Waiting for processes to exit. Sep 12 22:12:05.819158 systemd[1]: sshd@8-10.0.0.61:22-10.0.0.1:33256.service: Deactivated successfully. Sep 12 22:12:05.820861 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 22:12:05.822376 systemd-logind[1508]: Removed session 9. Sep 12 22:12:05.877274 kubelet[2675]: I0912 22:12:05.877230 2675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:12:10.313733 kubelet[2675]: I0912 22:12:10.313621 2675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:12:10.834745 systemd[1]: Started sshd@9-10.0.0.61:22-10.0.0.1:51084.service - OpenSSH per-connection server daemon (10.0.0.1:51084). 
Sep 12 22:12:10.897762 sshd[5366]: Accepted publickey for core from 10.0.0.1 port 51084 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:10.899779 sshd-session[5366]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:10.904167 systemd-logind[1508]: New session 10 of user core. Sep 12 22:12:10.914101 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 12 22:12:11.099883 sshd[5369]: Connection closed by 10.0.0.1 port 51084 Sep 12 22:12:11.100460 sshd-session[5366]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:11.108693 systemd[1]: sshd@9-10.0.0.61:22-10.0.0.1:51084.service: Deactivated successfully. Sep 12 22:12:11.111590 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 22:12:11.114123 systemd-logind[1508]: Session 10 logged out. Waiting for processes to exit. Sep 12 22:12:11.119035 systemd[1]: Started sshd@10-10.0.0.61:22-10.0.0.1:51100.service - OpenSSH per-connection server daemon (10.0.0.1:51100). Sep 12 22:12:11.121871 systemd-logind[1508]: Removed session 10. Sep 12 22:12:11.186284 sshd[5383]: Accepted publickey for core from 10.0.0.1 port 51100 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:11.187820 sshd-session[5383]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:11.194806 systemd-logind[1508]: New session 11 of user core. Sep 12 22:12:11.206098 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 22:12:11.416806 sshd[5386]: Connection closed by 10.0.0.1 port 51100 Sep 12 22:12:11.418158 sshd-session[5383]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:11.431636 systemd[1]: sshd@10-10.0.0.61:22-10.0.0.1:51100.service: Deactivated successfully. Sep 12 22:12:11.434845 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 22:12:11.437305 systemd-logind[1508]: Session 11 logged out. Waiting for processes to exit. Sep 12 22:12:11.443279 systemd[1]: Started sshd@11-10.0.0.61:22-10.0.0.1:51116.service - OpenSSH per-connection server daemon (10.0.0.1:51116). Sep 12 22:12:11.444545 systemd-logind[1508]: Removed session 11. Sep 12 22:12:11.502233 sshd[5397]: Accepted publickey for core from 10.0.0.1 port 51116 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:11.503624 sshd-session[5397]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:11.508650 systemd-logind[1508]: New session 12 of user core. Sep 12 22:12:11.518121 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 22:12:11.710375 sshd[5400]: Connection closed by 10.0.0.1 port 51116 Sep 12 22:12:11.711055 sshd-session[5397]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:11.714889 systemd[1]: sshd@11-10.0.0.61:22-10.0.0.1:51116.service: Deactivated successfully. Sep 12 22:12:11.717113 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 22:12:11.719245 systemd-logind[1508]: Session 12 logged out. Waiting for processes to exit. Sep 12 22:12:11.720738 systemd-logind[1508]: Removed session 12. Sep 12 22:12:16.727468 systemd[1]: Started sshd@12-10.0.0.61:22-10.0.0.1:51132.service - OpenSSH per-connection server daemon (10.0.0.1:51132). 
Sep 12 22:12:16.802891 sshd[5424]: Accepted publickey for core from 10.0.0.1 port 51132 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:16.804608 sshd-session[5424]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:16.810216 systemd-logind[1508]: New session 13 of user core. Sep 12 22:12:16.829150 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 22:12:17.023057 sshd[5427]: Connection closed by 10.0.0.1 port 51132 Sep 12 22:12:17.022449 sshd-session[5424]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:17.033351 systemd[1]: sshd@12-10.0.0.61:22-10.0.0.1:51132.service: Deactivated successfully. Sep 12 22:12:17.035154 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 22:12:17.035856 systemd-logind[1508]: Session 13 logged out. Waiting for processes to exit. Sep 12 22:12:17.038582 systemd[1]: Started sshd@13-10.0.0.61:22-10.0.0.1:51138.service - OpenSSH per-connection server daemon (10.0.0.1:51138). Sep 12 22:12:17.039845 systemd-logind[1508]: Removed session 13. Sep 12 22:12:17.095800 sshd[5440]: Accepted publickey for core from 10.0.0.1 port 51138 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:17.097308 sshd-session[5440]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:17.101786 systemd-logind[1508]: New session 14 of user core. Sep 12 22:12:17.112055 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 22:12:17.359136 sshd[5443]: Connection closed by 10.0.0.1 port 51138 Sep 12 22:12:17.358890 sshd-session[5440]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:17.371219 systemd[1]: sshd@13-10.0.0.61:22-10.0.0.1:51138.service: Deactivated successfully. Sep 12 22:12:17.374426 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 22:12:17.375113 systemd-logind[1508]: Session 14 logged out. Waiting for processes to exit. Sep 12 22:12:17.377464 systemd[1]: Started sshd@14-10.0.0.61:22-10.0.0.1:51148.service - OpenSSH per-connection server daemon (10.0.0.1:51148). Sep 12 22:12:17.378043 systemd-logind[1508]: Removed session 14. Sep 12 22:12:17.428935 sshd[5456]: Accepted publickey for core from 10.0.0.1 port 51148 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:17.430179 sshd-session[5456]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:17.433947 systemd-logind[1508]: New session 15 of user core. Sep 12 22:12:17.440071 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 12 22:12:18.050776 sshd[5459]: Connection closed by 10.0.0.1 port 51148 Sep 12 22:12:18.051625 sshd-session[5456]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:18.061693 systemd[1]: sshd@14-10.0.0.61:22-10.0.0.1:51148.service: Deactivated successfully. Sep 12 22:12:18.064077 systemd[1]: session-15.scope: Deactivated successfully. Sep 12 22:12:18.064833 systemd-logind[1508]: Session 15 logged out. Waiting for processes to exit. Sep 12 22:12:18.067057 systemd-logind[1508]: Removed session 15. Sep 12 22:12:18.072403 systemd[1]: Started sshd@15-10.0.0.61:22-10.0.0.1:51164.service - OpenSSH per-connection server daemon (10.0.0.1:51164). 
Sep 12 22:12:18.132700 sshd[5481]: Accepted publickey for core from 10.0.0.1 port 51164 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:18.134062 sshd-session[5481]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:18.137885 systemd-logind[1508]: New session 16 of user core. Sep 12 22:12:18.148108 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 22:12:18.444261 sshd[5484]: Connection closed by 10.0.0.1 port 51164 Sep 12 22:12:18.444624 sshd-session[5481]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:18.454501 systemd[1]: sshd@15-10.0.0.61:22-10.0.0.1:51164.service: Deactivated successfully. Sep 12 22:12:18.456089 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 22:12:18.457043 systemd-logind[1508]: Session 16 logged out. Waiting for processes to exit. Sep 12 22:12:18.461120 systemd[1]: Started sshd@16-10.0.0.61:22-10.0.0.1:51174.service - OpenSSH per-connection server daemon (10.0.0.1:51174). Sep 12 22:12:18.466469 systemd-logind[1508]: Removed session 16. Sep 12 22:12:18.525908 sshd[5495]: Accepted publickey for core from 10.0.0.1 port 51174 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:18.527325 sshd-session[5495]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:18.531343 systemd-logind[1508]: New session 17 of user core. Sep 12 22:12:18.538074 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 12 22:12:18.662685 sshd[5498]: Connection closed by 10.0.0.1 port 51174 Sep 12 22:12:18.662529 sshd-session[5495]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:18.666580 systemd-logind[1508]: Session 17 logged out. Waiting for processes to exit. Sep 12 22:12:18.666846 systemd[1]: sshd@16-10.0.0.61:22-10.0.0.1:51174.service: Deactivated successfully. Sep 12 22:12:18.669609 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 22:12:18.672116 systemd-logind[1508]: Removed session 17. Sep 12 22:12:19.864816 containerd[1538]: time="2025-09-12T22:12:19.864775347Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c22941d0a6b8552fbfdeab57bfa5062f2e86d54134300b92a918eafbac6d4152\" id:\"e8343a5b19649353704b0b3f0fd10d562b03c1931db78b83c7fd46a520844f34\" pid:5521 exited_at:{seconds:1757715139 nanos:864503406}" Sep 12 22:12:23.642038 kubelet[2675]: I0912 22:12:23.641679 2675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:12:23.682458 systemd[1]: Started sshd@17-10.0.0.61:22-10.0.0.1:58286.service - OpenSSH per-connection server daemon (10.0.0.1:58286). Sep 12 22:12:23.749086 sshd[5539]: Accepted publickey for core from 10.0.0.1 port 58286 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:23.750715 sshd-session[5539]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:23.754603 systemd-logind[1508]: New session 18 of user core. Sep 12 22:12:23.766471 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 12 22:12:23.920962 sshd[5543]: Connection closed by 10.0.0.1 port 58286 Sep 12 22:12:23.921152 sshd-session[5539]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:23.924296 systemd[1]: sshd@17-10.0.0.61:22-10.0.0.1:58286.service: Deactivated successfully. Sep 12 22:12:23.926059 systemd[1]: session-18.scope: Deactivated successfully. Sep 12 22:12:23.927516 systemd-logind[1508]: Session 18 logged out. 
Waiting for processes to exit. Sep 12 22:12:23.929674 systemd-logind[1508]: Removed session 18. Sep 12 22:12:28.931743 systemd[1]: Started sshd@18-10.0.0.61:22-10.0.0.1:58298.service - OpenSSH per-connection server daemon (10.0.0.1:58298). Sep 12 22:12:29.000079 sshd[5562]: Accepted publickey for core from 10.0.0.1 port 58298 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:29.001500 sshd-session[5562]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:29.007995 systemd-logind[1508]: New session 19 of user core. Sep 12 22:12:29.019449 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 22:12:29.207574 sshd[5565]: Connection closed by 10.0.0.1 port 58298 Sep 12 22:12:29.207948 sshd-session[5562]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:29.212269 systemd-logind[1508]: Session 19 logged out. Waiting for processes to exit. Sep 12 22:12:29.212427 systemd[1]: sshd@18-10.0.0.61:22-10.0.0.1:58298.service: Deactivated successfully. Sep 12 22:12:29.214298 systemd[1]: session-19.scope: Deactivated successfully. Sep 12 22:12:29.215721 systemd-logind[1508]: Removed session 19. Sep 12 22:12:29.897699 containerd[1538]: time="2025-09-12T22:12:29.897572495Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d539d04c0144e93da73bc16ca7a7dc26626bc33792baa5ec9db861ffb1d29e71\" id:\"5acd70fbaad6e5da8803b31cf7ebe2d67d0c1227ec75f6b03f075ea075f1be34\" pid:5590 exited_at:{seconds:1757715149 nanos:896625148}" Sep 12 22:12:33.953647 containerd[1538]: time="2025-09-12T22:12:33.953604726Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7763b9559aa873108a92da1b5458f404eddd198c7eb5bf882c40cc3ee9bdde8a\" id:\"e2cd9309c6a65b345121ed93c0eda701f6a01b8010f05556f02505a58cf130ba\" pid:5613 exited_at:{seconds:1757715153 nanos:953345617}" Sep 12 22:12:34.222607 systemd[1]: Started sshd@19-10.0.0.61:22-10.0.0.1:46190.service - OpenSSH per-connection server daemon (10.0.0.1:46190). Sep 12 22:12:34.271789 sshd[5628]: Accepted publickey for core from 10.0.0.1 port 46190 ssh2: RSA SHA256:89WB56THnhzjx8XsKgQlSeZZaxZLOzxRKY4RxNTnHBI Sep 12 22:12:34.274505 sshd-session[5628]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:12:34.280969 systemd-logind[1508]: New session 20 of user core. Sep 12 22:12:34.293112 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 12 22:12:34.473133 sshd[5631]: Connection closed by 10.0.0.1 port 46190 Sep 12 22:12:34.472059 sshd-session[5628]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:34.478549 systemd-logind[1508]: Session 20 logged out. Waiting for processes to exit. Sep 12 22:12:34.479318 systemd[1]: sshd@19-10.0.0.61:22-10.0.0.1:46190.service: Deactivated successfully. Sep 12 22:12:34.484288 systemd[1]: session-20.scope: Deactivated successfully. Sep 12 22:12:34.487153 systemd-logind[1508]: Removed session 20.